Search Results

Search found 3677 results on 148 pages for 'cs 79'.


  • How can we best petition to bring Adobe creative software to Ubuntu?

    - by Sixthlaw
    Now I know it's not as simple as asking Adobe to support their design software on Ubuntu, but is there a way for the community and Canonical to OFFICIALLY make Adobe aware of the rapidly growing number of Linux users and their desire for this great set of tools? I know that many of the answers I receive might be of the fashion "It's not going to happen", "Use the free tools provided" or "Who knows, it might happen in the near future", but that is not what I am looking for.

    I have noticed, on sites like www.OmgUbuntu.com, there are links to pages where people can "like" the idea. Is there a way to get the whole community on board with this one, even Canonical, and, as stated above, put the proposal to Adobe? The current requests for an Adobe CS for Linux come in dribs and drabs scattered all over the internet. Now is the best time to come up with productive solutions for gathering statistics on the number of people willing to buy the Adobe CS. These are the words of an Adobe employee: "I have forwarded this feedback on to the appropriate team who will consider it for future releases of Adobe software." The more people we unify behind the ONE community proposal, the greater chance we have of getting the software. How can we make this happen?

    Read the article

  • Loading images in XNA 4.0; "Cannot Open File" Problems

    - by user32623
    Okay, I'm writing a game in C#/XNA 4.0 and am utterly stumped at my current juncture: sprite animation. I understand how it works and have all the code in place, but my ContentLoader won't open my file. Basically, my directory looks like this:

        //WindowsGame1
            - "Game1.cs"
            - //Classes
                - "NPC.cs"
            - Content Reference
                - //Images
                    - "Monster.png"

    Inside my NPC class, I have all the essential drawing functions, i.e. LoadContent, Draw, Update. I can get the game to find the correct file and attempt to open it, but when it tries, it throws an exception and tells me it can't open the file. This is how my code in my NPC class looks:

        Texture2D NPCImage;
        Vector2 NPCPosition;
        Animation NPCAnimation = new Animation();

        public void Initialize()
        {
            NPCAnimation.Initialize(NPCPosition, new Vector2(4, 4));
        }

        public void LoadContent(ContentManager Content)
        {
            NPCImage = Content.Load<Texture2D>("_InsertImageFilePathHere_");
            NPCAnimation.AnimationImage = NPCImage;
        }

    The rest of the code is irrelevant at this point because I can't even get the image to load. I think it might have to do with a directory problem, but I also know little to nothing about spriting or working with images or animations in my code. Any help is appreciated. Not sure if I provided enough information here, so let me know if more is needed! Also, what would be the correct way to direct that Content.Load to Monster.png given the current directory situation? Right now I just have it using the full path from the C:// drive. Thanks in advance!
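    A note on the asset-path question at the end: XNA's ContentManager does not open image files by disk path at runtime; it loads compiled assets by name, relative to Content.RootDirectory (which the Game1 template sets to "Content") and without the file extension. Assuming Monster.png was added to the content project under an Images folder, a minimal sketch of the load call would be:

        public void LoadContent(ContentManager Content)
        {
            // Asset name = path relative to the content project root, minus
            // the extension; the content pipeline compiles Images/Monster.png
            // into Images/Monster.xnb under the Content output folder.
            NPCImage = Content.Load<Texture2D>("Images/Monster");
            NPCAnimation.AnimationImage = NPCImage;
        }

    A full C:\ path fails precisely because the manager appends the asset name to the content root and looks for an .xnb file there, so an absolute file path can never resolve.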

    Read the article

  • Why is quicksort better than other sorting algorithms in practice?

    - by Raphael
    This is a repost of a question on cs.SE by Janoma. Full credits and spoils to him or cs.SE.

    In a standard algorithms course we are taught that quicksort is O(n log n) on average and O(n²) in the worst case. At the same time, other sorting algorithms are studied which are O(n log n) in the worst case (like mergesort and heapsort), and even linear time in the best case (like bubblesort), though with some additional memory requirements. After a quick glance at some more running times, it is natural to say that quicksort should not be as efficient as the others. Also, consider that students learn in basic programming courses that recursion is not really good in general because it can use too much memory, etc. Therefore (even though this is not a real argument), this gives the impression that quicksort might not be really good because it is a recursive algorithm.

    Why, then, does quicksort outperform other sorting algorithms in practice? Does it have to do with the structure of real-world data? Does it have to do with the way memory works in computers? I know that some kinds of memory are much faster than others, but I don't know if that's the real reason for this counter-intuitive performance (when compared to theoretical estimates).
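    To make the usual locality argument concrete, here is a minimal in-place quicksort sketch (no median-of-three pivoting or small-array cutoff). The partition pass walks the array sequentially from both ends and swaps elements in place, so it streams through memory in cache-friendly order and needs no auxiliary buffer, unlike a typical mergesort:

        using System;

        class QuickSortDemo
        {
            // Hoare-style partition: scan inward from both ends, swap
            // out-of-place pairs, then recurse on the two halves.
            static void QuickSort(int[] a, int lo, int hi)
            {
                if (lo >= hi) return;
                int pivot = a[lo + (hi - lo) / 2];
                int i = lo, j = hi;
                while (i <= j)
                {
                    while (a[i] < pivot) i++;
                    while (a[j] > pivot) j--;
                    if (i <= j)
                    {
                        int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
                        i++; j--;
                    }
                }
                QuickSort(a, lo, j);
                QuickSort(a, i, hi);
            }

            static void Main()
            {
                int[] data = { 5, 3, 8, 1, 9, 2, 7 };
                QuickSort(data, 0, data.Length - 1);
                Console.WriteLine(string.Join(" ", data)); // 1 2 3 5 7 8 9
            }
        }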

    Read the article

  • Why does there seem to be a lot of fear in choosing the "wrong" language to learn?

    - by Shewbox
    Perhaps it's just me, but as a current CS student I have already come across many questions, on this site and elsewhere, asking not just "Which language should I use for x?" but also "Does anyone still use language Y?" My first CS class was taught in Scheme, which, if I'm not mistaken, isn't used widely (at least in comparison to languages like Java, PHP, Python, etc.). Many of my classmates balked at the idea of having to learn a language they would never have to use again, but I don't quite understand where so much of this fear of learning less popular languages comes from. No, I may not use Scheme in any job I get, but I certainly don't regret having learned it (albeit in a very beginner-level, not very in-depth manner in that one semester). I am taking a search engines class this semester, which is done in Perl, and again I am seeing classmates complaining about the language choice. I can understand having a favorite language and disliking others, but why do some get worked up over merely learning one in the first place? Can you really learn the "wrong" language? Isn't learning something like Scheme or Haskell good mental exercise if nothing else, and useful at least for exposure to different ways of solving problems?

    Read the article

  • 25 years old and considering a career change...possible? practical?

    - by mq330
    Hi all, I'm new to this site and new to programming as well. I've spent some time going through an intro CS book that uses Python as the language of choice. I find the exercises interesting and engaging, and I have generally had a favorable experience programming so far. I've gone through some of the basics with Python, like writing simple programs, the basics of GUIs, manipulating strings, lists, defining functions, etc. And I've always loved technology. Although I've never done any real hardcore programming yet, I was inclined toward building websites from a very young age, but I never really developed my skills.

    Now, the thing is, I'm 25, and I have my bachelor's in environmental studies and two master's degrees, in urban planning and landscape architecture respectively. I know it would be quite a departure to pursue a career in programming at this point. Currently, I'm working as a geographic information systems intern. I've taken some GIS classes and have a lot of experience with making maps, doing spatial analysis, etc. So what I'm thinking is maybe I can learn some solid programming skills and apply them in the field of GIS. From what I've seen, .NET languages are the norm in this arena. Could you perhaps provide some guidance in terms of what languages I should focus on or courses I should take at this point? What about for building web mapping applications? Also, I was thinking about getting a certificate in programming from a university extension program. Do you think it would be worth it? And furthermore, do you think potential employers would be interested in hiring someone like me (once I get a couple of languages down pretty well) as an intern or in an entry-level position? I'll be living in the Bay Area, so I feel that there should be decent opportunities even though I don't have a B.S. in CS.

    Read the article

  • How can I tell if I am overusing multi-threading?

    - by exhuma
    NOTE: This is a complete re-write of the question. The text before was way too lengthy and did not get to the point! If you're interested in the original question, you can look it up in the edit history.

    I currently feel like I am over-using multi-threading. I have 3 types of data, A, B and C. Each A can be converted to multiple Bs and each B can be converted to multiple Cs. I am only interested in treating Cs. I could write this fairly easily with a couple of conversion functions. But I caught myself implementing it with threads and three queues (queue_a, queue_b and queue_c). There are two threads doing the different conversions, and one worker:

        ConverterA reads from queue_a and writes to queue_b
        ConverterB reads from queue_b and writes to queue_c
        Worker handles each element from queue_c

    The conversions are fairly mundane, and I don't know if this model is too convoluted. But it seems extremely robust to me. Each "converter" can start working even before data has arrived on its queue, and at any time in the code I can just "submit" new As or Bs and it will trigger the conversion pipeline, which in turn will trigger a job by the worker thread. Even the resulting code looks simpler. But I am still unsure whether I am abusing threads for something simple.
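    For what it's worth, the design described above maps almost one-to-one onto standard blocking queues, which keeps the threaded version nearly as short as the plain-function version. A generic C# sketch (not the asker's code; int stands in for the A/B/C types and the conversion functions are made up for illustration):

        using System;
        using System.Collections.Concurrent;
        using System.Threading.Tasks;

        class Pipeline
        {
            static int[] ConvertAToB(int a) => new[] { a * 10, a * 10 + 1 }; // each A -> several Bs
            static int[] ConvertBToC(int b) => new[] { b + 1, b + 2 };       // each B -> several Cs

            static void Main()
            {
                var queueB = new BlockingCollection<int>();
                var queueC = new BlockingCollection<int>();

                var converterA = Task.Run(() =>
                {
                    foreach (int a in new[] { 1, 2, 3 })      // stands in for queue_a
                        foreach (int b in ConvertAToB(a))
                            queueB.Add(b);
                    queueB.CompleteAdding();                  // tell ConverterB no more Bs are coming
                });

                var converterB = Task.Run(() =>
                {
                    foreach (int b in queueB.GetConsumingEnumerable())
                        foreach (int c in ConvertBToC(b))
                            queueC.Add(c);
                    queueC.CompleteAdding();
                });

                var worker = Task.Run(() =>
                {
                    foreach (int c in queueC.GetConsumingEnumerable())
                        Console.WriteLine($"handled {c}");    // treat each C
                });

                Task.WaitAll(converterA, converterB, worker);
            }
        }

    Whether this is over-engineering then comes down to whether the pipeline's ability to accept new As and Bs at any time is actually needed.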

    Read the article

  • Working for a company vs starting my own? [closed]

    - by Mark
    I need some advice. I am considering going to grad school for CS. I have a few big projects I came up with on my own that I am extremely motivated to work on and complete, and I want to try to turn them into a career. I am currently completing an internship working for a big company: decent pay, 9-5 hours in an office. I feel like working for the same company, which many people would enjoy and like, is extremely boring in my opinion and procedural at times, and it kills my motivation. As a result, I am unsure whether I should continue to get my CS M.S. degree and start working for a big company. What I would enjoy most is working for myself and developing my own project, but I am not sure I would be able to financially support myself doing that, and I do not want to miss out on big opportunities or job offers to work for a company. That said, I will never know if my project will succeed unless I give it 110% of my time and dedication, so if I decide to go that route and work on my own project, I will have to set everything else aside. Could anyone give me advice on what they think about my situation?

    Read the article

  • Dataset.ReadXml() - Invalid character in the given encoding

    - by NLV
    Hello all. I have saved a dataset in the SQL database, in an XML column, using the following code:

        XmlDataDocument dd = new XmlDataDocument(dataset);

    and passed this XML document as a SQL parameter using:

        param.value = new XmlNodeReader(dd);

    The XML is like:

        <NewDataSet>
        <SubContractChangeOrders><AGCol>1</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>006</Contract_x0020_Number><ContractID>30</ContractID><ChangeOrderID>211</ChangeOrderID><Amount>0.0000</Amount><Udf_CostReimbursableFlag>false</Udf_CostReimbursableFlag><Udf_CustomerCode /><Udf_SubChangeOrderStatus /></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>2</AGCol><SCO_x0020_Number>002</SCO_x0020_Number><Contract_x0020_Number>006</Contract_x0020_Number><ContractID>30</ContractID><ChangeOrderID>212</ChangeOrderID><Amount>0.0000</Amount><Udf_CostReimbursableFlag>false</Udf_CostReimbursableFlag></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>3</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>111</Contract_x0020_Number><ContractID>87</ContractID><ChangeOrderID>12</ChangeOrderID><Amount>300.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>4</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>222</Contract_x0020_Number><ContractID>80</ContractID><ChangeOrderID>6</ChangeOrderID><Amount>100.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>5</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>777</Contract_x0020_Number><ContractID>79</ContractID><ChangeOrderID>5</ChangeOrderID><Amount>200.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>6</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>786</Contract_x0020_Number><ContractID>77</ContractID><ChangeOrderID>3</ChangeOrderID><Amount>100.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>7</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>787</Contract_x0020_Number><ContractID>78</ContractID><ChangeOrderID>4</ChangeOrderID><Amount>500.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>8</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 009</Contract_x0020_Number><ContractID>219</ContractID><ChangeOrderID>78</ChangeOrderID><Amount>9000.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>9</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 010</Contract_x0020_Number><ContractID>220</ContractID><ChangeOrderID>79</ChangeOrderID><Amount>13000.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>10</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 012</Contract_x0020_Number><ContractID>222</ContractID><ChangeOrderID>83</ChangeOrderID><Amount>2300.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>11</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 020</Contract_x0020_Number><ContractID>226</ContractID><ChangeOrderID>86</ChangeOrderID><Amount>5400.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>12</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 021</Contract_x0020_Number><ContractID>227</ContractID><ChangeOrderID>87</ChangeOrderID><Amount>2300.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>13</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con001</Contract_x0020_Number><ContractID>208</ContractID><ChangeOrderID>72</ChangeOrderID><Amount>3000.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>14</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con002</Contract_x0020_Number><ContractID>209</ContractID><ChangeOrderID>73</ChangeOrderID><Amount>400.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>15</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con003</Contract_x0020_Number><ContractID>210</ContractID><ChangeOrderID>74</ChangeOrderID><Amount>6000.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>16</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con004</Contract_x0020_Number><ContractID>211</ContractID><ChangeOrderID>75</ChangeOrderID><Amount>9000.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>17</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con005</Contract_x0020_Number><ContractID>213</ContractID><ChangeOrderID>76</ChangeOrderID><Amount>17000.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>18</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Cont001</Contract_x0020_Number><ContractID>228</ContractID><ChangeOrderID>89</ChangeOrderID><Amount>2000.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>19</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>PUR001</Contract_x0020_Number><ContractID>229</ContractID><ChangeOrderID>88</ChangeOrderID><Amount>1000.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>20</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>PUR002</Contract_x0020_Number><ContractID>230</ContractID><ChangeOrderID>90</ChangeOrderID><Amount>3000.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>21</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>SC-002</Contract_x0020_Number><ContractID>2</ContractID><ChangeOrderID>7</ChangeOrderID><Amount>200.0000</Amount></SubContractChangeOrders>
        <SubContractChangeOrders><AGCol>22</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>SC-004</Contract_x0020_Number><ContractID>7</ContractID><ChangeOrderID>65</ChangeOrderID><Amount>1000.0000</Amount></SubContractChangeOrders>
        </NewDataSet>

    I'm trying to read it back as follows:

        using (SqlConnection con = new SqlConnection("Server=#####;Initial Catalog=#####;User ID=####;Pwd=########"))
        {
            using (SqlCommand com = new SqlCommand("select * from dbo.tbl_#####", con))
            {
                using (SqlDataAdapter ada = new SqlDataAdapter(com))
                {
                    ada.Fill(dt);
                    MemoryStream ms = new MemoryStream();
                    object contractXML1 = dt.Rows[0]["SCOXML1"];
                    System.Runtime.Serialization.Formatters.Binary.BinaryFormatter bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
                    bf.Serialize(ms, contractXML1);
                    ms.Seek(0, SeekOrigin.Begin);
                    ds.ReadXml(ms);
                }
            }
        }

    I'm getting the following error:

        Data at the root level is invalid. Line 1, position 6.

    Any ideas? NLV
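    The "Data at the root level is invalid. Line 1, position 6." message is consistent with ReadXml being handed bytes that are not XML: BinaryFormatter.Serialize does not write the string's characters verbatim, it wraps the value in .NET binary-serialization headers, so the stream no longer starts with <NewDataSet>. A simpler sketch that skips the formatter entirely, assuming the provider returns the XML column as text (add using System.IO; for StringReader):

        // Inside the same adapter block, once dt has been filled:
        string xml = dt.Rows[0]["SCOXML1"].ToString(); // the XML column comes back as a string
        using (StringReader sr = new StringReader(xml))
        {
            ds.ReadXml(sr); // parse the XML text directly, no BinaryFormatter round-trip
        }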

    Read the article

  • Problem deploying portlets on JBoss Portal 2.7.2: Not a canonical value

    - by Dee
    I just downloaded the JBoss Portal Server 2.7.2 (the JBoss Portal + JBoss AS 4.2.3 bundle, to be precise) and tried deploying portlets such as the SimpleHelloWorld provided in the samples. The portlet gets deployed fine, but when I put it on a page I get the following exception. I tried adding other portlets as well (such as the booking MVC portlet supplied with the Spring Web Flow dist), but the same problem happens. The problem happens when new instances are created by me; for example, when I create a new instance of the CMS Portlet, I get the same error. If I use an existing instance, it works. If I deploy a portlet that creates an instance using the "portlet-instances.xml", then it works fine, but creating additional instances using the Admin console and deploying them on a page fails with the following error. What am I doing wrong? Can anyone please help?

        javax.servlet.ServletException: java.lang.IllegalArgumentException: Not a canonical value SimplestHelloWorldWindow org.jboss.portal.server.servlet.PortalServlet.service(PortalServlet.java:278) javax.servlet.http.HttpServlet.service(HttpServlet.java:803) org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96) root cause java.lang.IllegalArgumentException: Not a canonical value SimplestHelloWorldWindow org.jboss.portal.core.model.portal.PortalObjectPath$CanonicalFormat.parse(PortalObjectPath.java:357) org.jboss.portal.core.model.portal.PortalObjectPath.<init>(PortalObjectPath.java:161) org.jboss.portal.core.model.portal.PortalObjectPath.parse(PortalObjectPath.java:314) org.jboss.portal.core.model.portal.PortalObjectId.parse(PortalObjectId.java:158) org.jboss.portal.core.model.portal.PortalObjectId.parse(PortalObjectId.java:143) org.jboss.portal.core.model.portal.navstate.PortalObjectNavigationalStateContext.createWindowKey(PortalObjectNavigationalStateContext.java:299) org.jboss.portal.core.model.portal.navstate.PortalObjectNavigationalStateContext.getWindowNavigationalState(PortalObjectNavigationalStateContext.java:194) org.jboss.portal.core.controller.portlet.ControllerPageNavigationalState.getPortletWindowNavigationalState(ControllerPageNavigationalState.java:230) org.jboss.portal.core.model.portal.command.render.RenderWindowCommand.getPortletNavigationalState(RenderWindowCommand.java:121) org.jboss.portal.core.impl.model.content.InternalContentProvider.renderWindow(InternalContentProvider.java:211) org.jboss.portal.core.model.portal.command.render.RenderWindowCommand.execute(RenderWindowCommand.java:100) org.jboss.portal.core.controller.ControllerCommand$1.invoke(ControllerCommand.java:68) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:131) org.jboss.portal.core.aspects.controller.node.EventBroadcasterInterceptor.invoke(EventBroadcasterInterceptor.java:124) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.PageCustomizerInterceptor.invoke(PageCustomizerInterceptor.java:134) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.PolicyEnforcementInterceptor.invoke(PolicyEnforcementInterceptor.java:78) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115)
org.jboss.portal.core.aspects.controller.node.PortalNodeInterceptor.invoke(PortalNodeInterceptor.java:81) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.BackwardCompatibilityInterceptor.invoke(BackwardCompatibilityInterceptor.java:48) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.ControlInterceptor.invoke(ControlInterceptor.java:56) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.NavigationalStateInterceptor.invoke(NavigationalStateInterceptor.java:42) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.controller.ajax.AjaxInterceptor.invoke(AjaxInterceptor.java:55) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.ResourceAcquisitionInterceptor.invoke(ResourceAcquisitionInterceptor.java:50) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.common.invocation.Invocation.invoke(Invocation.java:157) org.jboss.portal.core.controller.ControllerContext.execute(ControllerContext.java:134) org.jboss.portal.core.model.portal.command.render.RenderWindowCommand.render(RenderWindowCommand.java:80) org.jboss.portal.core.model.portal.command.render.RenderPageCommand.execute(RenderPageCommand.java:222) org.jboss.portal.core.controller.ControllerCommand$1.invoke(ControllerCommand.java:68) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:131) org.jboss.portal.core.aspects.controller.node.EventBroadcasterInterceptor.invoke(EventBroadcasterInterceptor.java:124) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.PageCustomizerInterceptor.invoke(PageCustomizerInterceptor.java:134) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.PolicyEnforcementInterceptor.invoke(PolicyEnforcementInterceptor.java:78) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.node.PortalNodeInterceptor.invoke(PortalNodeInterceptor.java:81) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.BackwardCompatibilityInterceptor.invoke(BackwardCompatibilityInterceptor.java:48) 
org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.ControlInterceptor.invoke(ControlInterceptor.java:56) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.NavigationalStateInterceptor.invoke(NavigationalStateInterceptor.java:42) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.controller.ajax.AjaxInterceptor.invoke(AjaxInterceptor.java:55) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.ResourceAcquisitionInterceptor.invoke(ResourceAcquisitionInterceptor.java:50) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.common.invocation.Invocation.invoke(Invocation.java:157) org.jboss.portal.core.controller.ControllerContext.execute(ControllerContext.java:134) org.jboss.portal.core.model.portal.PortalObjectResponseHandler.processCommandResponse(PortalObjectResponseHandler.java:80) org.jboss.portal.core.controller.classic.ClassicResponseHandler.processHandlers(ClassicResponseHandler.java:78) org.jboss.portal.core.controller.classic.ClassicResponseHandler.processCommandResponse(ClassicResponseHandler.java:53) org.jboss.portal.core.controller.handler.ResponseHandlerSelector.processCommandResponse(ResponseHandlerSelector.java:70) org.jboss.portal.core.controller.Controller.processCommandResponse(Controller.java:315) org.jboss.portal.core.controller.Controller.processCommand(Controller.java:303) org.jboss.portal.core.controller.Controller.handle(Controller.java:261) org.jboss.portal.server.RequestControllerDispatcher.invoke(RequestControllerDispatcher.java:51) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:131) org.jboss.portal.core.cms.aspect.IdentityBindingInterceptor.invoke(IdentityBindingInterceptor.java:47) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.server.aspects.server.ContentTypeInterceptor.invoke(ContentTypeInterceptor.java:68) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.PortalContextPathInterceptor.invoke(PortalContextPathInterceptor.java:45) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.LocaleInterceptor.invoke(LocaleInterceptor.java:96) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.UserInterceptor.invoke(UserInterceptor.java:196) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) 
org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.server.aspects.server.SignOutInterceptor.invoke(SignOutInterceptor.java:98) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.impl.api.user.UserEventBridgeTriggerInterceptor.invoke(UserEventBridgeTriggerInterceptor.java:65) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.IdentityCacheInterceptor.invoke(IdentityCacheInterceptor.java:68) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.TransactionInterceptor.org$jboss$portal$core$aspects$server$TransactionInterceptor$invoke$aop(TransactionInterceptor.java:49) org.jboss.portal.core.aspects.server.TransactionInterceptor$invoke_N5143606530999904530.invokeNext(TransactionInterceptor$invoke_N5143606530999904530.java) org.jboss.aspects.tx.TxPolicy.invokeInOurTx(TxPolicy.java:79) org.jboss.aspects.tx.TxInterceptor$RequiresNew.invoke(TxInterceptor.java:253) org.jboss.portal.core.aspects.server.TransactionInterceptor$invoke_N5143606530999904530.invokeNext(TransactionInterceptor$invoke_N5143606530999904530.java) org.jboss.aspects.tx.TxPolicy.invokeInOurTx(TxPolicy.java:79) org.jboss.aspects.tx.TxInterceptor$RequiresNew.invoke(TxInterceptor.java:262) org.jboss.portal.core.aspects.server.TransactionInterceptor$invoke_N5143606530999904530.invokeNext(TransactionInterceptor$invoke_N5143606530999904530.java) org.jboss.portal.core.aspects.server.TransactionInterceptor.invoke(TransactionInterceptor.java) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.server.aspects.LockInterceptor$InternalLock.invoke(LockInterceptor.java:69) org.jboss.portal.server.aspects.LockInterceptor.invoke(LockInterceptor.java:130) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.common.invocation.Invocation.invoke(Invocation.java:157) org.jboss.portal.server.servlet.PortalServlet.service(PortalServlet.java:252) javax.servlet.http.HttpServlet.service(HttpServlet.java:803) org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)

    Read the article

  • Why does fprintf start printing out of order or not at all?

    - by Steve Melvin
    This code should take an integer, create pipes, spawn two children, wait until they are dead, and start all over again. However, around the third time through the loop I lose my prompt to enter a number, and it no longer prints the number I've entered. Any ideas?

        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>
        #include <errno.h>
        #include <sys/wait.h>   /* for waitpid(); missing from the original include list */

        #define WRITE 1
        #define READ 0

        int main (int argc, const char * argv[]) {
            //Pipe file-descriptor array
            unsigned int isChildA = 0;
            int pipeA[2];
            int pipeB[2];
            int num = 0;
            while(1){
                fprintf(stderr, "Enter an integer: ");
                scanf("%i", &num);
                if(num == 0){
                    fprintf(stderr, "You entered zero, exiting...\n");
                    exit(0);
                }
                //Open Pipes
                if(pipe(pipeA) < 0){
                    fprintf(stderr, "Could not create pipe A.\n");
                    exit(1);
                }
                if(pipe(pipeB) < 0){
                    fprintf(stderr, "Could not create pipe B.\n");
                    exit(1);
                }
                fprintf(stderr, "Value read: %i \n", num);
                fprintf(stderr, "Parent PID: %i\n", getpid());
                pid_t procID = fork();
                switch (procID) {
                case -1:
                    fprintf(stderr, "Fork error, quitting...\n");
                    exit(1);
                    break;
                case 0:
                    isChildA = 1;
                    break;
                default:
                    procID = fork();
                    if (procID < 0) {
                        fprintf(stderr, "Fork error, quitting...\n");
                        exit(1);
                    } else if (procID == 0) {
                        isChildA = 0;
                    } else {
                        write(pipeA[WRITE], &num, sizeof(int));
                        close(pipeA[WRITE]);
                        close(pipeA[READ]);
                        close(pipeB[WRITE]);
                        close(pipeB[READ]);
                        pid_t pid;
                        while (pid = waitpid(-1, NULL, 0)) {
                            if (errno == ECHILD) {
                                break;
                            }
                        }
                    }
                    break;
                }
                if (procID == 0) {
                    //We're a child, do kid-stuff.
                    ssize_t bytesRead = 0;
                    int response;
                    while (1) {
                        while (bytesRead == 0) {
                            bytesRead = read((isChildA ? pipeA[READ] : pipeB[READ]), &response, sizeof(int));
                        }
                        if (response < 2) {
                            //Kill other child and self
                            fprintf(stderr, "Terminating PROCID: %i\n", getpid());
                            write((isChildA ? pipeB[WRITE] : pipeA[WRITE]), &response, sizeof(int));
                            close(pipeA[WRITE]);
                            close(pipeA[READ]);
                            close(pipeB[WRITE]);
                            close(pipeB[READ]);
                            return 0;
                        } else if (!(response % 2)) {
                            //Even
                            response /= 2;
                            fprintf(stderr, "PROCID: %i, VALUE: %i\n", getpid(), response);
                            write((isChildA ? pipeB[WRITE] : pipeA[WRITE]), &response, sizeof(int));
                            bytesRead = 0;
                        } else {
                            //Odd
                            response *= 3;
                            response++;
                            fprintf(stderr, "PROCID: %i, VALUE: %i\n", getpid(), response);
                            write((isChildA ? pipeB[WRITE] : pipeA[WRITE]), &response, sizeof(int));
                            bytesRead = 0;
                        }
                    }
                }
            }
            return 0;
        }

    This is the output I am getting...
bash-3.00$ ./proj2 Enter an integer: 101 Value read: 101 Parent PID: 9379 PROCID: 9380, VALUE: 304 PROCID: 9381, VALUE: 152 PROCID: 9380, VALUE: 76 PROCID: 9381, VALUE: 38 PROCID: 9380, VALUE: 19 PROCID: 9381, VALUE: 58 PROCID: 9380, VALUE: 29 PROCID: 9381, VALUE: 88 PROCID: 9380, VALUE: 44 PROCID: 9381, VALUE: 22 PROCID: 9380, VALUE: 11 PROCID: 9381, VALUE: 34 PROCID: 9380, VALUE: 17 PROCID: 9381, VALUE: 52 PROCID: 9380, VALUE: 26 PROCID: 9381, VALUE: 13 PROCID: 9380, VALUE: 40 PROCID: 9381, VALUE: 20 PROCID: 9380, VALUE: 10 PROCID: 9381, VALUE: 5 PROCID: 9380, VALUE: 16 PROCID: 9381, VALUE: 8 PROCID: 9380, VALUE: 4 PROCID: 9381, VALUE: 2 PROCID: 9380, VALUE: 1 Terminating PROCID: 9381 Terminating PROCID: 9380 Enter an integer: 102 Value read: 102 Parent PID: 9379 PROCID: 9386, VALUE: 51 PROCID: 9387, VALUE: 154 PROCID: 9386, VALUE: 77 PROCID: 9387, VALUE: 232 PROCID: 9386, VALUE: 116 PROCID: 9387, VALUE: 58 PROCID: 9386, VALUE: 29 PROCID: 9387, VALUE: 88 PROCID: 9386, VALUE: 44 PROCID: 9387, VALUE: 22 PROCID: 9386, VALUE: 11 PROCID: 9387, VALUE: 34 PROCID: 9386, VALUE: 17 PROCID: 9387, VALUE: 52 PROCID: 9386, VALUE: 26 PROCID: 9387, VALUE: 13 PROCID: 9386, VALUE: 40 PROCID: 9387, VALUE: 20 PROCID: 9386, VALUE: 10 PROCID: 9387, VALUE: 5 PROCID: 9386, VALUE: 16 PROCID: 9387, VALUE: 8 PROCID: 9386, VALUE: 4 PROCID: 9387, VALUE: 2 PROCID: 9386, VALUE: 1 Terminating PROCID: 9387 Terminating PROCID: 9386 Enter an integer: 104 Value read: 104 Parent PID: 9379 Enter an integer: PROCID: 9388, VALUE: 52 PROCID: 9389, VALUE: 26 PROCID: 9388, VALUE: 13 PROCID: 9389, VALUE: 40 PROCID: 9388, VALUE: 20 PROCID: 9389, VALUE: 10 PROCID: 9388, VALUE: 5 PROCID: 9389, VALUE: 16 PROCID: 9388, VALUE: 8 PROCID: 9389, VALUE: 4 PROCID: 9388, VALUE: 2 PROCID: 9389, VALUE: 1 Terminating PROCID: 9388 Terminating PROCID: 9389 105 Value read: 105 Parent PID: 9379 Enter an integer: PROCID: 9395, VALUE: 316 PROCID: 9396, VALUE: 158 PROCID: 9395, VALUE: 79 PROCID: 9396, VALUE: 238 PROCID: 9395, VALUE: 119 PROCID: 9396, VALUE: 358 PROCID: 9395, VALUE: 179 PROCID: 9396, VALUE: 538 PROCID: 9395, VALUE: 269 PROCID: 9396, VALUE: 808 PROCID: 9395, VALUE: 404 PROCID: 9396, VALUE: 202 PROCID: 9395, VALUE: 101 PROCID: 9396, VALUE: 304 PROCID: 9395, VALUE: 152 PROCID: 9396, VALUE: 76 PROCID: 9395, VALUE: 38 PROCID: 9396, VALUE: 19 PROCID: 9395, VALUE: 58 PROCID: 9396, VALUE: 29 PROCID: 9395, VALUE: 88 PROCID: 9396, VALUE: 44 PROCID: 9395, VALUE: 22 PROCID: 9396, VALUE: 11 PROCID: 9395, VALUE: 34 PROCID: 9396, VALUE: 17 PROCID: 9395, VALUE: 52 PROCID: 9396, VALUE: 26 PROCID: 9395, VALUE: 13 PROCID: 9396, VALUE: 40 PROCID: 9395, VALUE: 20 PROCID: 9396, VALUE: 10 PROCID: 9395, VALUE: 5 PROCID: 9396, VALUE: 16 PROCID: 9395, VALUE: 8 PROCID: 9396, VALUE: 4 PROCID: 9395, VALUE: 2 PROCID: 9396, VALUE: 1 Terminating PROCID: 9395 Terminating PROCID: 9396 105 Value read: 105 Parent PID: 9379 Enter an integer: PROCID: 9397, VALUE: 316 PROCID: 9398, VALUE: 158 PROCID: 9397, VALUE: 79 PROCID: 9398, VALUE: 238 PROCID: 9397, VALUE: 119 PROCID: 9398, VALUE: 358 PROCID: 9397, VALUE: 179 PROCID: 9398, VALUE: 538 PROCID: 9397, VALUE: 269 PROCID: 9398, VALUE: 808 PROCID: 9397, VALUE: 404 PROCID: 9398, VALUE: 202 PROCID: 9397, VALUE: 101 PROCID: 9398, VALUE: 304 PROCID: 9397, VALUE: 152 PROCID: 9398, VALUE: 76 PROCID: 9397, VALUE: 38 PROCID: 9398, VALUE: 19 PROCID: 9397, VALUE: 58 PROCID: 9398, VALUE: 29 PROCID: 9397, VALUE: 88 PROCID: 9398, VALUE: 44 PROCID: 9397, VALUE: 22 PROCID: 9398, VALUE: 11 PROCID: 9397, VALUE: 34 PROCID: 9398, VALUE: 17 
PROCID: 9397, VALUE: 52 PROCID: 9398, VALUE: 26 PROCID: 9397, VALUE: 13 PROCID: 9398, VALUE: 40 PROCID: 9397, VALUE: 20 PROCID: 9398, VALUE: 10 PROCID: 9397, VALUE: 5 PROCID: 9398, VALUE: 16 PROCID: 9397, VALUE: 8 PROCID: 9398, VALUE: 4 PROCID: 9397, VALUE: 2 PROCID: 9398, VALUE: 1 Terminating PROCID: 9397 Terminating PROCID: 9398 106 Value read: 106 Parent PID: 9379 Enter an integer: PROCID: 9399, VALUE: 53 PROCID: 9400, VALUE: 160 PROCID: 9399, VALUE: 80 PROCID: 9400, VALUE: 40 PROCID: 9399, VALUE: 20 PROCID: 9400, VALUE: 10 PROCID: 9399, VALUE: 5 PROCID: 9400, VALUE: 16 PROCID: 9399, VALUE: 8 PROCID: 9400, VALUE: 4 PROCID: 9399, VALUE: 2 PROCID: 9400, VALUE: 1 Terminating PROCID: 9399 Terminating PROCID: 9400 ^C

    Another thing that's strange: when run from within Xcode it behaves normally, but when run from bash on Solaris or OS X it acts up.
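    A plausible culprit is the wait loop in the parent: errno is only meaningful immediately after a call has failed, yet the loop tests it even when waitpid succeeds. The first iteration ends by setting errno to ECHILD (no children left), and nothing ever resets it, so on a later iteration the parent sees the stale ECHILD after reaping just one child, stops waiting, and prints the next prompt while the other child is still writing, which interleaves much like the output above. A sketch of a conventional reaping loop, checking the return value first and errno only on failure:

        #include <errno.h>
        #include <stdio.h>
        #include <sys/types.h>
        #include <sys/wait.h>

        /* Reap children until waitpid itself fails; only then consult errno. */
        static void reap_children(void)
        {
            pid_t pid;
            while ((pid = waitpid(-1, NULL, 0)) > 0)
                ;                        /* reaped one child, keep waiting */
            if (errno != ECHILD)         /* waitpid returned -1 to get here */
                perror("waitpid");       /* ECHILD just means all children are done */
        }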

    Read the article

  • Python regex to parse text file, get the items in list and count the list

    - by Nemo
    I have a text file which contains some data. I'm particularly interested in finding the count of the number of items in v_dims. The v_dims pattern in my text file looks like this:

        v_dims={ "Sales", "Product Family", "Sales Organization", "Region", "Sales Area", "Sales office", "Sales Division", "Sales Person", "Sales Channel", "Sales Order Type", "Sales Number", "Sales Person", "Sales Quantity", "Sales Amount" }

    So I'm thinking of getting all the elements in v_dims, dumping them into a Python list, and then computing len(mylist) to get the count of the items. The challenge is in getting all the elements of v_dims from my text file and putting them into an empty list. The text file has data in the form of the v_dims pattern shown above, and some data has nested patterns of v_dims. Here's what I have tried, which failed. Any help is appreciated. TIA.

        import re

        fname = "C:\Users\XXXX\Test.mrk"
        with open(fname, "r") as fo:
            content_as_string = fo.read()
        match = re.findall(r'v_dims={\"(.+?)\"}', content_as_string)

    Though I have a big text file, here's a snippet of the structure of my text file:

    version "1"; // Computer generated object language file object 'MRKR' "Main" { Data_Type=2, HeaderBlock={ Version_String="6.3 (25)" }, Printer_Info={ Orientation=0, Page_Width=8.50000000, Page_Height=11.00000000, Page_Header="", Page_Footer="", Margin_type=0, Top_Margin=0.50000000, Left_Margin=0.50000000, Bottom_Margin=0.50000000, Right_Margin=0.50000000 }, Marker_Options={ Close_All="TRUE", Hide_Console="FALSE", Console_Left="FALSE", Console_Width=217, Main_Style="Maximized", MDI_Rect={ 0, 0, 892, 1063 } }, Dives={ { Dive="A", Windows={ { View_Index=0, Window_Info={ Window_Rect={ 0, -288, 400, 1008 }, Window_Style="Maximized Front", Window_Name="Theater [Previous Qtr Diveplan-Dive A]" }, Dependent_bool="FALSE", Colset={ Dive_Type="Normal", Dimension_Name="Theater", Action_List={ Actions={ { Action_Type="Select", select_type=5 }, { Action_Type="Select", select_type=0, Key_Names={ "Theater" }, Key_Indexes={ { "AMERICAS" } } }, { Action_Type="Focus", Focus_Rows="True" }, { Action_Type="Dimensions", v_dims={ "Theater", "Product Family", "Division", "Region", "Install at Country Name", "Connect Home Type", "Connect In Type", "SymmConnect Enabled", "Connect Home Refusal Reason", "Sales Order Channel Type", "Maintained By Group", "PS Flag", "Avalanche Flag", "Product Item Family" }, Xtab_Bool="False", Xtab_Flip="False" }, { Action_Type="Select", select_type=5 }, { Action_Type="Select", select_type=0, Key_Names={ "Theater", "Product Family", "Division", "Region", "Install at Country Name", "Connect Home Type", "Connect In Type", "SymmConnect Enabled", "Connect Home Refusal Reason", "Sales Order Channel Type", "Maintained By Group", "PS Flag", "Avalanche Flag" }, Key_Indexes={ { "AMERICAS", "ATMOS", "Latin America CS Division", "37000 CS Region", "Mexico", "", "", "", "", "DIRECT", "EMC", "N", "0" } } } } }, Num_Palette_cols=0, Num_Palette_rows=0 }, Format={ Window_Type="Tabular", Tabular={ Num_row_labels=8 } } } } } }, Widget_Set={ Widget_Layout="Vertical", Go_Button=1, Picklist_Width=0, Sort_Subset_Dimensions="TRUE", Order={ } }, Views={ { Data_Type=1, dbname="Previous Qtr Diveplan", diveline_dbname="Current Qtr Diveplan", logical_name="Current Qtr Diveplan", cols={ { name="Total TSS installs", column_type="Calc[Total TSS installs]", output_type="Number", format_string="."
}, { name="TSS Valid Connectivity Records", column_type="Calc[TSS Valid Connectivity Records]", output_type="Number", format_string="." }, { name="% TSS Connectivity Record", column_type="Calc[% TSS Connectivity Record]", output_type="Number" }, { name="TSS Not Applicable", column_type="Calc[TSS Not Applicable]", output_type="Number", format_string="." }, { name="TSS Customer Refusals", column_type="Calc[TSS Customer Refusals]", output_type="Number", format_string="." }, { name="% TSS Refusals", column_type="Calc[% TSS Refusals]", output_type="Number" }, { name="TSS Eligible for Physical Connectivity", column_type="Calc[TSS Eligible for Physical Connectivity]", output_type="Number", format_string="." }, { name="TSS Boxes with Physical Connectivty", column_type="Calc[TSS Boxes with Physical Connectivty]", output_type="Number", format_string="." }, { name="% TSS Physical Connectivity", column_type="Calc[% TSS Physical Connectivity]", output_type="Number" } }, dim_cols={ { name="Model", column_type="Dimension[Model]", output_type="None" }, { name="Model", column_type="Dimension[Model]", output_type="None" }, { name="Connect In Type", column_type="Dimension[Connect In Type]", output_type="None" }, { name="Connect Home Type", column_type="Dimension[Connect Home Type]", output_type="None" }, { name="SymmConnect Enabled", column_type="Dimension[SymmConnect Enabled]", output_type="None" }, { name="Theater", column_type="Dimension[Theater]", output_type="None" }, { name="Division", column_type="Dimension[Division]", output_type="None" }, { name="Region", column_type="Dimension[Region]", output_type="None" }, { name="Sales Order Number", column_type="Dimension[Sales Order Number]", output_type="None" }, { name="Product Item Family", column_type="Dimension[Product Item Family]", output_type="None" }, { name="Item Serial Number", column_type="Dimension[Item Serial Number]", output_type="None" }, { name="Sales Order Deal Number", column_type="Dimension[Sales Order Deal Number]", output_type="None" }, { name="Item Install Date", column_type="Dimension[Item Install Date]", output_type="None" }, { name="SYR Last Dial Home Date", column_type="Dimension[SYR Last Dial Home Date]", output_type="None" }, { name="Maintained By Group", column_type="Dimension[Maintained By Group]", output_type="None" }, { name="PS Flag", column_type="Dimension[PS Flag]", output_type="None" }, { name="Connect Home Refusal Reason", column_type="Dimension[Connect Home Refusal Reason]", output_type="None", col_width=177 }, { name="Cust Name", column_type="Dimension[Cust Name]", output_type="None" }, { name="Sales Order Channel Type", column_type="Dimension[Sales Order Channel Type]", output_type="None" }, { name="Sales Order Type", column_type="Dimension[Sales Order Type]", output_type="None" }, { name="Part Model Key", column_type="Dimension[Part Model Key]", output_type="None" }, { name="Ship Date", column_type="Dimension[Ship Date]", output_type="None" }, { name="Model Number", column_type="Dimension[Model Number]", output_type="None" }, { name="Item Description", column_type="Dimension[Item Description]", output_type="None" }, { name="Customer Classification", column_type="Dimension[Customer Classification]", output_type="None" }, { name="CS Customer Name", column_type="Dimension[CS Customer Name]", output_type="None" }, { name="Install At Customer Number", column_type="Dimension[Install At Customer Number]", output_type="None" }, { name="Install at Country Name", column_type="Dimension[Install at Country Name]", 
output_type="None" }, { name="TLA Serial Number", column_type="Dimension[TLA Serial Number]", output_type="None" }, { name="Product Version", column_type="Dimension[Product Version]", output_type="None" }, { name="Avalanche Flag", column_type="Dimension[Avalanche Flag]", output_type="None" }, { name="Product Family", column_type="Dimension[Product Family]", output_type="None" }, { name="Project Number", column_type="Dimension[Project Number]", output_type="None" }, { name="PROJECT_STATUS", column_type="Dimension[PROJECT_STATUS]", output_type="None" } }, Available_Columns={ "Total TSS installs", "TSS Valid Connectivity Records", "% TSS Connectivity Record", "TSS Not Applicable", "TSS Customer Refusals", "% TSS Refusals", "TSS Eligible for Physical Connectivity", "TSS Boxes with Physical Connectivty", "% TSS Physical Connectivity", "Total Installs", "All Boxes with Valid Connectivty Record", "% All Connectivity Record", "Overall Refusals", "Overall Refusals %", "All Eligible for Physical Connectivty", "Boxes with Physical Connectivity", "% All with Physical Conectivity" }, Remaining_columns={ { name="Total Installs", column_type="Calc[Total Installs]", output_type="Number", format_string="." }, { name="All Boxes with Valid Connectivty Record", column_type="Calc[All Boxes with Valid Connectivty Record]", output_type="Number", format_string="." }, { name="% All Connectivity Record", column_type="Calc[% All Connectivity Record]", output_type="Number" }, { name="Overall Refusals", column_type="Calc[Overall Refusals]", output_type="Number", format_string="." }, { name="Overall Refusals %", column_type="Calc[Overall Refusals %]", output_type="Number" }, { name="All Eligible for Physical Connectivty", column_type="Calc[All Eligible for Physical Connectivty]", output_type="Number" }, { name="Boxes with Physical Connectivity", column_type="Calc[Boxes with Physical Connectivity]", output_type="Number" }, { name="% All with Physical Conectivity", column_type="Calc[% All with Physical Conectivity]", output_type="Number" } }, calcs={ { name="Total TSS installs", definition="Total[Total TSS installs]", ts_flag="Not TS Calc" }, { name="TSS Valid Connectivity Records", definition="Total[PS Boxes w/ valid connectivity record (1=yes)]", ts_flag="Not TS Calc" }, { name="% TSS Connectivity Record", definition="Total[PS Boxes w/ valid connectivity record (1=yes)] /Total[Total TSS installs]", ts_flag="Not TS Calc" }, { name="TSS Not Applicable", definition="Total[Bozes w/ valid connectivity record (1=yes)]-Total[Boxes Eligible (1=yes)]-Total[TSS Refusals]", ts_flag="Not TS Calc" }, { name="TSS Customer Refusals", definition="Total[TSS Refusals]", ts_flag="Not TS Calc" }, { name="% TSS Refusals", definition="Total[TSS Refusals]/Total[PS Boxes w/ valid connectivity record (1=yes)]", ts_flag="Not TS Calc" }, { name="TSS Eligible for Physical Connectivity", definition="Total[TSS Eligible]-Total[Exception]", ts_flag="Not TS Calc" }, { name="TSS Boxes with Physical Connectivty", definition="Total[PS Physical Connectivity] - Total[PS Physical Connectivity, SymmConnect Enabled=\"Capable not enabled\"]", ts_flag="Not TS Calc" }, { name="% TSS Physical Connectivity", definition="Total[Boxes w/ phys conn]/Total[Boxes Eligible (1=yes)]", ts_flag="Not TS Calc" }, { name="Total Installs", definition="Total[Total Installs]", ts_flag="Not TS Calc" }, { name="All Boxes with Valid Connectivty Record", definition="Total[Bozes w/ valid connectivity record (1=yes)]", ts_flag="Not TS Calc" }, { name="% All Connectivity Record", 
definition="Total[Bozes w/ valid connectivity record (1=yes)]/Total[Total Installs]", ts_flag="Not TS Calc" }, { name="Overall Refusals", definition="Total[Overall Refusals]", ts_flag="Not TS Calc" }, { name="Overall Refusals %", definition="Total[Overall Refusals]/Total[Bozes w/ valid connectivity record (1=yes)]", ts_flag="Not TS Calc" }, { name="All Eligible for Physical Connectivty", definition="Total[Boxes Eligible (1=yes)]-Total[Exception]", ts_flag="Not TS Calc" }, { name="Boxes with Physical Connectivity", definition="Total[Boxes w/ phys conn]-Total[Boxes w/ phys conn,SymmConnect Enabled=\"Capable not enabled\"]", ts_flag="Not TS Calc" }, { name="% All with Physical Conectivity", definition="Total[Boxes w/ phys conn]/Total[Boxes Eligible (1=yes)]", ts_flag="Not TS Calc" } }, merge_type="consolidate", merge_dbs={ { dbname="connectivityallproducts.mdl", diveline_dbname="/DI_PSREPORTING/connectivityallproducts.mdl" } }, skip_constant_columns="FALSE", categories={ { name="Geography", dimensions={ "Theater", "Division", "Region", "Install at Country Name" } }, { name="Mappings and Flags", dimensions={ "Connect Home Type", "Connect In Type", "SymmConnect Enabled", "Connect Home Refusal Reason", "Sales Order Channel Type", "Maintained By Group", "Customer Installable", "PS Flag", "Top Level Flag", "Avalanche Flag" } }, { name="Product Information", dimensions={ "Product Family", "Product Item Family", "Product Version", "Item Description" } }, { name="Sales Order Info", dimensions={ "Sales Order Deal Number", "Sales Order Number", "Sales Order Type" } }, { name="Dates", dimensions={ "Item Install Date", "Ship Date", "SYR Last Dial Home Date" } }, { name="Details", dimensions={ "Item Serial Number", "TLA Serial Number", "Part Model Key", "Model Number" } }, { name="Customer Infor", dimensions={ "CS Customer Name", "Install At Customer Number", "Customer Classification", "Cust Name" } }, { name="Other Dimensions", dimensions={ "Model" } } }, Maintain_Category_Order="FALSE", popup_info="false" } } };

    Read the article

  • "PGError: no connection to the server" after idle

    - by user573484
    After my application sits idle overnight, the first request in the morning returns a 500 Internal Server Error and the logs show "PGError: no connection to the server". If I refresh the page after that first request, everything is fine. I'm running Ubuntu 10.04 with Apache 2, Passenger 3.0.2, Rails 2.3.8, and Postgres 8.4 on a remote server. Any ideas how to fix this? Here is the log:

        Processing ApplicationController#index (for 192.168.1.33 at 2011-01-06 17:28:14) [GET]
          Parameters: {"action"=>"index", "controller"=>"da"}

        ActiveRecord::StatementInvalid (PGError: no connection to the server
        : SELECT * FROM "users" WHERE ("users"."id" = 1) LIMIT 1):
          app/controllers/application_controller.rb:47:in `current_user'
          app/controllers/application_controller.rb:51:in `set_current_user'
          app/controllers/application_controller.rb:123:in `render_optional_error_file'
          passenger (3.0.2) lib/phusion_passenger/rack/request_handler.rb:96:in `process_request'
          passenger (3.0.2) lib/phusion_passenger/abstract_request_handler.rb:513:in `accept_and_process_next_request'
          passenger (3.0.2) lib/phusion_passenger/abstract_request_handler.rb:274:in `main_loop'
          passenger (3.0.2) lib/phusion_passenger/classic_rails/application_spawner.rb:321:in `start_request_handler'
          passenger (3.0.2) lib/phusion_passenger/classic_rails/application_spawner.rb:275:in `send'
          passenger (3.0.2) lib/phusion_passenger/classic_rails/application_spawner.rb:275:in `handle_spawn_application'
          passenger (3.0.2) lib/phusion_passenger/utils.rb:479:in `safe_fork'
          passenger (3.0.2) lib/phusion_passenger/classic_rails/application_spawner.rb:270:in `handle_spawn_application'
          passenger (3.0.2) lib/phusion_passenger/abstract_server.rb:357:in `__send__'
          passenger (3.0.2) lib/phusion_passenger/abstract_server.rb:357:in `server_main_loop'
          passenger (3.0.2) lib/phusion_passenger/abstract_server.rb:206:in `start_synchronously'
          passenger (3.0.2) lib/phusion_passenger/abstract_server.rb:180:in `start'
          passenger (3.0.2) lib/phusion_passenger/classic_rails/application_spawner.rb:149:in `start'
          passenger (3.0.2) lib/phusion_passenger/spawn_manager.rb:219:in `spawn_rails_application'
          passenger (3.0.2) lib/phusion_passenger/abstract_server_collection.rb:132:in `lookup_or_add'
          passenger (3.0.2) lib/phusion_passenger/spawn_manager.rb:214:in `spawn_rails_application'
          passenger (3.0.2) lib/phusion_passenger/abstract_server_collection.rb:82:in `synchronize'
          passenger (3.0.2) lib/phusion_passenger/abstract_server_collection.rb:79:in `synchronize'
          passenger (3.0.2) lib/phusion_passenger/spawn_manager.rb:213:in `spawn_rails_application'
          passenger (3.0.2) lib/phusion_passenger/spawn_manager.rb:132:in `spawn_application'
          passenger (3.0.2) lib/phusion_passenger/spawn_manager.rb:275:in `handle_spawn_application'
          passenger (3.0.2) lib/phusion_passenger/abstract_server.rb:357:in `__send__'
          passenger (3.0.2) lib/phusion_passenger/abstract_server.rb:357:in `server_main_loop'
          passenger (3.0.2) lib/phusion_passenger/abstract_server.rb:206:in `start_synchronously'
          passenger (3.0.2) helper-scripts/passenger-spawn-server:99

        /!\ FAILSAFE /!\  Thu Jan 06 17:28:14 -0700 2011
          Status: 500 Internal Server Error
          PGError: no connection to the server
          : SELECT * FROM "users" WHERE ("users"."id" = 1) LIMIT 1
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/abstract_adapter.rb:221:in `log'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/postgresql_adapter.rb:520:in `execute'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/postgresql_adapter.rb:1002:in `select_raw'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/postgresql_adapter.rb:989:in `select'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/abstract/database_statements.rb:7:in `select_all_without_query_cache'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/abstract/query_cache.rb:60:in `select_all'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/abstract/query_cache.rb:81:in `cache_sql'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/abstract/query_cache.rb:60:in `select_all'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/base.rb:664:in `find_by_sql'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/base.rb:1578:in `find_every'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/base.rb:1535:in `find_initial'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/base.rb:616:in `find'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/base.rb:1910:in `find_by_id'
            /home/user/application/releases/20110106230903/app/controllers/application_controller.rb:47:in `current_user'
            /home/user/application/releases/20110106230903/app/controllers/application_controller.rb:51:in `set_current_user'
            /home/user/application/releases/20110106230903/app/controllers/application_controller.rb:123:in `render_optional_error_file'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/rescue.rb:97:in `rescue_action_in_public'
            /home/user/application/releases/20110106230903/vendor/plugins/exception_notification/lib/exception_notification/notifiable.rb:48:in `rescue_action_in_public'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/rescue.rb:154:in `rescue_action_without_handler'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/rescue.rb:74:in `rescue_action'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/base.rb:532:in `send'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/base.rb:532:in `process_without_filters'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/filters.rb:606:in `process'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/rescue.rb:65:in `call_with_exception'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/dispatcher.rb:90:in `dispatch'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/dispatcher.rb:121:in `_call'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/dispatcher.rb:130
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/query_cache.rb:29:in `call'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/query_cache.rb:29:in `call'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/abstract/query_cache.rb:34:in `cache'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/query_cache.rb:9:in `cache'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/query_cache.rb:28:in `call'
            /var/lib/gems/1.8/gems/activerecord-2.3.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:361:in `call'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/string_coercion.rb:25:in `call'
            /var/lib/gems/1.8/gems/rack-1.1.0/lib/rack/head.rb:9:in `call'
            /var/lib/gems/1.8/gems/rack-1.1.0/lib/rack/methodoverride.rb:24:in `call'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/params_parser.rb:15:in `call'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/session/cookie_store.rb:99:in `call'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/failsafe.rb:26:in `call'
            /var/lib/gems/1.8/gems/rack-1.1.0/lib/rack/lock.rb:11:in `call'
            /var/lib/gems/1.8/gems/rack-1.1.0/lib/rack/lock.rb:11:in `synchronize'
            /var/lib/gems/1.8/gems/rack-1.1.0/lib/rack/lock.rb:11:in `call'
            /var/lib/gems/1.8/gems/actionpack-2.3.8/lib/action_controller/dispatcher.rb:106:in `call'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/rack/request_handler.rb:96:in `process_request'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_request_handler.rb:513:in `accept_and_process_next_request'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_request_handler.rb:274:in `main_loop'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/classic_rails/application_spawner.rb:321:in `start_request_handler'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/classic_rails/application_spawner.rb:275:in `send'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/classic_rails/application_spawner.rb:275:in `handle_spawn_application'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/utils.rb:479:in `safe_fork'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/classic_rails/application_spawner.rb:270:in `handle_spawn_application'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server.rb:357:in `__send__'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server.rb:357:in `server_main_loop'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server.rb:206:in `start_synchronously'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server.rb:180:in `start'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/classic_rails/application_spawner.rb:149:in `start'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/spawn_manager.rb:219:in `spawn_rails_application'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server_collection.rb:132:in `lookup_or_add'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/spawn_manager.rb:214:in `spawn_rails_application'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server_collection.rb:82:in `synchronize'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server_collection.rb:79:in `synchronize'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/spawn_manager.rb:213:in `spawn_rails_application'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/spawn_manager.rb:132:in `spawn_application'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/spawn_manager.rb:275:in `handle_spawn_application'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server.rb:357:in `__send__'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server.rb:357:in `server_main_loop'
            /var/lib/gems/1.8/gems/passenger-3.0.2/lib/phusion_passenger/abstract_server.rb:206:in `start_synchronously'
            /var/lib/gems/1.8/gems/passenger-3.0.2/helper-scripts/passenger-spawn-server:99

    Read the article

  • auto focus camera on model

    - by bob
    I have the following model <MeshGeometry3D x:Key="SphereOR10GR13" Positions="121.4130 33.9882 64.3701 85.0383 102.2932 168.1890 0.0000 0.0000 0.0000 68.8637 96.3488 159.4094 82.9020 105.2468 168.1890 36.2556 196.8622 64.3701 85.0383 138.7499 159.4094 148.6907 0.0000 23.3858 148.6907 260.7480 23.3858 89.4149 143.2779 146.1417 133.5125 17.5473 43.2677 55.2861 66.8433 108.0315 103.1995 66.8433 108.0315 45.2233 183.7492 108.0315 95.7690 82.0510 146.1417 73.5422 138.7499 168.1890 73.5422 102.2932 159.4094 95.7690 82.0510 127.6378 113.5304 183.7492 108.0315 141.4439 9.9759 23.3858 133.5125 17.5473 64.3701 45.2233 50.9146 85.5906 141.4439 238.9917 43.2677 82.9020 105.2468 188.7008 62.2651 82.0510 146.1417 113.5304 50.9146 85.5906 22.8432 220.2245 43.2677 55.2861 167.1189 108.0315 148.6907 260.7480 23.3858 121.4130 196.8622 85.5906 103.1995 167.1189 108.0315 121.4130 33.9882 64.3701 148.6907 0.0000 23.3858 89.4149 96.3488 159.4094 82.9020 135.2862 168.1890 82.9020 105.2468 188.7008 103.1995 66.8433 108.0315 0.0000 260.7480 23.3858 10.2234 238.9917 23.3858 89.4149 96.3488 146.1417 85.0383 138.7499 168.1890 121.4130 196.8622 64.3701 95.7690 153.5077 146.1417 133.5125 17.5473 43.2677 22.8432 220.2245 43.2677 55.2861 66.8433 127.6378 45.2233 50.9146 108.0315 95.7690 153.5077 127.6378 0.0000 260.7480 0.0000 73.5422 102.2932 168.1890 113.5304 183.7492 85.5906 62.2651 82.0510 146.1417 113.5304 50.9146 108.0315 85.0383 138.7499 168.1890 68.8637 143.2779 159.4094 55.2861 167.1189 108.0315 133.5125 220.2245 43.2677 133.5125 220.2245 64.3701 22.8432 220.2245 64.3701 89.4149 143.2779 159.4094 148.6907 0.0000 23.3858 62.2651 82.0510 127.6378 10.2234 9.9759 43.2677 77.1146 105.2468 168.1890 82.9020 135.2862 168.1890 73.5422 138.7499 159.4094 10.2234 9.9759 23.3858 62.2651 153.5077 146.1417 36.2556 33.9882 85.5906 133.5125 220.2245 64.3701 85.0383 102.2932 168.1890 22.8432 220.2245 43.2677 113.5304 183.7492 108.0315 36.2556 33.9882 85.5906 22.8432 17.5473 43.2677 141.4439 238.9917 43.2677 36.2556 196.8622 64.3701 82.9020 135.2862 168.1890 45.2233 183.7492 108.0315 148.6907 260.7480 0.0000 0.0000 0.0000 23.3858 73.5422 102.2932 159.4094 82.9020 105.2468 188.7008 55.2861 66.8433 127.6378 103.1995 66.8433 127.6378 22.8432 220.2245 64.3701 36.2556 33.9882 64.3701 77.1146 105.2468 188.7008 62.2651 153.5077 127.6378 0.0000 260.7480 0.0000 22.8432 17.5473 64.3701 10.2234 238.9917 43.2677 55.2861 66.8433 108.0315 103.1995 66.8433 108.0315 73.5422 102.2932 159.4094 77.1146 105.2468 188.7008 85.0383 138.7499 159.4094 45.2233 50.9146 85.5906 133.5125 17.5473 43.2677 0.0000 0.0000 23.3858 89.4149 143.2779 146.1417 103.1995 66.8433 127.6378 22.8432 17.5473 43.2677 85.0383 102.2932 168.1890 55.2861 167.1189 127.6378 22.8432 220.2245 64.3701 68.8637 96.3488 146.1417 103.1995 167.1189 127.6378 148.6907 260.7480 23.3858 95.7690 153.5077 146.1417 133.5125 220.2245 43.2677 62.2651 82.0510 127.6378 77.1146 135.2862 188.7008 10.2234 238.9917 43.2677 141.4439 238.9917 23.3858 82.9020 105.2468 168.1890 89.4149 96.3488 159.4094 45.2233 183.7492 85.5906 73.5422 102.2932 168.1890 85.0383 102.2932 159.4094 121.4130 33.9882 85.5906 55.2861 167.1189 127.6378 77.1146 135.2862 188.7008 95.7690 153.5077 146.1417 121.4130 33.9882 85.5906 141.4439 238.9917 23.3858 133.5125 17.5473 64.3701 10.2234 9.9759 43.2677 45.2233 50.9146 108.0315 133.5125 220.2245 64.3701 89.4149 96.3488 146.1417 113.5304 50.9146 108.0315 121.4130 196.8622 64.3701 95.7690 153.5077 127.6378 133.5125 17.5473 64.3701 82.9020 105.2468 168.1890 73.5422 138.7499 159.4094 
62.2651 153.5077 146.1417 89.4149 143.2779 146.1417 68.8637 143.2779 146.1417 121.4130 33.9882 64.3701 103.1995 167.1189 108.0315 141.4439 9.9759 43.2677 141.4439 9.9759 23.3858 95.7690 82.0510 146.1417 36.2556 33.9882 85.5906 77.1146 135.2862 168.1890 10.2234 238.9917 43.2677 77.1146 105.2468 188.7008 10.2234 9.9759 43.2677 62.2651 153.5077 127.6378 85.0383 102.2932 159.4094 113.5304 50.9146 85.5906 113.5304 183.7492 108.0315 55.2861 66.8433 127.6378 103.1995 66.8433 127.6378 36.2556 33.9882 64.3701 10.2234 9.9759 23.3858 36.2556 196.8622 85.5906 0.0000 260.7480 23.3858 82.9020 135.2862 188.7008 22.8432 17.5473 43.2677 62.2651 153.5077 146.1417 68.8637 96.3488 159.4094 148.6907 260.7480 0.0000 73.5422 138.7499 168.1890 85.0383 138.7499 168.1890 68.8637 96.3488 146.1417 121.4130 196.8622 85.5906 68.8637 143.2779 146.1417 141.4439 238.9917 23.3858 89.4149 143.2779 159.4094 22.8432 17.5473 64.3701 73.5422 138.7499 168.1890 45.2233 183.7492 85.5906 77.1146 105.2468 168.1890 68.8637 96.3488 146.1417 148.6907 260.7480 0.0000 141.4439 9.9759 43.2677 0.0000 260.7480 23.3858 113.5304 183.7492 85.5906 148.6907 0.0000 0.0000 55.2861 167.1189 108.0315 22.8432 17.5473 64.3701 121.4130 196.8622 64.3701 103.1995 167.1189 108.0315 10.2234 238.9917 23.3858 68.8637 143.2779 159.4094 36.2556 196.8622 85.5906 45.2233 50.9146 108.0315 77.1146 105.2468 168.1890 0.0000 260.7480 0.0000 141.4439 238.9917 43.2677 95.7690 153.5077 127.6378 95.7690 82.0510 146.1417 95.7690 82.0510 127.6378 103.1995 167.1189 127.6378 77.1146 135.2862 168.1890 36.2556 196.8622 64.3701 45.2233 183.7492 108.0315 89.4149 96.3488 159.4094 82.9020 135.2862 188.7008 121.4130 196.8622 85.5906 89.4149 143.2779 159.4094 77.1146 135.2862 168.1890 148.6907 0.0000 0.0000 45.2233 50.9146 85.5906 62.2651 153.5077 127.6378 85.0383 138.7499 159.4094 10.2234 9.9759 23.3858 141.4439 9.9759 43.2677 77.1146 135.2862 188.7008 85.0383 102.2932 159.4094 73.5422 138.7499 159.4094 95.7690 82.0510 127.6378 113.5304 50.9146 85.5906 133.5125 220.2245 43.2677 73.5422 102.2932 168.1890 141.4439 9.9759 23.3858 82.9020 135.2862 188.7008 121.4130 33.9882 85.5906 62.2651 82.0510 146.1417 62.2651 82.0510 127.6378 148.6907 0.0000 0.0000 55.2861 66.8433 108.0315 89.4149 96.3488 146.1417 55.2861 167.1189 127.6378 68.8637 143.2779 146.1417 10.2234 238.9917 23.3858 45.2233 183.7492 85.5906 68.8637 96.3488 159.4094 36.2556 33.9882 64.3701 113.5304 183.7492 85.5906 103.1995 167.1189 127.6378 36.2556 196.8622 85.5906 113.5304 50.9146 108.0315 68.8637 143.2779 159.4094 " TextureCoordinates="33.9882 64.3701 85.0383 102.2932 0.0000 0.0000 -96.3488 159.4094 82.9020 105.2468 -36.2556 64.3701 85.0383 138.7499 148.6907 0.0000 260.7480 23.3858 89.4149 143.2779 17.5473 43.2677 55.2861 66.8433 103.1995 66.8433 -45.2233 108.0315 95.7690 82.0510 -138.7499 168.1890 73.5422 159.4094 95.7690 127.6378 -113.5304 108.0315 9.9759 23.3858 17.5473 64.3701 45.2233 85.5906 238.9917 43.2677 82.9020 188.7008 -82.0510 146.1417 50.9146 85.5906 22.8432 220.2245 55.2861 167.1189 -148.6907 23.3858 -121.4130 85.5906 103.1995 167.1189 121.4130 33.9882 148.6907 23.3858 89.4149 96.3488 135.2862 168.1890 105.2468 188.7008 66.8433 108.0315 0.0000 23.3858 10.2234 238.9917 89.4149 146.1417 85.0383 138.7499 -121.4130 64.3701 -95.7690 146.1417 133.5125 17.5473 -220.2245 43.2677 -66.8433 127.6378 -50.9146 108.0315 95.7690 153.5077 -260.7480 0.0000 -102.2932 168.1890 183.7492 85.5906 62.2651 82.0510 113.5304 50.9146 -85.0383 168.1890 -68.8637 159.4094 -55.2861 108.0315 220.2245 43.2677 133.5125 220.2245 -220.2245 
64.3701 89.4149 143.2779 0.0000 23.3858 62.2651 127.6378 10.2234 43.2677 77.1146 105.2468 -82.9020 168.1890 -138.7499 159.4094 10.2234 9.9759 -153.5077 146.1417 36.2556 33.9882 220.2245 64.3701 102.2932 168.1890 -22.8432 43.2677 113.5304 183.7492 36.2556 85.5906 -17.5473 43.2677 -141.4439 43.2677 36.2556 196.8622 82.9020 135.2862 -183.7492 108.0315 -148.6907 260.7480 0.0000 0.0000 73.5422 102.2932 82.9020 105.2468 55.2861 66.8433 103.1995 66.8433 -22.8432 64.3701 36.2556 64.3701 77.1146 188.7008 62.2651 153.5077 0.0000 260.7480 -17.5473 64.3701 -238.9917 43.2677 55.2861 108.0315 103.1995 108.0315 -102.2932 159.4094 -105.2468 188.7008 -85.0383 159.4094 45.2233 50.9146 133.5125 43.2677 0.0000 23.3858 143.2779 146.1417 66.8433 127.6378 22.8432 17.5473 85.0383 168.1890 55.2861 167.1189 22.8432 220.2245 68.8637 96.3488 103.1995 167.1189 148.6907 260.7480 153.5077 146.1417 -133.5125 43.2677 -82.0510 127.6378 -77.1146 188.7008 10.2234 238.9917 141.4439 238.9917 82.9020 168.1890 89.4149 159.4094 45.2233 183.7492 73.5422 102.2932 102.2932 159.4094 121.4130 33.9882 -55.2861 127.6378 -135.2862 188.7008 95.7690 153.5077 121.4130 85.5906 238.9917 23.3858 133.5125 64.3701 -9.9759 43.2677 45.2233 108.0315 -133.5125 64.3701 96.3488 146.1417 50.9146 108.0315 121.4130 196.8622 -95.7690 127.6378 133.5125 17.5473 105.2468 168.1890 73.5422 138.7499 -62.2651 146.1417 -89.4149 146.1417 68.8637 143.2779 121.4130 64.3701 -103.1995 108.0315 141.4439 43.2677 141.4439 9.9759 82.0510 146.1417 -33.9882 85.5906 77.1146 135.2862 -10.2234 43.2677 77.1146 105.2468 10.2234 9.9759 -153.5077 127.6378 85.0383 159.4094 113.5304 85.5906 183.7492 108.0315 55.2861 127.6378 103.1995 127.6378 -33.9882 64.3701 10.2234 23.3858 36.2556 196.8622 -260.7480 23.3858 82.9020 135.2862 22.8432 43.2677 62.2651 153.5077 68.8637 96.3488 260.7480 0.0000 73.5422 138.7499 138.7499 168.1890 68.8637 146.1417 196.8622 85.5906 -143.2779 146.1417 -141.4439 23.3858 143.2779 159.4094 22.8432 64.3701 -73.5422 168.1890 -45.2233 85.5906 77.1146 168.1890 -96.3488 146.1417 -148.6907 0.0000 9.9759 43.2677 0.0000 260.7480 -113.5304 85.5906 148.6907 0.0000 -167.1189 108.0315 22.8432 17.5473 196.8622 64.3701 167.1189 108.0315 -238.9917 23.3858 68.8637 143.2779 -196.8622 85.5906 45.2233 50.9146 -105.2468 168.1890 0.0000 0.0000 141.4439 238.9917 153.5077 127.6378 95.7690 146.1417 95.7690 82.0510 -103.1995 127.6378 -77.1146 168.1890 -196.8622 64.3701 45.2233 183.7492 96.3488 159.4094 135.2862 188.7008 121.4130 196.8622 -89.4149 159.4094 -135.2862 168.1890 0.0000 0.0000 -50.9146 85.5906 -62.2651 127.6378 138.7499 159.4094 -9.9759 23.3858 141.4439 9.9759 77.1146 135.2862 85.0383 102.2932 -73.5422 159.4094 82.0510 127.6378 113.5304 50.9146 133.5125 220.2245 73.5422 168.1890 141.4439 23.3858 -82.9020 188.7008 33.9882 85.5906 62.2651 146.1417 62.2651 82.0510 -148.6907 0.0000 -66.8433 108.0315 89.4149 96.3488 -167.1189 127.6378 -68.8637 146.1417 -10.2234 23.3858 -183.7492 85.5906 68.8637 159.4094 36.2556 33.9882 113.5304 183.7492 167.1189 127.6378 -36.2556 85.5906 113.5304 108.0315 -143.2779 159.4094 " TriangleIndices="79 2 89 2 79 223 80 66 179 66 80 7 66 7 143 143 7 114 179 38 108 38 179 66 108 38 114 108 114 7 32 2 181 2 32 99 159 2 99 2 159 48 37 177 191 177 37 28 164 60 205 60 164 8 149 102 113 102 149 210 102 210 43 43 210 216 113 26 192 26 113 102 192 26 216 192 216 210 91 209 127 209 91 186 142 157 218 157 142 62 125 178 19 178 125 22 147 170 228 170 147 75 183 231 105 231 183 134 231 134 31 31 134 132 105 76 57 76 105 231 57 76 132 57 132 134 58 74 90 74 58 44 126 
161 98 161 126 172 56 20 10 20 56 69 85 110 71 110 85 129 68 97 158 97 68 120 97 120 215 215 120 232 158 117 202 117 158 97 202 117 232 202 232 120 184 220 0 220 184 168 234 41 5 41 234 29 188 156 145 156 188 198 124 86 140 86 124 73 189 11 199 11 189 52 11 52 12 12 52 30 199 27 72 27 199 11 72 27 30 72 30 52 235 21 152 21 235 128 50 131 25 131 50 153 13 180 174 180 13 18 78 206 46 206 78 229 83 222 104 222 83 84 222 84 195 195 84 47 104 88 107 88 104 222 107 88 47 107 47 84 155 92 93 92 155 154 185 101 36 101 185 233 121 141 55 141 121 196 226 224 45 224 226 182 51 106 162 106 51 14 106 14 225 225 14 9 162 139 123 139 162 106 123 139 9 123 9 14 194 61 17 61 194 221 193 144 214 144 193 109 137 133 207 133 137 42 67 111 24 111 67 150 163 81 187 81 163 33 81 33 212 212 33 6 187 136 59 136 187 81 59 136 6 59 6 33 236 176 3 176 236 169 116 167 39 167 116 230 100 200 130 200 100 171 54 138 227 138 54 203 118 63 165 63 118 1 63 1 4 4 1 77 165 146 40 146 165 63 40 146 77 40 77 1 173 96 213 96 173 53 15 94 49 94 15 65 103 16 151 16 103 217 208 70 119 70 208 166 82 211 148 211 82 160 23 175 115 175 23 87 34 35 135 35 34 201 112 64 197 64 112 219 122 190 95 190 122 204 " /> My grid size is MaxHeight="1000" MaxWidth="1000" How can I make the camera in the viewport3d show all the model something like fit view? Thank you

    Read the article

  • Upload File to Windows Azure Blob in Chunks through ASP.NET MVC, JavaScript and HTML5

    - by Shaun
    Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/07/01/upload-file-to-windows-azure-blob-in-chunks-through-asp.net.aspx

    Many people are using Windows Azure Blob Storage to store their data in the cloud. Blob storage provides 99.9% availability with an easy-to-use API through the .NET SDK and HTTP REST. For example, we can store JavaScript files, images and documents in blob storage when we are building an ASP.NET web application on a Web Role in Windows Azure, or we can store our VHD files in blob storage and mount them as hard drives in our cloud service. If you are familiar with Windows Azure, you should know that there are two kinds of blob: page blobs and block blobs. The page blob is optimized for random read and write, which is very useful when you need to store VHD files. The block blob is optimized for sequential/chunked read and write, which is the more common usage. Since we can upload a block blob in blocks through BlockBlob.PutBlock and then commit them as a whole blob by invoking BlockBlob.PutBlockList, it is a powerful way to upload large files: we can upload blocks in parallel and provide a pause-resume feature.

    There are many documents, articles and blog posts describing how to upload a block blob. Most of them focus on the server side: once you have received a big file, stream or binary data, how to upload it into blob storage in blocks through the .NET SDK. But the problem is, how can we upload these large files from the client side, for example, a browser? This question came to me when I was working with a Chinese customer, helping them build a network disk product on top of Azure. The end users upload their files from the web portal, and the files are then stored in blob storage from the Web Role. My goal was to find the best way to transfer the file from the client (the end user's machine) to the server (the Web Role) through the browser. In this post I will demonstrate and describe what I did to upload large files in chunks at high speed, and save them as blocks into Windows Azure Blob Storage.

    Traditional Upload, Works with Limitation

    The simplest way to implement this requirement is to create a web page with a form that contains a file input element and a submit button.

        @using (Html.BeginForm("About", "Index", FormMethod.Post, new { enctype = "multipart/form-data" }))
        {
            <input type="file" name="file" />
            <input type="submit" value="upload" />
        }

    And then in the backend controller we retrieve the whole content of this file and upload it into blob storage through the .NET SDK. We can split the file into blocks, upload them in parallel and commit them. The code has been well blogged in the community.
        [HttpPost]
        public ActionResult About(HttpPostedFileBase file)
        {
            var container = _client.GetContainerReference("test");
            container.CreateIfNotExists();
            var blob = container.GetBlockBlobReference(file.FileName);
            var blockDataList = new Dictionary<string, byte[]>();
            using (var stream = file.InputStream)
            {
                var blockSizeInKB = 1024;
                var offset = 0;
                var index = 0;
                while (offset < stream.Length)
                {
                    var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset);
                    var blockData = new byte[readLength];
                    offset += stream.Read(blockData, 0, readLength);
                    blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData);

                    index++;
                }
            }

            Parallel.ForEach(blockDataList, (bi) =>
            {
                blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null);
            });
            blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray());

            return RedirectToAction("About");
        }

    This works perfectly if we select an image, a song or a small video to upload. But if we select a large file, say a 6GB HD movie, then after uploading for a few minutes the request fails with an error page and the upload is terminated. In ASP.NET there is a limit on the request length; the maximum request length is defined in the web.config file, and it is a value that cannot exceed about 4GB. So if we want to upload a really big file, we cannot simply implement it this way. Also, in Windows Azure, the cloud service network load balancer will terminate the connection if the request exceeds its timeout period; from my tests the timeout looks like 2 to 3 minutes. Hence, when we need to upload a large file we cannot just use the basic HTML elements. Besides the limitations mentioned above, a simple HTML file upload cannot provide a rich upload experience such as chunked upload, pause and pause-resume. So we need to find a better way to upload large files from the client to the server.

    Upload in Chunks through HTML5 and JavaScript

    In order to break the limitations mentioned above, we will upload the large file in chunks. This gives us some benefits:
    - No request size limitation: since we upload in chunks, we can define the request size for each chunk regardless of how big the entire file is.
    - No timeout problem: the size of the chunks is controlled by us, which means we can make sure the request for each chunk upload will not exceed the timeout period of either ASP.NET or the Windows Azure load balancer.

    It was a big challenge to upload a big file in chunks until we got HTML5. There are some new features and improvements introduced in HTML5, and we will use them to implement our solution.

    In HTML5, the File interface has been improved with a new method called "slice". It can be used to read part of a file by specifying the start byte index and the end byte index. For example, if the entire file is 1024 bytes, file.slice(512, 768) reads the part of the file from byte 512 up to (but not including) byte 768, and returns a new object of an interface called "Blob", which you can treat as an array of bytes. In fact, a Blob object represents a file-like object of immutable, raw data. The File interface is based on Blob, inheriting blob functionality and expanding it to support files on the user's system. For more information, refer to the Blob documentation. File and Blob are very useful for implementing the chunk upload.
    We will use the File interface to represent the file the user selected from the browser, and then use File.slice to read the file in chunks of the size we want. For example, if we wanted to upload a 10MB file with 512KB chunks, we could read it in 512KB blobs by calling File.slice in a loop.

    Assume we have a web page as below: the user can select a file, specify the block size in KB in an input box, and click a button to start the upload.

        <div>
            <input type="file" id="upload_files" name="files[]" /><br />
            Block Size: <input type="number" id="block_size" value="512" name="block_size" />KB<br />
            <input type="button" id="upload_button_blob" name="upload" value="upload (blob)" />
        </div>

    Then we add a JavaScript function to upload the file in chunks when the user clicks the button.

        <script type="text/javascript">
            $(function () {
                $("#upload_button_blob").click(function () {
                });
            });
        </script>

    First we need to ensure the client browser supports the interfaces we are going to use: just try to invoke File, Blob and FormData from the "window" object. If any of them is "undefined", the condition result will be "false", which means your browser doesn't support these features and it's time to get it updated. FormData is another new feature we are going to use later; it generates a temporary form for us. We will use this interface to create a form containing a chunk and its associated metadata when invoking the service through ajax.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            if (window.File && window.Blob && window.FormData) {
                alert("Your browser is awesome, let's rock!");
            }
            else {
                alert("Oh man, please update to a modern browser before trying this cool stuff out.");
                return;
            }
        });

    Each browser supports these interfaces in its own implementation; currently Blob, File and File.slice are supported by Chrome 21, Firefox 13, IE 10, Opera 12 and Safari 5.1 or higher. After that, we work on the files the user selected one by one, since in HTML5 the user can select multiple files in one file input box.

        var files = $("#upload_files")[0].files;
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var fileSize = file.size;
            var fileName = file.name;
        }

    Next, we calculate the start index and end index for each chunk based on the size the user specified in the browser. We put them into an array together with the file name and the index, which will be used when we upload the chunks into Windows Azure Blob Storage as blocks, since we need to specify the target blob name and the block index. At the same time we store the list of all indexes in another variable, which will be used to commit the blocks into a blob in Azure Storage once all chunks have been uploaded successfully.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;

                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                var blockSizeInKB = $("#block_size").val();
                var blockSize = blockSizeInKB * 1024;
                var blocks = [];
                var offset = 0;
                var index = 0;
                var list = "";
                while (offset < fileSize) {
                    var start = offset;
                    var end = Math.min(offset + blockSize, fileSize);

                    blocks.push({
                        name: fileName,
                        index: index,
                        start: start,
                        end: end
                    });
                    list += index + ",";

                    offset = end;
                    index++;
                }
            }
        });

    Now we have all the chunks' information ready. The next step is to upload them one by one to the server side; when the server receives a chunk it will upload it as a block into Blob Storage, and finally commit them all with the index list through BlockBlob.PutBlockList. But all these invocations are asynchronous ajax calls, so we need to introduce a JavaScript library to help us coordinate the asynchronous operations, named "async.js". You can download this library and find its documentation online; I will not explain it in depth in this post. We put all the procedures we want to execute into a function array and pass it to the proper function defined in async.js, letting it control the execution sequence, in series or in parallel. Hence we define an array and push the chunk upload operation for each block into it.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...

            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...

                // define the function array and push all chunk upload operation into this array
                blocks.forEach(function (block) {
                    putBlocks.push(function (callback) {
                    });
                });
            }
        });

    As you can see, we use the File.slice method to read each chunk based on the start and end byte index we calculated previously, and construct a temporary HTML form with the file name, chunk index and chunk data through another new HTML5 feature named FormData. Then we post this form to the backend server through jQuery.ajax. This is the key part of our solution.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...
                // define the function array and push all chunk upload operation into this array
                blocks.forEach(function (block) {
                    putBlocks.push(function (callback) {
                        // load blob based on the start and end index for each chunks
                        var blob = file.slice(block.start, block.end);
                        // put the file name, index and blob into a temporary form
                        var fd = new FormData();
                        fd.append("name", block.name);
                        fd.append("index", block.index);
                        fd.append("file", blob);
                        // post the form to backend service (asp.net mvc controller action)
                        $.ajax({
                            url: "/Home/UploadInFormData",
                            data: fd,
                            processData: false,
                            contentType: "multipart/form-data",
                            type: "POST",
                            success: function (result) {
                                if (!result.success) {
                                    alert(result.error);
                                }
                                callback(null, block.index);
                            }
                        });
                    });
                });
            }
        });

    Then we invoke these functions through async.js, and once all of them have executed successfully we invoke another ajax call to the backend service to commit all these chunks (blocks) as one blob in Windows Azure Storage.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...
                // define the function array and push all chunk upload operation into this array
                ... ...
                // invoke the functions one by one
                // then invoke the commit ajax call to put blocks into blob in azure storage
                async.series(putBlocks, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    That's all on the client side. The outline of our logic:
    - Calculate the start and end byte index for each chunk based on the block size.
    - Define the functions that read each chunk from the file and upload its content to the backend service through ajax.
    - Execute the functions defined in the previous step with "async.js".
    - Finally, commit the chunks by invoking the backend service, which puts the blocks together as a blob in Windows Azure Storage.

    Save Chunks as Blocks into Blob Storage

    Above we finished the client-side JavaScript code, which uploads the file in chunks to the backend service that we are going to implement in this step. We will use ASP.NET MVC as our backend service; it receives the chunks, uploads them into Windows Azure Blob Storage as blocks, and finally commits them as one blob. Since the client uploads chunks by invoking ajax calls to the URL "/Home/UploadInFormData", I created a new action on the Home controller that only accepts HTTP POST requests.

        [HttpPost]
        public JsonResult UploadInFormData()
        {
            var error = string.Empty;
            try
            {
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Then I retrieved the file name, index and chunk content from the Request.Form object, which was passed from our client side.
    And then I used the Windows Azure SDK to create a blob container (in this case named "test") and a blob reference with the blob name (the same as the file name), and uploaded the chunk as a block of this blob with the index. In Blob Storage each block must have an ID associated with it, so that finally we can put all the blocks together as one blob by specifying their block ID list.

        [HttpPost]
        public JsonResult UploadInFormData()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var index = int.Parse(Request.Form["index"]);
                var file = Request.Files[0];
                var id = Convert.ToBase64String(BitConverter.GetBytes(index));

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlock(id, file.InputStream, null);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Next, I created another action to commit the blocks into the blob once all chunks have been uploaded. Similarly, I retrieved the blob name from Request.Form, along with the chunk ID list (the block ID list) in string format; I split it into a list, then invoked the BlockBlob.PutBlockList method. After that, our blob is visible in the container and ready to be downloaded.

        [HttpPost]
        public JsonResult Commit()
        {
            var error = string.Empty;
            try
            {
                var name = Request.Form["name"];
                var list = Request.Form["list"];
                var ids = list
                    .Split(',')
                    .Where(id => !string.IsNullOrWhiteSpace(id))
                    .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                    .ToArray();

                var container = _client.GetContainerReference("test");
                container.CreateIfNotExists();
                var blob = container.GetBlockBlobReference(name);
                blob.PutBlockList(ids);
            }
            catch (Exception e)
            {
                error = e.ToString();
            }

            return new JsonResult()
            {
                Data = new
                {
                    success = string.IsNullOrWhiteSpace(error),
                    error = error
                }
            };
        }

    Now we have finished all the code we need. Below is the full client-side JavaScript code.
1: <script type="text/javascript" src="~/Scripts/async.js"></script> 2: <script type="text/javascript"> 3: $(function () { 4: $("#upload_button_blob").click(function () { 5: // assert the browser support html5 6: if (window.File && window.Blob && window.FormData) { 7: alert("Your brwoser is awesome, let's rock!"); 8: } 9: else { 10: alert("Oh man plz update to a modern browser before try is cool stuff out."); 11: return; 12: } 13:  14: // start to upload each files in chunks 15: var files = $("#upload_files")[0].files; 16: for (var i = 0; i < files.length; i++) { 17: var file = files[i]; 18: var fileSize = file.size; 19: var fileName = file.name; 20:  21: // calculate the start and end byte index for each blocks(chunks) 22: // with the index, file name and index list for future using 23: var blockSizeInKB = $("#block_size").val(); 24: var blockSize = blockSizeInKB * 1024; 25: var blocks = []; 26: var offset = 0; 27: var index = 0; 28: var list = ""; 29: while (offset < fileSize) { 30: var start = offset; 31: var end = Math.min(offset + blockSize, fileSize); 32:  33: blocks.push({ 34: name: fileName, 35: index: index, 36: start: start, 37: end: end 38: }); 39: list += index + ","; 40:  41: offset = end; 42: index++; 43: } 44:  45: // define the function array and push all chunk upload operation into this array 46: var putBlocks = []; 47: blocks.forEach(function (block) { 48: putBlocks.push(function (callback) { 49: // load blob based on the start and end index for each chunks 50: var blob = file.slice(block.start, block.end); 51: // put the file name, index and blob into a temporary from 52: var fd = new FormData(); 53: fd.append("name", block.name); 54: fd.append("index", block.index); 55: fd.append("file", blob); 56: // post the form to backend service (asp.net mvc controller action) 57: $.ajax({ 58: url: "/Home/UploadInFormData", 59: data: fd, 60: processData: false, 61: contentType: "multipart/form-data", 62: type: "POST", 63: success: function (result) { 64: if (!result.success) { 65: alert(result.error); 66: } 67: callback(null, block.index); 68: } 69: }); 70: }); 71: }); 72:  73: // invoke the functions one by one 74: // then invoke the commit ajax call to put blocks into blob in azure storage 75: async.series(putBlocks, function (error, result) { 76: var data = { 77: name: fileName, 78: list: list 79: }; 80: $.post("/Home/Commit", data, function (result) { 81: if (!result.success) { 82: alert(result.error); 83: } 84: else { 85: alert("done!"); 86: } 87: }); 88: }); 89: } 90: }); 91: }); 92: </script> And below is the full ASP.NET MVC controller code. 
        public class HomeController : Controller
        {
            private CloudStorageAccount _account;
            private CloudBlobClient _client;

            public HomeController()
                : base()
            {
                _account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("DataConnectionString"));
                _client = _account.CreateCloudBlobClient();
            }

            public ActionResult Index()
            {
                ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application.";

                return View();
            }

            [HttpPost]
            public JsonResult UploadInFormData()
            {
                var error = string.Empty;
                try
                {
                    var name = Request.Form["name"];
                    var index = int.Parse(Request.Form["index"]);
                    var file = Request.Files[0];
                    var id = Convert.ToBase64String(BitConverter.GetBytes(index));

                    var container = _client.GetContainerReference("test");
                    container.CreateIfNotExists();
                    var blob = container.GetBlockBlobReference(name);
                    blob.PutBlock(id, file.InputStream, null);
                }
                catch (Exception e)
                {
                    error = e.ToString();
                }

                return new JsonResult()
                {
                    Data = new
                    {
                        success = string.IsNullOrWhiteSpace(error),
                        error = error
                    }
                };
            }

            [HttpPost]
            public JsonResult Commit()
            {
                var error = string.Empty;
                try
                {
                    var name = Request.Form["name"];
                    var list = Request.Form["list"];
                    var ids = list
                        .Split(',')
                        .Where(id => !string.IsNullOrWhiteSpace(id))
                        .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id))))
                        .ToArray();

                    var container = _client.GetContainerReference("test");
                    container.CreateIfNotExists();
                    var blob = container.GetBlockBlobReference(name);
                    blob.PutBlockList(ids);
                }
                catch (Exception e)
                {
                    error = e.ToString();
                }

                return new JsonResult()
                {
                    Data = new
                    {
                        success = string.IsNullOrWhiteSpace(error),
                        error = error
                    }
                };
            }
        }

    If we select a file in the browser, our application uploads chunks of the size we specified to the server through background ajax calls and then commits all the chunks into one blob, which we can then find in our Windows Azure Blob Storage.

    Optimized by Parallel Upload

    In the previous example we just uploaded our file in chunks. This solved both the ASP.NET MVC request content size limitation and the Windows Azure load balancer timeout, but it might introduce a performance problem, since we uploaded the chunks in sequence. To improve upload performance we can modify our client-side code a bit to invoke the upload operations in parallel. The good news is that the "async.js" library provides a parallel execution function. If you remember, the code that invokes the service to upload chunks used "async.series", which means all functions are executed in sequence. Now we change this code to "async.parallel", which invokes all functions in parallel.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...
                // define the function array and push all chunk upload operation into this array
                ... ...
                // invoke the functions in parallel
                // then invoke the commit ajax call to put blocks into blob in azure storage
                async.parallel(putBlocks, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    In this way all chunks are uploaded to the server at the same time to maximize bandwidth usage. This should work if the file is not very large and the chunk size is not very small. But for a large file it might introduce another problem: too many ajax calls sent to the server at the same time. So the best solution is to upload the chunks in parallel with a maximum concurrency limit. The code below sets the concurrency limit to 4, which means at most 4 ajax calls can be in flight at the same time.

        $("#upload_button_blob").click(function () {
            // assert the browser support html5
            ... ...
            // start to upload each files in chunks
            var files = $("#upload_files")[0].files;
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                var fileSize = file.size;
                var fileName = file.name;
                // calculate the start and end byte index for each blocks(chunks)
                // with the index, file name and index list for future using
                ... ...
                // define the function array and push all chunk upload operation into this array
                ... ...
                // invoke the functions in parallel with a concurrency limit
                // then invoke the commit ajax call to put blocks into blob in azure storage
                async.parallelLimit(putBlocks, 4, function (error, result) {
                    var data = {
                        name: fileName,
                        list: list
                    };
                    $.post("/Home/Commit", data, function (result) {
                        if (!result.success) {
                            alert(result.error);
                        }
                        else {
                            alert("done!");
                        }
                    });
                });
            }
        });

    Summary

    In this post we discussed how to upload files in chunks to a backend service and then upload them into Windows Azure Blob Storage in blocks. We focused on the frontend and leveraged three new features introduced in HTML5:
    - File.slice: read part of the file by specifying the start and end byte index.
    - Blob: a file-like interface which contains part of the file content.
    - FormData: a temporary form element through which we can pass a chunk along with some metadata to the backend service.

    Then we discussed the performance considerations of chunk uploading. Sequential upload cannot provide maximum upload speed, but unlimited parallel upload might overwhelm the browser and the server if there are too many chunks. So we finally arrived at the solution of uploading chunks in parallel with a concurrency limit, and demonstrated how to use the "async.js" JavaScript library to control the asynchronous calls and the parallel limit.

    Regarding the chunk size and the parallel limit there is no single "best" value; you need to test various combinations and find the best one for your particular scenario. It depends on the local bandwidth, the client machine's cores, and the server-side (Windows Azure Cloud Service virtual machine) cores, memory and bandwidth. Below is one of my performance test results. The client machine was Windows 8 with IE 10 and 4 cores, on the Microsoft corporate network. The web site was hosted in the Windows Azure China North data center (in Beijing) on one small web role (1 core CPU, 1.75GB memory, 100Mbps bandwidth).
    The test cases were:
    - Chunk size: 512KB, 1MB, 2MB, 4MB.
    - Upload mode: sequential, parallel (unlimited), parallel with limit (4 threads, 8 threads).
    - Chunk format: base64 string, binary.
    - Target file: 100MB.
    - Each case was tested 3 times.

    The results are summarized below. Some thoughts, but not guidance or best practice:
    - Parallel gets better performance than sequential.
    - There is no significant performance improvement between parallel with 4 threads and 8 threads.
    - Transferring binary data performs better than base64.
    - In all cases, a chunk size of 1MB - 2MB gets better performance.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.
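    One closing aside: the post calls out pause-resume as a benefit of block blobs but never implements it. A minimal server-side sketch of the resume half (an illustration under assumptions: UploadedBlocks is a hypothetical action name, and the exact DownloadBlockList overload varies between storage client library versions) would let the client ask which block indexes already exist and skip them:

        [HttpPost]
        public JsonResult UploadedBlocks()
        {
            // Report the uncommitted blocks already stored for this blob so an
            // interrupted upload can resume instead of re-sending every chunk.
            var name = Request.Form["name"];
            var container = _client.GetContainerReference("test");
            container.CreateIfNotExists();
            var blob = container.GetBlockBlobReference(name);

            // Decode the base64 block IDs back into the integer indexes the
            // client generated. (A real implementation would also handle the
            // case where the blob has no uncommitted blocks yet.)
            var indexes = blob.DownloadBlockList(BlockListingFilter.Uncommitted)
                .Select(block => BitConverter.ToInt32(Convert.FromBase64String(block.Name), 0))
                .ToArray();

            return new JsonResult() { Data = new { blocks = indexes } };
        }

    On the client, the chunk loop would then drop any block whose index appears in this list before building the putBlocks array.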

    Read the article

  • Could not load type 'Default.DataMatch' in DataMatch.aspx file

    - by salvationishere
    I am developing a C# VS 2008 / SQL Server 2008 website, but now I am getting the above error when I build it. I included the Default.aspx, Default.aspx.cs, DataMatch.aspx, and DataMatch.aspx.cs files below. What do I need to do to fix this? Default.aspx: <%@ Page Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true" CodeFile="Default.aspx.cs" Inherits="_Default" Title="Untitled Page" %> ... DataMatch.aspx: <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="DataMatch.aspx.cs" Inherits="_Default.DataMatch" %> ... Default.aspx.cs: using System; using System.Collections; using System.Configuration; using System.Data; using System.Linq; using System.Web; using System.Web.Security; using System.Web.UI; using System.Web.UI.HtmlControls; using System.Web.UI.WebControls; using System.Web.UI.WebControls.WebParts; using System.Xml.Linq; using System.Collections.Generic; using System.IO; using System.Drawing; using System.ComponentModel; using System.Data.SqlClient; using ADONET_namespace; using System.Security.Principal; //using System.Windows; public partial class _Default : System.Web.UI.Page //namespace AddFileToSQL { //protected System.Web.UI.HtmlControls.HtmlInputFile uploadFile; //protected System.Web.UI.HtmlControls.HtmlInputButton btnOWrite; //protected System.Web.UI.HtmlControls.HtmlInputButton btnAppend; protected System.Web.UI.WebControls.Label Label1; protected static string inputfile = ""; public static string targettable; public static string selection; // Number of controls added to view state protected int default_NumberOfControls { get { if (ViewState["default_NumberOfControls"] != null) { return (int)ViewState["default_NumberOfControls"]; } else { return 0; } } set { ViewState["default_NumberOfControls"] = value; } } protected void uploadFile_onclick(object sender, EventArgs e) { } protected void Load_GridData() { //GridView1.DataSource = ADONET_methods.DisplaySchemaTables(); //GridView1.DataBind(); } protected void btnOWrite_Click(object sender, EventArgs e) { if (uploadFile.PostedFile.ContentLength > 0) { feedbackLabel.Text = "You do not have sufficient access to overwrite table records."; } else { feedbackLabel.Text = "This file does not contain any data."; } } protected void btnAppend_Click(object sender, EventArgs e) { string fullpath = Page.Request.PhysicalApplicationPath; string path = uploadFile.PostedFile.FileName; if (File.Exists(path)) { // Create a file to write to. 
try { StreamReader sr = new StreamReader(path); string s = ""; while (sr.Peek() > 0) s = sr.ReadLine(); sr.Close(); } catch (IOException exc) { Console.WriteLine(exc.Message + "Cannot open file."); return; } } if (uploadFile.PostedFile.ContentLength > 0) { inputfile = System.IO.File.ReadAllText(path); Session["Message"] = inputfile; Response.Redirect("DataMatch.aspx"); } else { feedbackLabel.Text = "This file does not contain any data."; } } protected void Page_Load(object sender, EventArgs e) { if (Request.IsAuthenticated) { WelcomeBackMessage.Text = "Welcome back, " + User.Identity.Name + "!"; // Reference the CustomPrincipal / CustomIdentity CustomIdentity ident = User.Identity as CustomIdentity; if (ident != null) WelcomeBackMessage.Text += string.Format(" You are the {0} of {1}.", ident.Title, ident.CompanyName); AuthenticatedMessagePanel.Visible = true; AnonymousMessagePanel.Visible = false; if (!Page.IsPostBack) { Load_GridData(); } } else { AuthenticatedMessagePanel.Visible = false; AnonymousMessagePanel.Visible = true; } } protected void GridView1_SelectedIndexChanged(object sender, EventArgs e) { GridViewRow row = GridView1.SelectedRow; targettable = row.Cells[2].Text; } } DataMatch.aspx.cs: using System; using System.Collections; using System.Collections.Generic; using System.Data; using System.Data.SqlClient; using System.Diagnostics; using System.Web; using System.Web.UI; using System.Web.UI.WebControls; using ADONET_namespace; //using MatrixApp; //namespace AddFileToSQL //{ public partial class DataMatch : AddFileToSQL._Default { protected System.Web.UI.WebControls.PlaceHolder phTextBoxes; protected System.Web.UI.WebControls.PlaceHolder phDropDownLists; protected System.Web.UI.WebControls.Button btnAnotherRequest; protected System.Web.UI.WebControls.Panel pnlCreateData; protected System.Web.UI.WebControls.Literal lTextData; protected System.Web.UI.WebControls.Panel pnlDisplayData; protected static string inputfile2; static string[] headers = null; static string[] data = null; static string[] data2 = null; static DataTable myInputFile = new DataTable("MyInputFile"); static string[] myUserSelections; static bool restart = false; private DropDownList[] newcol; int @temp = 0; string @tempS = ""; string @tempT = ""; // a Property that manages a counter stored in ViewState protected int NumberOfControls { get { return (int)ViewState["NumControls"]; } set { ViewState["NumControls"] = value; } } private Hashtable ddl_ht { get { return (Hashtable)ViewState["ddl_ht"]; } set { ViewState["ddl_ht"] = value; } } // Page Load private void Page_Load(object sender, System.EventArgs e) { if (!Page.IsPostBack) { ddl_ht = new Hashtable(); this.NumberOfControls = 0; } } // This data comes from input file private void PopulateFileInputTable() { myInputFile.Columns.Clear(); string strInput, newrow; string[] oneRow; DataColumn myDataColumn; DataRow myDataRow; int result, numRows; //Read the input file strInput = Session["Message"].ToString(); data = strInput.Split('\r'); //Headers headers = data[0].Split('|'); //Data for (int i = 0; i < data.Length; i++) { newrow = data[i].TrimStart('\n'); data[i] = newrow; } result = String.Compare(data[data.Length - 1], ""); numRows = data.Length; if (result == 0) { numRows = numRows - 1; } data2 = new string[numRows]; for (int a = 0, b = 0; a < numRows; a++, b++) { data2[b] = data[a]; } // Create columns for (int col = 0; col < headers.Length; col++) { @temp = (col + 1); @tempS = @temp.ToString(); @tempT = "@col"+ @temp.ToString(); myDataColumn = new 
DataColumn(); myDataColumn.DataType = Type.GetType("System.String"); myDataColumn.ColumnName = headers[col]; myInputFile.Columns.Add(myDataColumn); ddl_ht.Add(@tempT, headers[col]); } // Create new DataRow objects and add to DataTable. for (int r = 0; r < numRows - 1; r++) { oneRow = data2[r + 1].Split('|'); myDataRow = myInputFile.NewRow(); for (int c = 0; c < headers.Length; c++) { myDataRow[c] = oneRow[c]; } myInputFile.Rows.Add(myDataRow); } NumberOfControls = headers.Length; myUserSelections = new string[NumberOfControls]; } //Create display panel private void CreateDisplayPanel() { btnSubmit.Style.Add("top", "auto"); btnSubmit.Style.Add("left", "auto"); btnSubmit.Style.Add("position", "absolute"); btnSubmit.Style.Add("top", "200px"); btnSubmit.Style.Add("left", "400px"); newcol = CreateDropDownLists(); for (int counter = 0; counter < NumberOfControls; counter++) { pnlDisplayData.Controls.Add(newcol[counter]); pnlDisplayData.Controls.Add(new LiteralControl("<br><br><br>")); pnlDisplayData.Visible = true; pnlDisplayData.FindControl(newcol[counter].ID); } } //Recreate display panel private void RecreateDisplayPanel() { btnSubmit.Style.Add("top", "auto"); btnSubmit.Style.Add("left", "auto"); btnSubmit.Style.Add("position", "absolute"); btnSubmit.Style.Add("top", "200px"); btnSubmit.Style.Add("left", "400px"); newcol = RecreateDropDownLists(); for (int counter = 0; counter < NumberOfControls; counter++) { pnlDisplayData.Controls.Add(newcol[counter]); pnlDisplayData.Controls.Add(new LiteralControl("<br><br><br>")); pnlDisplayData.Visible = true; pnlDisplayData.FindControl(newcol[counter].ID); } } // Add DropDownList Control to Placeholder private DropDownList[] CreateDropDownLists() { DropDownList[] dropDowns = new DropDownList[NumberOfControls]; for (int counter = 0; counter < NumberOfControls; counter++) { DropDownList ddl = new DropDownList(); SqlDataReader dr2 = ADONET_methods.DisplayTableColumns(targettable); ddl.ID = "DropDownListID" + counter.ToString(); int NumControls = targettable.Length; DataTable dt = new DataTable(); dt.Load(dr2); ddl.DataValueField = "COLUMN_NAME"; ddl.DataTextField = "COLUMN_NAME"; ddl.DataSource = dt; ddl.SelectedIndexChanged += new EventHandler(ddlList_SelectedIndexChanged); ddl.DataBind(); ddl.AutoPostBack = true; ddl.EnableViewState = true; //Preserves View State info on Postbacks dr2.Close(); ddl.Items.Add("IGNORE"); dropDowns[counter] = ddl; } return dropDowns; } protected void ddlList_SelectedIndexChanged(object sender, EventArgs e) { DropDownList ddl = (DropDownList)sender; string ID = ddl.ID; } // Add TextBoxes Control to Placeholder private DropDownList[] RecreateDropDownLists() { DropDownList[] dropDowns = new DropDownList[NumberOfControls]; for (int counter = 0; counter < NumberOfControls; counter++) { DropDownList ddl = new DropDownList(); SqlDataReader dr2 = ADONET_methods.DisplayTableColumns(targettable); ddl.ID = "DropDownListID" + counter.ToString(); int NumControls = targettable.Length; DataTable dt = new DataTable(); dt.Load(dr2); ddl.DataValueField = "COLUMN_NAME"; ddl.DataTextField = "COLUMN_NAME"; ddl.DataSource = dt; ddl.SelectedIndexChanged += new EventHandler(ddlList_SelectedIndexChanged); ddl.DataBind(); ddl.AutoPostBack = true; ddl.EnableViewState = false; //Preserves View State info on Postbacks dr2.Close(); ddl.Items.Add("IGNORE"); dropDowns[counter] = ddl; } return dropDowns; } private void CreateLabels() { for (int counter = 0; counter < NumberOfControls; counter++) { Label lbl = new Label(); lbl.ID = "Label" + 
counter.ToString(); lbl.Text = headers[counter]; lbl.Style["position"] = "absolute"; lbl.Style["top"] = 60 * counter + 10 + "px"; lbl.Style["left"] = 250 + "px"; pnlDisplayData.Controls.Add(lbl); pnlDisplayData.Controls.Add(new LiteralControl("<br><br><br>")); } } // Add TextBoxes Control to Placeholder private void RecreateLabels() { for (int counter = 0; counter < NumberOfControls; counter++) { Label lbl = new Label(); lbl.ID = "Label" + counter.ToString(); lbl.Text = headers[counter]; lbl.Style["position"] = "absolute"; lbl.Style["top"] = 60 * counter + 10 + "px"; lbl.Style["left"] = 250 + "px"; pnlDisplayData.Controls.Add(lbl); pnlDisplayData.Controls.Add(new LiteralControl("<br><br><br>")); } } // Create TextBoxes and DropDownList data here on postback. protected override void CreateChildControls() { // create the child controls if the server control does not contains child controls this.EnsureChildControls(); // Creates a new ControlCollection. this.CreateControlCollection(); // Here we are recreating controls to persist the ViewState on every post back if (Page.IsPostBack) { RecreateDisplayPanel(); RecreateLabels(); } // Create these conrols when asp.net page is created else { PopulateFileInputTable(); CreateDisplayPanel(); CreateLabels(); } // Prevent dropdownlists and labels from being created again. if (restart == false) { this.ChildControlsCreated = true; } else if (restart == true) { this.ChildControlsCreated = false; } } private void AppendRecords() { switch (targettable) { case "ContactType": for (int r = 0; r < myInputFile.Rows.Count; r++) { resultLabel.Text = ADONET_methods.AppendDataCT(myInputFile.Rows[r], ddl_ht); } break; case "Contact": for (int r = 0; r < myInputFile.Rows.Count; r++) { resultLabel.Text = ADONET_methods.AppendDataC(myInputFile.Rows[r], ddl_ht); } break; case "AddressType": for (int r = 0; r < myInputFile.Rows.Count; r++) { resultLabel.Text = ADONET_methods.AppendDataAT(myInputFile.Rows[r], ddl_ht); } break; default: resultLabel.Text = "You do not have access to modify this table. Please select a different target table and try again."; restart = true; break; //throw new ArgumentOutOfRangeException("targettable type", targettable); } } // Read all the data from TextBoxes and DropDownLists protected void btnSubmit_Click(object sender, System.EventArgs e) { //int cnt = FindOccurence("DropDownListID"); AppendRecords(); pnlDisplayData.Visible = false; btnSubmit.Visible = false; resultLabel.Attributes.Add("style", "align:center"); btnSubmit.Style.Add("top", "auto"); btnSubmit.Style.Add("left", "auto"); btnSubmit.Style.Add("position", "absolute"); int bSubmitPosition = NumberOfControls; btnSubmit.Style.Add("top", System.Convert.ToString(bSubmitPosition)+"px"); resultLabel.Visible = true; Instructions.Visible = false; if (restart == true) { CreateChildControls(); } } private int FindOccurence(string substr) { string reqstr = Request.Form.ToString(); return ((reqstr.Length - reqstr.Replace(substr, "").Length) / substr.Length); } #region Web Form Designer generated code override protected void OnInit(EventArgs e) { // // CODEGEN: This call is required by the ASP.NET Web Form Designer. // InitializeComponent(); base.OnInit(e); } /// <summary> /// Required method for Designer support - do not modify /// the contents of this method with the code editor. /// </summary> private void InitializeComponent() { } #endregion } //}
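    Judging only from the directives quoted above, the likely cause is a mismatch: DataMatch.aspx uses the Web Application style CodeBehind attribute with Inherits="_Default.DataMatch", but the code-behind declares public partial class DataMatch in no namespace (Default.aspx, by contrast, uses the Web Site style CodeFile attribute). A sketch of the fix, assuming this is a Web Site project like Default.aspx, is to make the directive point at the class that actually exists:

        <%@ Page Language="C#" AutoEventWireup="true"
            CodeFile="DataMatch.aspx.cs" Inherits="DataMatch" %>

    If the project is really a Web Application, keep CodeBehind but change Inherits to DataMatch (or move the class into a _Default namespace) and rebuild so the type can be resolved.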

    Read the article

  • Using Web Services from an XNA 4.0 WP7 Game

    - by Michael Cummings
Now that the Windows Phone 7 development tools have been out for a while, let's talk about how you can use them. Windows Phone 7 (WP7) has two application types that you can create, either Silverlight or XNA, and you can't really mix the two together. The development environment for WP7 is a special edition of Visual Studio 2010 called Visual Studio 2010 Express for Windows Phone. This edition will be installed with the WP7 tools, even if you have a full edition of VS2010 already installed. While you can use your full edition of VS2010 to do WP7 development, this astute developer has noticed that there are a few things you can only do in the Express for Windows Phone edition. So let's start by discussing WP7 networking. On the WP7 platform, the only networking available is through web services using WCF or, if you're really masochistic, through the WebClient to do raw HTTP. In Silverlight, it's fairly easy to wire up a WCF proxy to call a web service and get some data. In XNA projects, not so much.

Create WCF Service
First, we'll create the service that will return some information we need in our game. Open Visual Studio 2010 and create a new WCF Web Service project. We'll use the default implementation, as we only need to see how to use a service; we are not interested in creating a really cool service at this point. However, you may want to follow the instructions in the comments of Service1.svc.cs to change the name to something better; I used DataService, and IDataService for the interface. You should now be able to run the project, and the WCF Test Client will load and properly enumerate your service. At this point we have a functional service that can be consumed by our XNA game.

Consume the WCF Service
Open Visual Studio 2010 Express for Windows Phone and create a new XNA Game Studio 4.0 Windows Phone Game project. Now if you try to add a service reference to the project, you'll notice that the option is not available. However, if you add a Silverlight application to your solution, you'll notice that you can create a service reference there. So, using the Silverlight project, we can create the service reference. Unfortunately you can't reference the Silverlight project from the XNA Game project, so using Windows Explorer copy the Service References folder from the Silverlight project directory to the XNA Game project directory, then add the folder to your XNA Game project. You'll need to set the Build Action property to None for all the files except Reference.cs, which should be Compile. Truly, we only need Reference.cs, but I find it easier to copy the whole folder. If you try to compile at this point, you'll notice that we are missing a couple of references: System.Runtime.Serialization, System.Net, and System.ServiceModel. Add these to the XNA Game project and you should build successfully. You'll also need to copy the ServiceReference.ClientConfig file and add it to your project; the WCF infrastructure looks for this file and will complain if it can't find it. Set its Copy to Output Directory property to Copy if Newer.

We now need to add the code to call the service and display the results on the screen. Go ahead and add a SpriteFont resource to the Content project and load it in the Game project. There's nothing here that's changed much from 3.1, other than that your Content project is now under the Solution node and not the Project node. While you're at it, add a string field to store the result of the service call, and initialize it to string.Empty.
Then, in the Draw method, write the string out to the screen, but only if it does not equal string.Empty. Now, to wrap this up, let's create a new field of the type DataServiceClient. In the Initialize method, create a new instance of this type using its default constructor; then in LoadContent we can call the service. Since we can only call the GetData method of our service asynchronously, we need to set up a Completed event handler first. Thankfully, Visual Studio helps out a lot here: just use the Tab key to generate the handler stub it suggests. In the GetDataAsyncCompleted event handler, assign the service result (e.Result) to your string field. If you run your game, you should see the service result drawn on the screen. Enjoy!
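Pulling those steps together, here is a minimal sketch of the pattern described above. It is illustrative only: DataServiceClient, GetDataAsync, GetDataCompleted, and GetDataCompletedEventArgs are the names the service-reference generator typically produces for a DataService like the one in this walkthrough, and "GameFont" is a placeholder asset name; your generated proxy and content names may differ.

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class Game1 : Microsoft.Xna.Framework.Game
{
    GraphicsDeviceManager graphics;
    SpriteBatch spriteBatch;
    SpriteFont font;
    DataServiceClient client;               // the generated WCF proxy
    string serviceResult = string.Empty;    // holds the service response

    public Game1()
    {
        graphics = new GraphicsDeviceManager(this);
        Content.RootDirectory = "Content";
    }

    protected override void Initialize()
    {
        client = new DataServiceClient();   // default constructor
        base.Initialize();
    }

    protected override void LoadContent()
    {
        spriteBatch = new SpriteBatch(GraphicsDevice);
        font = Content.Load<SpriteFont>("GameFont");   // placeholder asset name

        // Wire the Completed handler before kicking off the async call.
        client.GetDataCompleted += Client_GetDataCompleted;
        client.GetDataAsync(42);             // the default service template takes an int
    }

    void Client_GetDataCompleted(object sender, GetDataCompletedEventArgs e)
    {
        serviceResult = e.Result;            // store the returned string
    }

    protected override void Draw(GameTime gameTime)
    {
        GraphicsDevice.Clear(Color.CornflowerBlue);
        if (serviceResult != string.Empty)   // only draw once we have a result
        {
            spriteBatch.Begin();
            spriteBatch.DrawString(font, serviceResult, new Vector2(10, 10), Color.White);
            spriteBatch.End();
        }
        base.Draw(gameTime);
    }
}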

    Read the article

  • Integrate Microsoft Translator into your ASP.Net application

    - by sreejukg
In this article I am going to explain how easily you can integrate the Microsoft Translator API into your ASP.NET application.

Why do we need a translation API? Once you publish a website, you open a channel to a global audience, so making the web content available in only one language doesn't reach all of your users. Especially when you are offering products or services, it is important to provide content in multiple languages; users will be more comfortable when they see content in their native language. How do you achieve this? Hiring translators to translate the content into all of your users' languages will cost you a lot of money, and it is not a one-time job; you need to translate content on the go. What is the alternative? Machine translation. Thankfully there are translation engines available that give you API-level access, so you can automatically translate content and display it to the user.

The Microsoft Translator API is an excellent set of web service APIs that allows developers to use machine translation technology in their own applications. The Microsoft Translator API is offered through the Windows Azure Marketplace. In order to access the data services published in the Windows Azure Marketplace, you need to have an account. The registration process is simple, and it is common to all the services offered through the marketplace. Last year I wrote an article about the Bing Search API, where I covered the registration process. You can refer to the article here. http://weblogs.asp.net/sreejukg/archive/2012/07/04/integrate-bing-search-api-to-asp-net-application.aspx

Once you have registered with the Windows Azure Marketplace, you will get your App ID. Now you can visit the Microsoft Translator page and click on the sign up button. http://datamarket.azure.com/dataset/bing/microsofttranslator

As you can see, there are several options available for you to subscribe to. There is a free version available - great. Click on the sign up button under the package that suits you. Clicking on the sign up button will bring up the sign up form, where you need to agree to the terms and conditions and go ahead. You need to have a Windows Live account in order to sign up for any service available in the Windows Azure Marketplace. Once you have signed up successfully, you will receive the thank you page. You can download the C# class library from here so that the integration can be done without writing much code. The C# file name is TranslatorContainer.cs. At any point in time, you can visit https://datamarket.azure.com/account/datasets to see the applications you are subscribed to. Clicking on the Use link next to each service will give you the details of the application. You need to note the primary account key and the URL of the service to use in your application.

Now let us start our ASP.NET project. I have created an empty ASP.NET web application using Visual Studio 2010 and named it Translator Sample (any name will work). By default, the web application in Solution Explorer looks as follows. Now right-click the project, select Add -> Existing Item, and browse to TranslatorContainer.cs. Next, let us create a page where the user enters some text and performs the translation. I have added a new web form to the project with the name Translate.aspx. I placed one TextBox control for the user to type the text to translate, a dropdown list to select the target language, a label to display the translated text, and a button to perform the translation.
For the dropdown list I have selected some languages supported by Microsoft Translator. You can get all the supported languages with their codes from the link below. http://msdn.microsoft.com/en-us/library/hh456380.aspx The form looks as below in the design surface of Visual Studio. All the class libraries in the Windows Azure Marketplace require a reference to System.Data.Services.Client, so let us add that reference. You can find the documentation for how to use the downloaded class library at the link below. http://msdn.microsoft.com/en-us/library/gg312154.aspx

Let us examine the TranslatorContainer.cs file. The code is self-explanatory. Note the namespace used (Microsoft); you need to add the namespace reference to your page. I have added the following event handler for the translate button. You create an object of the TranslatorContainer class by passing the translation service URL, then set the credentials for your TranslatorContainer object, which will be your account key. The TranslatorContainer supports a method that accepts a text input, a source language, and a destination language, and returns DataServiceQuery<Translation>. Let us see this working. I ran the application, entered "Good Morning" in the textbox, selected a target language, and saw the output as follows. It is easy to build great translator applications using the Microsoft Translator API, and there is a reasonable amount of translation you can perform in your application for free. For enterprises, you can subscribe to the appropriate package and make your application multi-lingual.
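Since the screenshots of the handler do not survive in text form, here is a hedged sketch of what the translate button's click handler might look like. The ACCOUNT_KEY placeholder, the control IDs (txtText, ddlLanguage, lblTranslated), and the Translate() parameter order are assumptions of mine; verify the parameter order against the downloaded TranslatorContainer.cs.

// Illustrative sketch only; requires: using System; using System.Linq; using System.Net;
protected void btnTranslate_Click(object sender, EventArgs e)
{
    // The service URL shown on the dataset page.
    var serviceUri = new Uri("https://api.datamarket.azure.com/Bing/MicrosoftTranslator/");
    var container = new Microsoft.TranslatorContainer(serviceUri);

    // The account key doubles as user name and password for marketplace services.
    container.Credentials = new NetworkCredential("ACCOUNT_KEY", "ACCOUNT_KEY");

    // The article describes Translate(text, sourceLanguage, destinationLanguage);
    // check the generated signature before relying on this order.
    var query = container.Translate(txtText.Text, "en", ddlLanguage.SelectedValue);
    var result = query.Execute().FirstOrDefault();
    if (result != null)
    {
        lblTranslated.Text = result.Text;
    }
}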

    Read the article

  • Create Orchard Module in a Separate Project

    - by Steve Michelotti
The Orchard Project is a new OSS Microsoft project being developed up on CodePlex. From the Orchard home page on CodePlex, it states: "Orchard project is focused on delivering a .NET-based CMS application that will allow users to rapidly create content-driven Websites, and an extensibility framework that will allow developers and customizers to provide additional functionality through modules and themes." The Orchard Project site contains additional information, including documentation and walkthroughs. The ability to create a composite solution based on a collection of modules is a compelling feature. In Orchard, these modules can be created as simple MVC Areas, or they can be created inside stand-alone web application projects. The walkthrough for writing an Orchard module that is available on the Orchard site uses a simple Area created inside the host application. It is based on the Orchard MIX presentation. This walkthrough does an effective job introducing various Orchard concepts such as hooking into the navigation system, the theme/layout system, content types, and more. However, creating an Orchard module in a separate project does not seem to be concisely documented anywhere. Orchard ships with several modules OOTB that are in separate assemblies – but again, it's not well documented how to get started building one from scratch. The following are the steps I took to successfully get an Orchard module in a separate project up and running.

Step 1 – Download the OrchardIIS.zip file from the Orchard Release page. Unzip and open up the solution.

Step 2 – Add your project to the solution. I named my project "Orchard.Widget" and used an "MVC 2 Empty Web Application" project type. Make sure you put the physical path inside the "Modules" sub-folder of the main project, like this: At this point the solution should look like:

Step 3 – Add assembly references to Orchard.dll and Orchard.Core.dll.

Step 4 – Add a controller and view. I'll just create a Hello World controller and view. Notice I created the view as a partial view (*.ascx). Also add the [Themed] attribute to the top of the HomeController class, just like the normal Orchard walkthrough shows. (A hedged sketch of such a controller appears at the end of this post.)

Step 5 – Add Module.txt to the project root. This is a very important step: Orchard will not recognize your module without this text file present. It can contain just the name of your module:

name: Widget

Step 6 – Add Routes.cs. Notice I've given an area name of "Orchard.Widget" on lines 26 and 33.

1: using System;
2: using System.Collections.Generic;
3: using System.Web.Mvc;
4: using System.Web.Routing;
5: using Orchard.Mvc.Routes;
6:
7: namespace Orchard.Widget
8: {
9:     public class Routes : IRouteProvider
10:     {
11:         public void GetRoutes(ICollection<RouteDescriptor> routes)
12:         {
13:             foreach (var routeDescriptor in GetRoutes())
14:             {
15:                 routes.Add(routeDescriptor);
16:             }
17:         }
18:
19:         public IEnumerable<RouteDescriptor> GetRoutes()
20:         {
21:             return new[] {
22:                 new RouteDescriptor {
23:                     Route = new Route(
24:                         "Widget/{controller}/{action}/{id}",
25:                         new RouteValueDictionary {
26:                             {"area", "Orchard.Widget"},
27:                             {"controller", "Home"},
28:                             {"action", "Index"},
29:                             {"id", ""}
30:                         },
31:                         new RouteValueDictionary(),
32:                         new RouteValueDictionary {
33:                             {"area", "Orchard.Widget"}
34:                         },
35:                         new MvcRouteHandler())
36:                 }
37:             };
38:         }
39:     }
40: }

Step 7 – Add MainMenu.cs. This will make sure that an item called "Widget" appears in the main menu and points to the module.
using System;
using Orchard.UI.Navigation;

namespace Orchard.Widget
{
    public class MainMenu : INavigationProvider
    {
        public void GetNavigation(NavigationBuilder builder)
        {
            builder.Add(menu => menu.Add("Widget", item => item.Action("Index", "Home", new
            {
                area = "Orchard.Widget"
            })));
        }

        public string MenuName
        {
            get { return "main"; }
        }
    }
}

Step 8 – Clean up web.config. By default, Visual Studio adds numerous sections to the web.config. The sections that can be removed are: appSettings, connectionStrings, authentication, membership, profile, and roleManager.

Step 9 – Delete Global.asax. This project will ultimately be running from inside the Orchard host, so this "sub-site" should not have its own Global.asax.

Now you're ready to run the app. When you first run it, the "Widget" menu item will appear in the main menu because of the MainMenu.cs file we added: We can then click the "Widget" link in the main menu to send us over to our view:

Packaging
From start to finish, it's a relatively painless experience, but it could be better. For example, a Visual Studio project template that encapsulates aspects from this blog post would definitely make it a lot easier to get up and running with creating an Orchard module. Another aspect I found interesting is that if you read the first paragraph of the walkthrough, it says, "You can also develop modules as separate projects, to be packaged and shared with other users of Orchard CMS (the packaging story is still to be defined, along with marketplaces for sharing modules)." In particular, I will be extremely curious to see how the "packaging story" evolves. The first thing that comes to mind for me is: what if we explored MvcContrib Portable Areas as a potential mechanism for this packaging? This would certainly make things easy, since all artifacts (aspx, ascx, images, css, javascript) are wrapped up into a single assembly. Granted, Orchard does have its own infrastructure for layouts and themes, but it seems like integrating portable areas into this pipeline would not be a difficult undertaking. Maybe that'll be the next research task. :)
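As referenced in step 4, here is a hedged sketch of the minimal controller. It is illustrative rather than the post's literal code, and the exact namespace of ThemedAttribute can differ between early Orchard builds.

// Hypothetical minimal controller for step 4; [Themed] lets Orchard wrap
// the partial view's output in the active theme's layout.
using System.Web.Mvc;
using Orchard.Themes; // namespace of ThemedAttribute may differ by version

namespace Orchard.Widget.Controllers
{
    [Themed]
    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            // Renders the partial view Views/Home/Index.ascx
            return View();
        }
    }
}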

    Read the article

  • WPF Databinding- Part 2 of 3

    - by Shervin Shakibi
This is a follow-up to my previous post, WPF Databinding - Not Your Father's Databinding (Part 1 of 3). You can download the source code here: http://ssccinc.com/wpfdatabinding.zip

Example 04
In this example we demonstrate the use of default properties, and also binding to an instance of an object that is part of a collection bound to its container. This is actually not as complicated as it sounds. First of all, let's take a look at our Employee class. Notice we have overridden the ToString method, which will return the employee's first name, last name, and employee number in parentheses:

public override string ToString()
{
    return String.Format("{0} {1} ({2})", FirstName, LastName, EmployeeNumber);
}

In our XAML we have set the ItemsSource of the list box to just "Binding", and the Grid that contains it has its DataContext set to a collection of our Employee objects:

DataContext="{StaticResource myEmployeeList}">
…
<ListBox Name="employeeListBox" ItemsSource="{Binding }" Grid.Row="0" />

The ToString method for each instance will get executed, and the following is the result. If we did not have a ToString, the list box would look like this: Now let's take a look at the grid that displays the details when someone clicks on an item. The grid has the following DataContext:

DataContext="{Binding ElementName=employeeListBox, Path=SelectedItem}">

This means it is bound to a specific instance of the Employee object. Within the grid we have textboxes that are bound to different properties of our class:

<TextBox Grid.Row="0" Grid.Column="1" Text="{Binding Path=FirstName}" />
<TextBox Grid.Row="1" Grid.Column="1" Text="{Binding Path=LastName}" />
<TextBox Grid.Row="2" Grid.Column="1" Text="{Binding Path=Title}" />
<TextBox Grid.Row="3" Grid.Column="1" Text="{Binding Path=Department}" />

Example 05
This project demonstrates the use of ObservableCollection and the INotifyPropertyChanged interface. Let's take a look at Employee.cs first; notice it implements the INotifyPropertyChanged interface. Now scroll down and notice that for each setter there is a call to the OnPropertyChanged method, which fires the event notifying that the value of that specific property has changed. Next, EmployeeList.cs: notice it is an ObservableCollection. Go ahead and set the startup project to Example 05 and run it. Click on "Add a new employee" and the new employee should appear in the list box.

Example 06
This is a great example of IValueConverter, and it's actually a two-for-one deal. Like most of my presentation demos I found this by "Binging" (formerly known as g---ing); unfortunately I now can't find the original author to give him or her the credit they deserve. Before we look at the code, let's run the app and look at the finished product. Put 0 in Celsius and you should see the Fahrenheit textbox displaying 32 degrees (I know this is calculating correctly from my elementary school science class); also note the color changed to blue. Now put 100 in Celsius, which should give us 212 Fahrenheit, but now the color is red, indicating it is hot. Finally, put in 75 Fahrenheit and you should see 23.88 for Celsius, and the color now should be black. Basically, IValueConverter allows otherwise incompatible types to be bound; I'm sure you have had problems in the past trying to bind to date values. First look at FahrenheitToCelciusConverter.cs; notice it implements IValueConverter. IValueConverter has two methods: Convert and ConvertBack.
In each method we have the code for converting Fahrenheit to Celsius and vice versa. In our XAML, we first set a reference in our Window.Resources section; then for txtCelsius we set the path to the Text of txtFahrenheit and the converter to an instance of our FahrenheitToCelciusConverter. There is no need to repeat this for txtFahrenheit, since we have both a Convert and a ConvertBack:

Text="{Binding UpdateSourceTrigger=PropertyChanged,
       Path=Text, ElementName=txtFahrenheit,
       Converter={StaticResource myTemperatureConverter}}"

As mentioned earlier, this is a twofer demo. In the second demo, we are converting a double datatype to a brush. Let's take a look at TemperatureToColorConverter: notice in our Convert method, if the value is less than our cold temperature threshold we return a blue brush, and if it is higher than our hot temperature threshold we return a red brush. Since we don't have to convert a brush back to a double value in our example, ConvertBack is not implemented. Take time to go through these three examples, and I hope you have a better understanding of databinding, ObservableCollection, and IValueConverter. In the next blog posting we will talk about ValidationRule, DataTemplates, and DataTemplate triggers.
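For readers following along without the download, here is a hedged reconstruction of the kind of converter Example 06 describes. The class name matches the post, but the body is my own sketch, not the author's original source; since the binding's source is txtFahrenheit, Convert goes Fahrenheit to Celsius and ConvertBack goes Celsius to Fahrenheit.

// Reconstruction of the converter the post describes; not the original code.
using System;
using System.Globalization;
using System.Windows.Data;

public class FahrenheitToCelciusConverter : IValueConverter
{
    // Source (txtFahrenheit.Text) -> target (txtCelsius.Text)
    public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
    {
        double fahrenheit;
        if (double.TryParse(value as string, out fahrenheit))
            return ((fahrenheit - 32) * 5.0 / 9.0).ToString("0.##");
        return string.Empty;
    }

    // Target (txtCelsius.Text) -> source (txtFahrenheit.Text)
    public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
    {
        double celsius;
        if (double.TryParse(value as string, out celsius))
            return (celsius * 9.0 / 5.0 + 32).ToString("0.##");
        return string.Empty;
    }
}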

    Read the article

  • Composing Silverlight Applications With MEF

    - by PeterTweed
Anyone who has written an application with enough complexity to warrant multiple controls on multiple pages/forms should understand the benefit of composite application development: defining your application architecture so that it can be separated into distinct pieces, each with its own distinct purpose, that can then be "composed" together into the solution. Composition can be useful in any layer of the application, from the presentation layer to the business layer, common services, or data access. Historically, people have had different options for composing applications from distinct, well-known pieces – their own version of dependency injection, containers to aid with composition like Unity, the Composite Application Guidance for WPF and Silverlight, and before that the Composite Application Block. Microsoft has been working on another mechanism to aid composition and extension of applications for some time now – the Managed Extensibility Framework, or MEF for short. With Silverlight 4 it is part of the Silverlight environment. MEF allows a much simpler mechanism for composition and extensibility compared to the other mechanisms – which has always been the primary barrier to adoption of the earlier mechanisms/frameworks.

This post will guide you through the simple use of MEF for the scenario of composing an application – using exports, imports, and composition. Steps:

1. Create a new Silverlight 4 application.

2. Add references to the following assemblies:
System.ComponentModel.Composition.dll
System.ComponentModel.Composition.Initialization.dll

3. Add a new user control called LeftControl.

4. Replace the LayoutRoot Grid with the following xaml:

    <Grid x:Name="LayoutRoot" Background="Beige" Margin="40" >
        <Button Content="Left Content" Margin="30"></Button>
    </Grid>

5. Add the following statement to the top of the LeftControl.xaml.cs file:

using System.ComponentModel.Composition;

6. Add the following attribute to the LeftControl class:

    [Export(typeof(LeftControl))]

This attribute tells MEF that the type LeftControl will be exported – i.e. made available for other applications to import and compose into the application.

7. Add a new user control called RightControl.

8. Replace the LayoutRoot Grid with the following xaml:

    <Grid x:Name="LayoutRoot" Background="Green" Margin="40" >
        <TextBlock Margin="40" Foreground="White" Text="Right Control" FontSize="16" VerticalAlignment="Center" HorizontalAlignment="Center" ></TextBlock>
    </Grid>

9. Add the following statement to the top of the RightControl.xaml.cs file:

using System.ComponentModel.Composition;

10. Add the following attribute to the RightControl class:

    [Export(typeof(RightControl))]

11. Add the following xaml to the LayoutRoot Grid in MainPage.xaml:

        <StackPanel Orientation="Horizontal" HorizontalAlignment="Center">
            <Border Name="LeftContent" Background="Red" BorderBrush="Gray" CornerRadius="20"></Border>
            <Border Name="RightContent" Background="Red" BorderBrush="Gray" CornerRadius="20"></Border>
        </StackPanel>

The borders will hold the controls that will be imported and composed via MEF.

12. Add the following statement to the top of the MainPage.xaml.cs file:

using System.ComponentModel.Composition;

13. Add the following properties to the MainPage class:

        [Import(typeof(LeftControl))]
        public LeftControl LeftUserControl { get; set; }

        [Import(typeof(RightControl))]
        public RightControl RightUserControl { get; set; }

This defines properties accepting the LeftControl and RightControl types. The attributes are used to tell MEF the discovered type that should be applied to the property when composition occurs.

14. Replace the MainPage constructor with the following code:

        public MainPage()
        {
            InitializeComponent();
            CompositionInitializer.SatisfyImports(this);
            LeftContent.Child = LeftUserControl;
            RightContent.Child = RightUserControl;
        }

The CompositionInitializer.SatisfyImports(this) call tells MEF to discover types related to the declared imports for this object (the MainPage object). At that point, types matching those specified in the import definitions are discovered in the executing assembly location of the application, instantiated, and assigned to the matching properties of the current object.

15. Run the application and you will see the left control and right control displayed in the MainPage:

Congratulations! You have used MEF to dynamically compose user controls into a parent control in a composite application model. In the next post we will build on this topic to cover using MEF to compose Silverlight applications dynamically in download-on-demand scenarios – so .xap packages can be downloaded only when needed, avoiding a large initial download for the main application xap. Take the Slalom Challenge at www.slalomchallenge.com!

    Read the article

  • career advice for PhD scientist seeking to program?

    - by C SD
    I'm largely a self-taught programmer. In fact, I first started programming about half way through biophysics grad school, and even though I think I've done some pretty nice work, I've never worked as part of a 'serious' development team that had more than one or two other developers (and I wouldn't hesitate to call them equally inexperienced in software development as a profession). After finishing my PhD I applied to Google, on a lark, since I had some confidence in my abilities, if not necessarily my experience, and I was hoping to maybe slip in and absorb all the experience and talent I'd be surrounded with and become productive enough, quickly enough, that they wouldn't immediately regret their decision. I was excited to actually get invited to interview up at Mountain View (this was ~ mid 2008). Overall, my memory of the interview was very positive, but after close to a three month wait (is that normal?) they ended up turning me down. I wasn't too surprised or disappointed (aside from the uncomfortably long wait) given my unusual background and admitted lack of experience. I decided to continue as a postdoc, but focus on improving my skills rather than doing research. I've done about three years of that, and my honest assessment is that I've learned a ton more, but I really need more of a peer group to maintain or accelerate my growth. Google invited me to interview again about eight months ago, and the interview process went even better than the first time around (I thought), though they again declined to give me an offer. I have to admit this second rejection was much more discouraging. They had insisted I interview even after I mentioned to them that a move on my part was unlikely given that I had bought a house, gotten married, etc. since the first interview. I guess I was hoping they'd at least give me an offer that I could parlay into a more conventional, but still interesting, programming position close to home. So here I am, going on my third year out of grad school, a glorified postdoc and I'm starting to get pretty discouraged. Even though I could technically get 'back-on-track' for a career in science, I have been focusing the vast majority of this time on gaining programming experience rather than on research and publications. The problem is, whenever I look, most job listings have requirements that seem impossibly grandiose and I hesitate to apply. That, or the job/project seems incredibly dull. Ironically, applying to Google struck me as less intimidating. I suspect that either most people are just a lot less realistic than I am when it comes to assessing how long it will take for them to get up to speed, or they don't care; my fear is that I'm just woefully unqualified for any interesting, well paying work. IE: I'm confident I could switch fully back into C++ mode with a couple weeks work (I mostly use C,Python,C# daily) but I don't list myself as being 'proficient' in C++ on my CV, or applying for jobs that 'require' such knowledge. The few applications for which I did feel I was a legitimately good match have not elicited a response. I suspect the following things are potential problems with my application/CV and I would like feedback on: I don't have a CS degree. My BS was in biochemistry and molecular biology, my PhD in biophysics. I took a undergrad and grad level CS course at UCSD and completely killed them, but I don't know how to translate that to my CV effectively. I have a PhD, but it's not in CS... 
I've been debating if I should remove it from my CV, and wether or not it would then be misleading to list at least some of those years as some kind of 'programming' job (in many respects it was). I think there are sometimes strong stigmas associated with 'self-taught' programmers. I am certainly one of those. I even recognize that some of those stigmas hold a hint of truth, but I really do want to be an asset to a team. How do I communicate that even though I have been largely self-directing for ~8 years I can still take marching orders when needed? Do I just say so outright? Should I just become a lot less scrupulous about the whole process? anecdote: I have a friend who applied for positions where he completely fudged his qualifications to get past the first culling. He was much more honest and forthcoming about his actual qualifications when contacted and he still managed to get invited to a couple of interviews and even got some offers. His balls are larger than mine though.

    Read the article

  • C#/.NET Little Wonders: Getting Caller Information

    - by James Michael Hare
Originally posted on: http://geekswithblogs.net/BlackRabbitCoder/archive/2013/07/25/c.net-little-wonders-getting-caller-information.aspx Once again, in this series of posts I look at the parts of the .NET Framework that may seem trivial, but can help improve your code by making it easier to write and maintain. The index of all my past little wonders posts can be found here.

There are times when it is desirable to know who called the method or property you are currently executing. Some applications of this could include logging libraries, or possibly even something more advanced that may serve up different objects depending on who called the method. In the past, we mostly relied on the System.Diagnostics namespace and its classes such as StackTrace and StackFrame to see who our caller was, but now in C# 5, we can also get much of this data at compile-time.

Determining the caller using the stack
One way of doing this is to examine the call stack. The classes that allow you to examine the call stack have been around for a long time and can give you a very deep view of the calling chain, all the way back to the beginning of the thread that called you. You can get caller information either by instantiating the StackTrace class (which gives you the complete stack trace, much like the one you see when an exception is generated), or by using StackFrame, which gets a single frame of the stack trace. Both involve examining the call stack, which is a non-trivial task, so care should be taken not to do this in a performance-intensive situation.

For our simple example, let's say we are going to recreate the wheel and construct our own logging framework. Perhaps we wish to create a simple method Log which will log the string-ified form of an object and some information about the caller. We could easily do this as follows:

static void Log(object message)
{
    // frame 1, true for source info
    StackFrame frame = new StackFrame(1, true);
    var method = frame.GetMethod();
    var fileName = frame.GetFileName();
    var lineNumber = frame.GetFileLineNumber();

    // we'll just use a simple Console write for now
    Console.WriteLine("{0}({1}):{2} - {3}",
        fileName, lineNumber, method.Name, message);
}

So, what we are doing here is grabbing the 2nd stack frame (the 1st is our current method) using a 2nd argument of true to specify that we want source information (if available), and then taking the information from the frame. This works fine, and if we tested it out by calling from a file such as this:

1: // File c:\projects\test\CallerInfo\CallerInfo.cs
2:
3: public class CallerInfo
4: {
5:     static void Main() { Log("Hello Logger!"); }
6: }

We'd see this:

c:\projects\test\CallerInfo\CallerInfo.cs(5):Main - Hello Logger!

This works well, and in fact StackTrace and StackFrame are still the best ways to examine deeper into the call stack. But if you only want to get information on the caller of your method, there is another option…

Determining the caller at compile-time
In C# 5 (.NET 4.5) they added some attributes that can be supplied to optional parameters on a method to receive caller information. These attributes can only be applied to methods with optional parameters that have explicit defaults. Then, as the compiler determines who is calling your method with these attributes, it will fill in the values at compile-time.
These are the currently supported attributes, available in the System.Runtime.CompilerServices namespace:

CallerFilePathAttribute – The path and name of the file that is calling your method.
CallerLineNumberAttribute – The line number in the file where your method is being called.
CallerMemberName – The member that is calling your method.

So let's take a look at how our Log method would look using these attributes instead:

static int Log(object message,
    [CallerMemberName] string memberName = "",
    [CallerFilePath] string fileName = "",
    [CallerLineNumber] int lineNumber = 0)
{
    // we'll just use a simple Console write for now
    Console.WriteLine("{0}({1}):{2} - {3}",
        fileName, lineNumber, memberName, message);
}

Again, calling this from our sample Main would give us the same result:

c:\projects\test\CallerInfo\CallerInfo.cs(5):Main - Hello Logger!

However, though this seems the same, there are a few key differences. First of all, there are only 3 supported attributes (at this time), giving you the file path, line number, and calling member. Thus, it does not give you as rich a level of detail as a StackFrame (which can give you the calling type as well, and deeper frames, for example). Also, these are supported through optional parameters, which means we could call our new Log method like this:

// They're defaults, why not fill 'em in
Log("My message.", "Some member", "Some file", -13);

In addition, since these attributes require optional parameters, they cannot be used in properties, only in methods.

These caveats aside, they do let you get similar information inside of methods at a much greater speed! How much greater? Well, let's crank through 1,000,000 iterations of each. Instead of logging to the console, I'll return the formatted string length of each. Doing this, we get:

Time for 1,000,000 iterations with StackTrace: 5096 ms
Time for 1,000,000 iterations with Attributes: 196 ms

So you see, using the attributes is much, much faster! Nearly 25x faster, in fact.

Summary
There are a few ways to get caller information for a method. The StackFrame allows you to get a comprehensive set of information spanning the whole call stack, but at a heavier cost. On the other hand, the attributes allow you to quickly get at caller information baked in at compile-time, but to do so you need to create optional parameters in your methods to support it.

Technorati Tags: Little Wonders,CSharp,C#,.NET,StackFrame,CallStack,CallerFilePathAttribute,CallerLineNumberAttribute,CallerMemberName
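The post quotes the timings without showing the harness. Here is a hedged sketch of how such a comparison might be run; the two Log variants mirror the post's methods but return the formatted string's length instead of writing to the console, and the Stopwatch scaffolding is my own, not the author's.

// Hypothetical benchmark scaffolding for the numbers quoted above.
using System;
using System.Diagnostics;
using System.Runtime.CompilerServices;

static class CallerInfoBenchmark
{
    const int Iterations = 1000000;

    static int LogWithStackFrame(object message)
    {
        var frame = new StackFrame(1, true); // frame 1 = our caller
        return string.Format("{0}({1}):{2} - {3}",
            frame.GetFileName(), frame.GetFileLineNumber(),
            frame.GetMethod().Name, message).Length;
    }

    static int LogWithAttributes(object message,
        [CallerMemberName] string memberName = "",
        [CallerFilePath] string fileName = "",
        [CallerLineNumber] int lineNumber = 0)
    {
        return string.Format("{0}({1}):{2} - {3}",
            fileName, lineNumber, memberName, message).Length;
    }

    public static void Main()
    {
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
            LogWithStackFrame("message");
        sw.Stop();
        Console.WriteLine("Time for 1,000,000 iterations with StackTrace: {0} ms", sw.ElapsedMilliseconds);

        sw.Restart();
        for (int i = 0; i < Iterations; i++)
            LogWithAttributes("message");
        sw.Stop();
        Console.WriteLine("Time for 1,000,000 iterations with Attributes: {0} ms", sw.ElapsedMilliseconds);
    }
}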

    Read the article

  • Alternative way of developing for ASP.NET to WebForms - Any problems with this?

    - by John
So I have been developing in ASP.NET WebForms for some time now, but I often get annoyed with all the overhead (like ViewState and all the JavaScript it generates) and the way WebForms takes over a lot of the HTML generation. Sometimes I just want full control over the markup and want to produce efficient HTML of my own, so I have been experimenting with what I like to call HtmlForms. Essentially, this is using ASP.NET WebForms but without the form runat="server" tag. Without this tag, ASP.NET does not seem to add anything to the page at all. From some basic tests it seems that it runs well, and you still have the ability to use code-behind pages and many ASP.NET controls such as repeaters. Of course, without the form runat="server" tag many controls won't work. A post at Enterprise Software Development lists the controls that do require the tag. From that list you will see that all of the form elements like TextBoxes, DropDownLists, RadioButtons, etc. cannot be used. Instead you use normal HTML form controls.

But how do you access these HTML controls from the code-behind? Retrieving values on postback is easy: you just use Request.QueryString or Request.Form. But passing data to the control could be a little messy. Do you use an ASP.NET Literal control in the value field, or do you use <%= value %> in the markup page? I found it best to add runat="server" to my HTML controls; then you can access the control in your code-behind like this:

((HtmlInputText)txtName).Value = "blah";

Here's an example that shows what you can do with a textbox and a drop-down list:

Default.aspx

<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="NoForm.Default" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
    <title></title>
</head>
<body>
    <form action="" method="post">
        <label for="txtName">Name:</label>
        <input id="txtName" name="txtName" runat="server" /><br />
        <label for="ddlState">State:</label>
        <select id="ddlState" name="ddlState" runat="server">
            <option value=""></option>
        </select><br />
        <input type="submit" value="Submit" />
    </form>
</body>
</html>

Default.aspx.cs

using System;
using System.Web.UI.HtmlControls;
using System.Web.UI.WebControls;

namespace NoForm
{
    public partial class Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Default values
            string name = string.Empty;
            string state = string.Empty;

            if (Request.RequestType == "POST")
            {
                // If form submitted (post back)
                name = Request.Form["txtName"];
                state = Request.Form["ddlState"];
                // Server side form validation would go here,
                // along with actions to process the form and redirect
            }

            ((HtmlInputText)txtName).Value = name;
            ((HtmlSelect)ddlState).Items.Add(new ListItem("ACT"));
            ((HtmlSelect)ddlState).Items.Add(new ListItem("NSW"));
            ((HtmlSelect)ddlState).Items.Add(new ListItem("NT"));
            ((HtmlSelect)ddlState).Items.Add(new ListItem("QLD"));
            ((HtmlSelect)ddlState).Items.Add(new ListItem("SA"));
            ((HtmlSelect)ddlState).Items.Add(new ListItem("TAS"));
            ((HtmlSelect)ddlState).Items.Add(new ListItem("VIC"));
            ((HtmlSelect)ddlState).Items.Add(new ListItem("WA"));

            if (((HtmlSelect)ddlState).Items.FindByValue(state) != null)
                ((HtmlSelect)ddlState).Value = state;
        }
    }
}

As you can see, you have similar functionality to ASP.NET server controls, but more control over the final markup and less overhead like ViewState and all the JavaScript ASP.NET adds. Interestingly, you can also use HttpPostedFile to handle file uploads using your own input type="file" control (and the necessary form enctype="multipart/form-data"). So my question is: can you see any problems with this method, and do you have any thoughts on its usefulness? I have further details and tests on my blog.

    Read the article

  • Building a Repository Pattern against an EF 5 EDMX Model - Part 1

    - by Juan
I am part of a year-plus project that is re-writing an existing application for a client. We have decided to develop the project using Visual Studio 2012 and .NET 4.5. The project will be using a number of technologies and patterns, including Entity Framework 5, WCF Services, and WPF for the client UI.

This is my attempt at documenting some of the successes and failures that I come across in the development of the application.

In building the data access layer we have to access a database that has already been designed by a dedicated DBA. The DBA insists on using stored procedures, which has made the use of EF a little more difficult. He will not allow direct table access, but we did manage to get him to allow us to use views. Since EF 5 does not have good support for doing Code First with stored procedures, my option was to create a model (EDMX) against the existing database views. I then had to select each entity and map the Insert/Update/Delete functions to their respective stored procedures.

The next step, after I had completed mapping the stored procedures to the entities in the EDMX model, was to figure out how to build a generic repository that would work well with Entity Framework 5. After reading the blog posts below, I adopted much of their code, with some changes to allow for the use of Ninject for dependency injection.

http://www.tcscblog.com/2012/06/22/entity-framework-generic-repository/
http://www.tugberkugurlu.com/archive/generic-repository-pattern-entity-framework-asp-net-mvc-and-unit-testing-triangle

IRepository.cs

public interface IRepository<T> : IDisposable where T : class
{
    void Add(T entity);
    void Update(T entity, int id);
    T GetById(object key);
    IQueryable<T> Query(Expression<Func<T, bool>> predicate);
    IQueryable<T> GetAll();
    int SaveChanges();
    int SaveChanges(bool validateEntities);
}

GenericRepository.cs

public abstract class GenericRepository<T> : IRepository<T> where T : class
{
    public abstract void Add(T entity);
    public abstract void Update(T entity, int id);
    public abstract T GetById(object key);
    public abstract IQueryable<T> Query(Expression<Func<T, bool>> predicate);
    public abstract IQueryable<T> GetAll();

    public int SaveChanges()
    {
        return SaveChanges(true);
    }

    public abstract int SaveChanges(bool validateEntities);
    public abstract void Dispose();
}

One of the issues I ran into was trying to do an update. I kept receiving errors, so I posted a question on Stack Overflow (http://stackoverflow.com/questions/12585664/an-object-with-the-same-key-already-exists-in-the-objectstatemanager-the-object) and came up with the following hack. If someone has a better way, please let me know.
DbContextRepository.cs

public class DbContextRepository<T> : GenericRepository<T> where T : class
{
    protected DbContext Context;
    protected DbSet<T> DbSet;

    public DbContextRepository(DbContext context)
    {
        if (context == null)
            throw new ArgumentException("context");

        Context = context;
        DbSet = Context.Set<T>();
    }

    public override void Add(T entity)
    {
        if (entity == null)
            throw new ArgumentException("Cannot add a null entity.");

        DbSet.Add(entity);
    }

    public override void Update(T entity, int id)
    {
        if (entity == null)
            throw new ArgumentException("Cannot update a null entity.");

        var entry = Context.Entry(entity);
        if (entry.State == EntityState.Detached)
        {
            var attachedEntity = DbSet.Find(id); // Need to have access to key
            if (attachedEntity != null)
            {
                var attachedEntry = Context.Entry(attachedEntity);
                attachedEntry.CurrentValues.SetValues(entity);
            }
            else
            {
                entry.State = EntityState.Modified; // This should attach entity
            }
        }
    }

    public override T GetById(object key)
    {
        return DbSet.Find(key);
    }

    public override IQueryable<T> Query(Expression<Func<T, bool>> predicate)
    {
        return DbSet.Where(predicate);
    }

    public override IQueryable<T> GetAll()
    {
        return Context.Set<T>();
    }

    public override int SaveChanges(bool validateEntities)
    {
        Context.Configuration.ValidateOnSaveEnabled = validateEntities;
        return Context.SaveChanges();
    }

    #region IDisposable implementation

    public override void Dispose()
    {
        if (Context != null)
        {
            Context.Dispose();
            GC.SuppressFinalize(this);
        }
    }

    #endregion IDisposable implementation
}

At this point I am able to start creating the individual repositories that are needed and add a Unit of Work. Stay tuned for the next installment in my path to creating a Repository Pattern against EF 5.
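To give a feel for where this goes next, here is a hedged sketch of a concrete repository and a Ninject binding. The Employee and MyDbContext types are placeholders of mine standing in for the project's real entity and context; only the base classes and the mention of Ninject come from the post.

// Hedged sketch; Employee and MyDbContext are illustrative placeholder types.
using System.Data.Entity;
using Ninject.Modules;

public class Employee
{
    public int Id { get; set; }
}

public class MyDbContext : DbContext
{
    public DbSet<Employee> Employees { get; set; }
}

// A concrete repository built on the generic base above.
public class EmployeeRepository : DbContextRepository<Employee>
{
    public EmployeeRepository(MyDbContext context) : base(context) { }
}

// A possible Ninject module, since the post mentions Ninject for DI.
public class RepositoryModule : NinjectModule
{
    public override void Load()
    {
        Bind<DbContext>().To<MyDbContext>();
        Bind<IRepository<Employee>>().To<EmployeeRepository>();
    }
}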

    Read the article

  • Configuring UCM cache to check for external Content Server changes

    - by Martin Deh
Recently, I was involved in a customer scenario where they were modifying the Content Server's contributor data files directly through Content Server. This operation, of course, is completely supported. However, since the contributor data file was modified through the "backdoor", a running WebCenter Spaces page that also used the same data file would not get the updates immediately. This was due to two reasons. The first reason is that the Spaces page was using Content Presenter to display the contents of the data file. The second reason is that the Spaces application was using the "cached" version of the data file. Fortunately, there is a way to configure the cache so backdoor changes can be picked up more quickly and automatically.

First, a brief overview of Content Presenter. The Content Presenter task flow enables WebCenter Spaces users with Page-Edit permissions to precisely customize the selection and presentation of content in a WebCenter Spaces application. With Content Presenter, you can select a single item of content, the contents under a folder, a list of items, or a query for content, and then select a Content Presenter based template to render the content on a page in a Spaces application. In addition to displaying the folders and files in a Content Server, Content Presenter integrates with Oracle Site Studio to allow you to create, access, edit, and display Site Studio contributor data files (Content Server documents) in either a Site Studio region template or a custom Content Presenter display template. More information about creating Content Presenter display templates can be found in the OFM Developers Guide for WebCenter Portal.

The easiest way to configure the cache is to modify the WebCenter Spaces Content Server service connection setting through Enterprise Manager. From here, under the Cache Details, there is a section to set the Cache Invalidation Interval. Basically, this enables the cache to be monitored by the cache "sweeper" utility. The cache sweeper queries for changes in the Content Server and then "marks" the object in the cache as "dirty". This in turn causes the application to get a new copy of the document from the Content Server, which replaces the cached version. By default, the initial value for the Cache Invalidation Interval is set to 0 (minutes), which basically means that the sweeper is OFF. To turn the sweeper ON, just set a value (in minutes). The minimum value that can be set is 2 (minutes):

Just a note: in some instances, once the value of the Cache Invalidation Interval has been set (and saved) in the Enterprise Manager UI, it becomes "sticky" and the interval value cannot be set back to 0. The good news is that this value can also be updated through a WLST command. The WLST command to run is as follows:

setJCRContentServerConnection(appName, name, [socketType, url, serverHost, serverPort, keystoreLocation, keystorePassword, privateKeyAlias, privateKeyPassword, webContextRoot, clientSecurityPolicy, cacheInvalidationInterval, binaryCacheMaxEntrySize, adminUsername, adminPassword, extAppId, timeout, isPrimary, server, applicationVersion])

One way to get the required information for executing the command is to use the listJCRContentServerConnections('webcenter', verbose=true) command.
For example, this is the sample output from the execution:

------------------ UCM ------------------
Connection Name: UCM
Connection Type: JCR
External Appliction ID:
Timeout: (not set)
CIS Socket Type: socket
CIS Server Hostname: webcenter.oracle.local
CIS Server Port: 4444
CIS Keystore Location:
CIS Private Key Alias:
CIS Web URL:
Web Server Context Root: /cs
Client Security Policy:
Admin User Name: sysadmin
Cache Invalidation Interval: 2
Binary Cache Maximum Entry Size: 1024
The Documents primary connection is "UCM"

From this information, the completed setJCRContentServerConnection would be:

setJCRContentServerConnection(appName='webcenter', name='UCM', socketType='socket', serverHost='webcenter.oracle.local', serverPort='4444', webContextRoot='/cs', cacheInvalidationInterval='0', binaryCacheMaxEntrySize='1024', adminUsername='sysadmin', isPrimary=1)

Note: The Spaces managed server must be restarted for the change to take effect. More information about using WLST for WebCenter can be found here.

Once the sweeper is turned ON, only cache objects that have been changed will be invalidated. To test this out, I will go through a simple scenario. The first thing to do is configure the Content Server so it can monitor and report on events. Log into the Content Server console application, and under the Administration menu item, select System Audit Information. Note: If your console is using the left menu display option, the Administration link will be located there.

Under the Tracing Sections Information, add in only "system" and "requestaudit" in the Active Sections. Check Full Verbose Tracing, check Save, then click the Update button. Once this is done, select the View Server Output menu option. This will change the browser view to display the log. This is all that is needed to configure the Content Server.

For example, the following is the View Server Output with the cache invalidation interval set to 2 (minutes). Note the time stamps:

requestaudit/6 08.30 09:52:26.001  IdcServer-68    GET_FOLDER_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.016933999955654144(secs)
requestaudit/6 08.30 09:52:26.010  IdcServer-69    GET_FOLDER_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.006134999915957451(secs)
requestaudit/6 08.30 09:52:26.014  IdcServer-70    GET_DOCUMENT_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.004271999932825565(secs)
... other trace info ...
requestaudit/6 08.30 09:54:26.002  IdcServer-71    GET_FOLDER_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.020323999226093292(secs)
requestaudit/6 08.30 09:54:26.011  IdcServer-72    GET_FOLDER_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.017928000539541245(secs)
requestaudit/6 08.30 09:54:26.017  IdcServer-73    GET_DOCUMENT_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.010185999795794487(secs)
requestaudit/6 08.30 11:51:12.030 IdcServer-129 CLEAR_SERVER_OUTPUT [dUser=weblogic] 0.029171999543905258(secs)
requestaudit/6 08.30 11:51:12.101 IdcServer-130 GET_SERVER_OUTPUT [dUser=weblogic] 0.025721000507473946(secs)
requestaudit/6 08.30 11:51:26.592 IdcServer-131 VCR_GET_DOCUMENT_BY_NAME [dID=919][dDocName=DF_UCMCACHETESTER][dDocTitle=DF_UCMCacheTester][dUser=weblogic][RevisionSelectionMethod=LatestReleased][IsJava=1] 0.21525299549102783(secs)
requestaudit/6 08.30 11:51:27.117 IdcServer-132 VCR_GET_CONTENT_TYPES [dUser=sysadmin][IsJava=1] 0.5059549808502197(secs)
requestaudit/6 08.30 11:51:27.146 IdcServer-133 VCR_GET_CONTENT_TYPE [dUser=sysadmin][IsJava=1] 0.03360399976372719(secs)
requestaudit/6 08.30 11:51:27.169 IdcServer-134 VCR_GET_CONTENT_TYPE [dUser=sysadmin][IsJava=1] 0.008806000463664532(secs)
requestaudit/6 08.30 11:51:27.204 IdcServer-135 VCR_GET_CONTENT_TYPE [dUser=sysadmin][IsJava=1] 0.013265999965369701(secs)
requestaudit/6 08.30 11:51:27.384 IdcServer-136 VCR_GET_CONTENT_TYPE [dUser=sysadmin][IsJava=1] 0.18119299411773682(secs)
requestaudit/6 08.30 11:51:27.533 IdcServer-137 VCR_GET_CONTENT_TYPE [dUser=sysadmin][IsJava=1] 0.1519480049610138(secs)
requestaudit/6 08.30 11:51:27.634 IdcServer-138 VCR_GET_CONTENT_TYPE [dUser=sysadmin][IsJava=1] 0.10827399790287018(secs)
requestaudit/6 08.30 11:51:27.687 IdcServer-139 VCR_GET_CONTENT_TYPE [dUser=sysadmin][IsJava=1] 0.059702999889850616(secs)
requestaudit/6 08.30 11:51:28.271 IdcServer-140 GET_USER_PERMISSIONS [dUser=weblogic][IsJava=1] 0.006703000050038099(secs)
requestaudit/6 08.30 11:51:28.285 IdcServer-141 GET_ENVIRONMENT [dUser=sysadmin][IsJava=1] 0.010893999598920345(secs)
requestaudit/6 08.30 11:51:30.433 IdcServer-142 GET_SERVER_OUTPUT [dUser=weblogic] 0.017318999394774437(secs)
requestaudit/6 08.30 11:51:41.837 IdcServer-143 VCR_GET_DOCUMENT_BY_NAME [dID=508][dDocName=113_ES][dDocTitle=Landing Home][dUser=weblogic][RevisionSelectionMethod=LatestReleased][IsJava=1] 0.15937699377536774(secs)
requestaudit/6 08.30 11:51:42.781 IdcServer-144 GET_FILE [dID=326][dDocName=WEBCENTERORACL000315][dDocTitle=Duke][dUser=anonymous][RevisionSelectionMethod=LatestReleased][dSecurityGroup=Public][xCollectionID=0] 0.16288499534130096(secs)

The highlighted sections show where the two data files, DF_UCMCACHETESTER (P1 page) and 113_ES (P2 page), were called by the (Spaces) VCR connection to the Content Server. The most important line to notice is the VCR_GET_DOCUMENT_BY_NAME invocation. On subsequent refreshes of these two pages, you will notice (after you refresh the Content Server's View Server Output) that there are no further traces of the same VCR_GET_DOCUMENT_BY_NAME invocations. This is because the pages are getting the documents from the cache.

The next step is to go through the "backdoor" and change one of the documents through the Content Server console. This can be done by first locating the data file document and, from the Content Information page, selecting the Edit Data File menu option. This invokes the Site Studio Contributor, where the modifications can be made. Refreshing the Content Server View Server Output, the tracing displays the operations performed on the document.
requestaudit/6 08.30 11:56:59.972 IdcServer-255 SS_CHECKOUT_BY_NAME [dID=922][dDocName=DF_UCMCACHETESTER][dUser=weblogic][dSecurityGroup=Public] 0.05558200180530548(secs)
requestaudit/6 08.30 11:57:00.065 IdcServer-256 SS_GET_CONTRIBUTOR_CONFIG [dID=922][dDocName=DF_UCMCACHETESTER][dDocTitle=DF_UCMCacheTester][dUser=weblogic][dSecurityGroup=Public][xCollectionID=0] 0.08632399886846542(secs)
requestaudit/6 08.30 11:57:00.470 IdcServer-259 DOC_INFO_BY_NAME [dID=922][dDocName=DF_UCMCACHETESTER][dDocTitle=DF_UCMCacheTester][dUser=weblogic][dSecurityGroup=Public][xCollectionID=0] 0.02268899977207184(secs)
requestaudit/6 08.30 11:57:10.177 IdcServer-264 GET_FOLDER_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.007652000058442354(secs)
requestaudit/6 08.30 11:57:10.181 IdcServer-263 GET_FOLDER_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.01868399977684021(secs)
requestaudit/6 08.30 11:57:10.187 IdcServer-265 GET_DOCUMENT_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.009367000311613083(secs)
(internal)/6 08.30 11:57:26.118 IdcServer-266 File to be removed: /oracle/app/admin/domains/webcenter/ucm/cs/vault/~temp/703253295.xml
(internal)/6 08.30 11:57:26.121 IdcServer-266 File to be removed: /oracle/app/admin/domains/webcenter/ucm/cs/vault/~temp/703253295.xml
requestaudit/6 08.30 11:57:26.122 IdcServer-266 SS_SET_ELEMENT_DATA [dID=923][dDocName=DF_UCMCACHETESTER][dDocTitle=DF_UCMCacheTester][dUser=weblogic][dSecurityGroup=Public][xCollectionID=0][StatusCode=0][StatusMessage=Successfully checked in content item 'DF_UCMCACHETESTER'.] 0.3765290081501007(secs)
requestaudit/6 08.30 11:57:30.710 IdcServer-267 DOC_INFO_BY_NAME [dID=923][dDocName=DF_UCMCACHETESTER][dDocTitle=DF_UCMCacheTester][dUser=weblogic][dSecurityGroup=Public][xCollectionID=0] 0.07942699640989304(secs)
requestaudit/6 08.30 11:57:30.733 IdcServer-268 SS_GET_CONTRIBUTOR_STRINGS [dUser=weblogic] 0.0044570001773536205(secs)

After a few moments, refreshing the P1 page shows that the updates have been applied. Note: The refresh time may vary, since the Cache Invalidation Interval (set to 2 minutes) is not determined by when changes happened; the sweeper just runs every 2 minutes.

Refreshing the Content Server View Server Output, the tracing displays the important information:

requestaudit/6 08.30 11:59:10.171 IdcServer-270 GET_FOLDER_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.00952600035816431(secs)
requestaudit/6 08.30 11:59:10.179 IdcServer-271 GET_FOLDER_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.011118999682366848(secs)
requestaudit/6 08.30 11:59:10.182 IdcServer-272 GET_DOCUMENT_HISTORY_REPORT [dUser=sysadmin][IsJava=1] 0.007447000127285719(secs)
requestaudit/6 08.30 11:59:16.885 IdcServer-273 VCR_GET_DOCUMENT_BY_NAME [dID=923][dDocName=DF_UCMCACHETESTER][dDocTitle=DF_UCMCacheTester][dUser=weblogic][RevisionSelectionMethod=LatestReleased][IsJava=1] 0.0786449983716011(secs)

After the specified interval, the sweeper is invoked, which is noted by the GET_ ... calls. Since the history has noted the change, the next call is VCR_GET_DOCUMENT_BY_NAME, which retrieves the new version of the (modified) data file. Navigating back to the P2 page and viewing the server output, there are no further VCR_GET_DOCUMENT_BY_NAME calls to retrieve that data file. This simply means that the data file was retrieved from the cache.
Upon further review of the server output, we can see that there was only 1 request for the VCR_GET_DOCUMENT_BY_NAME:

requestaudit/6 08.30 12:08:00.021 Audit Request Monitor Request Audit Report over the last 120 Seconds for server webcenteroraclelocal16200****
requestaudit/6 08.30 12:08:00.021 Audit Request Monitor -Num Requests 8 Errors 0 Reqs/sec. 0.06666944175958633 Avg. Latency (secs) 0.02762500010430813 Max Thread Count 2
requestaudit/6 08.30 12:08:00.021 Audit Request Monitor 1 Service VCR_GET_DOCUMENT_BY_NAME Total Elapsed Time (secs) 0.09200000017881393 Num requests 1 Num errors 0 Avg. Latency (secs) 0.09200000017881393
requestaudit/6 08.30 12:08:00.021 Audit Request Monitor 2 Service GET_PERSONALIZED_JAVASCRIPT Total Elapsed Time (secs) 0.054999999701976776 Num requests 1 Num errors 0 Avg. Latency (secs) 0.054999999701976776
requestaudit/6 08.30 12:08:00.021 Audit Request Monitor 3 Service GET_FOLDER_HISTORY_REPORT Total Elapsed Time (secs) 0.028999999165534973 Num requests 2 Num errors 0 Avg. Latency (secs) 0.014499999582767487
requestaudit/6 08.30 12:08:00.021 Audit Request Monitor 4 Service GET_SERVER_OUTPUT Total Elapsed Time (secs) 0.017999999225139618 Num requests 1 Num errors 0 Avg. Latency (secs) 0.017999999225139618
requestaudit/6 08.30 12:08:00.021 Audit Request Monitor 5 Service GET_FILE Total Elapsed Time (secs) 0.013000000268220901 Num requests 1 Num errors 0 Avg. Latency (secs) 0.013000000268220901
requestaudit/6 08.30 12:08:00.021 Audit Request Monitor ****End Audit Report*****

    Read the article

< Previous Page | 34 35 36 37 38 39 40 41 42 43 44 45  | Next Page >