Search Results

Search found 145 results on 6 pages for 'endtime'.


  • Date/Time formatting in .NET (Devexpress Gantt charts to show time rather than date)

    - by calico-cat
    I have some data about a day's events that I'm trying to visualise as a Gantt chart using DevExpress XtraCharts. DevExpress's example here shows the chart being populated by date. However, I'd like it to be populated by time, to compare the events throughout one day. My X-axis is displaying correctly, done like so:

        ganttDiagram.AxisY.DateTimeMeasureUnit = DateTimeMeasurementUnit.Minute

    I have data with the correct time; however, the label on each series is showing the date (which is the same for every series, because it's all the same day!). Thus, instead of being a bar, each series is just a single point, with the label showing 31/03/2010 - 31/03/2010. Each series is created with the code below:

        s.Points.Add(New SeriesPoint("Machine", New DateTime() {ev.StartTime, ev.EndTime}))


  • Occasional Deadlock.

    - by Carl Robinson
    I'm having trouble with a web application that deadlocks occasionally. There are three queries involved. Two are trying to update a table:

        UPDATE AttendanceRoll SET ErrorFlag = 0 WHERE ContractID = @ContractID AND DATEPART(month,AttendanceDate) = DATEPART(month,@Month_Beginning) AND DATEPART(year,AttendanceDate) = DATEPART(year,@Month_Beginning)

    and one is trying to insert into the table:

        INSERT INTO AttendanceRoll (AttendanceDate, ContractID, PersonID, StartTime, EndTime, Hours, AbsenceReason, UpdateCount, SplitShiftID, ModifiedBy, ModifiedDate) SELECT @P33, @P34, @P35, CONVERT(datetime,REPLACE(@P36,'.',':')), CONVERT(datetime,REPLACE(@P37,'.',':')), @P38, @P39, @P40, 1, @P41, GETDATE()

    The deadlock graph shows a kind of circular arrangement of page locks and an exchange event, and the two update queries have the same server process id. If anyone has any ideas about how I should go about solving this issue it would be most appreciated. I have the deadlock graph that I can post if anybody needs to see it. Thanks Carl R


  • PHP - XML Feed get print values

    - by danit
    Here is my feed: <entry> <id>http://api.visitmix.com/OData.svc/Sessions(guid'816995df-b09a-447a-9391-019512f643a0')</id> <title type="text">Building Web Applications with Microsoft SQL Azure</title> <summary type="text">SQL Azure provides a highly available and scalable relational database engine in the cloud. In this demo-intensive and interactive session, learn how to quickly build web applications with SQL Azure Databases and familiar web technologies. We demonstrate how you can quickly provision, build and populate a new SQL Azure database directly from your web browser. Also, see firsthand several new enhancements we are adding to SQL Azure based on the feedback we&#x2019;ve received from the community since launching the service earlier this year.</summary> <published>2010-01-25T00:00:00-05:00</published> <updated>2010-03-05T01:07:05-05:00</updated> <author> <name /> </author> <link rel="edit" title="Session" href="Sessions(guid'816995df-b09a-447a-9391-019512f643a0')" /> <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/Speakers" type="application/atom+xml;type=feed" title="Speakers" href="Sessions(guid'816995df-b09a-447a-9391-019512f643a0')/Speakers"> <m:inline> <feed> <title type="text">Speakers</title> <id>http://api.visitmix.com/OData.svc/Sessions(guid'816995df-b09a-447a-9391-019512f643a0')/Speakers</id> <updated>2010-03-25T11:56:06Z</updated> <link rel="self" title="Speakers" href="Sessions(guid'816995df-b09a-447a-9391-019512f643a0')/Speakers" /> <entry> <id>http://api.visitmix.com/OData.svc/Speakers(guid'3395ee85-d994-423c-a726-76b60a896d2a')</id> <title type="text">David-Robinson</title> <summary type="text"></summary> <updated>2010-03-25T11:56:06Z</updated> <author> <name>David Robinson</name> </author> <link rel="edit-media" title="Speaker" href="Speakers(guid'3395ee85-d994-423c-a726-76b60a896d2a')/$value" /> <link rel="edit" title="Speaker" href="Speakers(guid'3395ee85-d994-423c-a726-76b60a896d2a')" /> <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/Sessions" type="application/atom+xml;type=feed" title="Sessions" href="Speakers(guid'3395ee85-d994-423c-a726-76b60a896d2a')/Sessions" /> <category term="EventModel.Speaker" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" /> <content type="image/jpeg" src="http://live.visitmix.com/Content/images/speakers/lrg/default.jpg" /> <m:properties xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices"> <d:SpeakerID m:type="Edm.Guid">3395ee85-d994-423c-a726-76b60a896d2a</d:SpeakerID> <d:SpeakerFirstName>David</d:SpeakerFirstName> <d:SpeakerLastName>Robinson</d:SpeakerLastName> <d:LargeImage m:null="true"></d:LargeImage> <d:SmallImage m:null="true"></d:SmallImage> <d:Twitter m:null="true"></d:Twitter> </m:properties> </entry> </feed> </m:inline> </link> <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/Tags" type="application/atom+xml;type=feed" title="Tags" href="Sessions(guid'816995df-b09a-447a-9391-019512f643a0')/Tags" /> <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/Files" type="application/atom+xml;type=feed" title="Files" href="Sessions(guid'816995df-b09a-447a-9391-019512f643a0')/Files" /> <category term="EventModel.Session" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" /> <content type="application/xml"> <m:properties> <d:SessionID m:type="Edm.Guid">816995df-b09a-447a-9391-019512f643a0</d:SessionID> 
<d:Location>Breakers L</d:Location> <d:Type>Seminar</d:Type> <d:Code>SVC07</d:Code> <d:StartTime m:type="Edm.DateTime">2010-03-17T12:00:00</d:StartTime> <d:EndTime m:type="Edm.DateTime">2010-03-17T13:00:00</d:EndTime> <d:Slug>SVC07</d:Slug> <d:CreatedDate m:type="Edm.DateTime">2010-01-26T18:14:24.687</d:CreatedDate> <d:SourceID m:type="Edm.Guid">cddca9b7-6830-4d06-af93-5fd87afb67b0</d:SourceID> </m:properties> </content> </entry>

    I want to print the session title (Building Web Applications with Microsoft SQL Azure), the author (David Robinson), and the location (Breakers L), and display the speaker's image (http://live.visitmix.com/Content/images/speakers/lrg/default.jpg). I presume I can use file_get_contents and then parse the result with simplexml_load_string, but I don't know how to get at the deeper items I want, like the author and image. Any chance of a bit of coding genius here?


  • RadControl DateTimePicker Selecting new time doesn't remove highlight from previous selection

    - by Jason Beck
    This is not browser specific - the behavior exists in Firefox and IE. The RadControl is being used within a User Control in a SiteFinity site. Very little customization has been done to the control.

        <telerik:RadDateTimePicker ID="RadDateTimePicker1" runat="server" MinDate="2010/1/1" Width="250px">
            <ClientEvents></ClientEvents>
            <TimeView starttime="08:00:00" endtime="20:00:00" interval="02:00:00"></TimeView>
            <DateInput runat="server" ID="DateInput"></DateInput>
        </telerik:RadDateTimePicker>

        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                RadDateTimePicker1.MinDate = DateTime.Now;
            }
        }


  • iPhone OpenGL ES freezes for no reason

    - by KJ
    Hi, I'm quite new to iPhone OpenGL ES, and I'm really stuck. I was trying to implement shadow mapping on iPhone, and I allocated two 512*1024*32bit textures for the shadow map and the diffuse map respectively. The problem is that my application started to freeze and reboot the device after I added the shadow map allocation part to the code (so I guess the shadow map allocation is causing all this mess). It happens randomly, but mostly within 10 minutes. (sometimes within a few secs) And it only happens on the real iPhone device, not on the virtual device. I backtracked the problem by removing irrelevant code lines by lines and now my code is really simple, but it's still crashing (I mean, freezing). Could anybody please download my xcode project linked below and see what on earth is wrong? The code is really simple: http://www.tempfiles.net/download/201004/95922/CrashTest.html I would really appreciate if someone can help me. My iPhone is a 3GS and running on the OS version 3.1. Again, run the code and it'll take about 5 mins in average for the device to freeze and reboot. (Don't worry, it does no harm) It'll just display cyan screen before it freezes, but you'll be able to notice when it happens because the device will reboot soon, so please be patient. Just in case you can't reproduce the problem, please let me know. (That could possibly mean it's specifically my device that something's wrong with) Observation: The problem goes away when I change the size of the shadow map to 512*512. (but with the diffuse map still 512*1024) I'm desperate for help, thanks in advance! Just for the people's information who can't download the link, here is the OpenGL code: #import "GLView.h" #import <OpenGLES/ES2/glext.h> #import <QuartzCore/QuartzCore.h> @implementation GLView + (Class)layerClass { return [CAEAGLLayer class]; } - (id)initWithCoder: (NSCoder*)coder { if ((self = [super initWithCoder:coder])) { CAEAGLLayer* layer = (CAEAGLLayer*)self.layer; layer.opaque = YES; layer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool: NO], kEAGLDrawablePropertyRetainedBacking, kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil]; displayLink_ = nil; context_ = [[EAGLContext alloc] initWithAPI: kEAGLRenderingAPIOpenGLES2]; if (!context_ || ![EAGLContext setCurrentContext: context_]) { [self release]; return nil; } glGenFramebuffers(1, &framebuffer_); glBindFramebuffer(GL_FRAMEBUFFER, framebuffer_); glViewport(0, 0, self.bounds.size.width, self.bounds.size.height); glGenRenderbuffers(1, &defaultColorBuffer_); glBindRenderbuffer(GL_RENDERBUFFER, defaultColorBuffer_); [context_ renderbufferStorage: GL_RENDERBUFFER fromDrawable: layer]; glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, defaultColorBuffer_); glGenTextures(1, &shadowColorBuffer_); glActiveTexture(GL_TEXTURE1); glBindTexture(GL_TEXTURE_2D, shadowColorBuffer_); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 1024, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL); glGenTextures(1, &texture_); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, texture_); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); glTexParameteri(GL_TEXTURE_2D, 
GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 1024, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL); } return self; } - (void)startAnimation { displayLink_ = [CADisplayLink displayLinkWithTarget: self selector: @selector(drawView:)]; [displayLink_ setFrameInterval: 1]; [displayLink_ addToRunLoop: [NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode]; } - (void)useDefaultBuffers { glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, defaultColorBuffer_); glClearColor(0.0, 0.8, 0.8, 1); glClear(GL_COLOR_BUFFER_BIT); } - (void)useShadowBuffers { glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, shadowColorBuffer_, 0); glClearColor(0, 0, 0, 0); glClear(GL_COLOR_BUFFER_BIT); } - (void)drawView: (id)sender { NSTimeInterval startTime = [NSDate timeIntervalSinceReferenceDate]; [EAGLContext setCurrentContext: context_]; [self useShadowBuffers]; [self useDefaultBuffers]; glBindRenderbuffer(GL_RENDERBUFFER, defaultColorBuffer_); [context_ presentRenderbuffer: GL_RENDERBUFFER]; NSTimeInterval endTime = [NSDate timeIntervalSinceReferenceDate]; NSLog(@"FPS : %.1f", 1 / (endTime - startTime)); } - (void)stopAnimation { [displayLink_ invalidate]; displayLink_ = nil; } - (void)dealloc { if (framebuffer_) glDeleteFramebuffers(1, &framebuffer_); if (defaultColorBuffer_) glDeleteRenderbuffers(1, &defaultColorBuffer_); if (shadowColorBuffer_) glDeleteTextures(1, &shadowColorBuffer_); glDeleteTextures(1, &texture_); if ([EAGLContext currentContext] == context_) [EAGLContext setCurrentContext: nil]; [context_ release]; context_ = nil; [super dealloc]; } @end


  • Convert a string of numbers to a NSTimeInterval

    - by culov
    I know I must be over-complicating this, because NSTimeInterval is just a double, but I just can't seem to get this done properly since I have had very little exposure to Objective-C. The scenario is as follows: the data I'm pulling into the app contains two values, startTime and endTime, which are epoch times in milliseconds. The variables that I want to hold these values are NSTimeInterval *start; NSTimeInterval *end; I decided to store them as NSTimeIntervals, but I'm thinking that maybe I ought to store them as doubles, because there's no need for NSTimeIntervals since comparisons can just be done with a primitive. Either way, I'd like to know what I'm missing in the following step, where I try to convert from string to NSTimeInterval:

        tempString = [truckArray objectAtIndex:2];
        tempDouble = [tempString doubleValue];

    Now it's safely stored as a double, but I can't get the value into an NSTimeInterval. How should this be accomplished? Thanks
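
    Two separate things seem to be going on here: NSTimeInterval is a typedef for double, so it can be declared and assigned as a plain value (no pointer needed), and since the feed values are epoch milliseconds they need dividing by 1000 to become a seconds-based interval (NSTimeInterval counts seconds). A minimal sketch of that unit conversion, shown in Java rather than Objective-C, with a hypothetical value:

        public class EpochToInterval {
            public static void main(String[] args) {
                String tempString = "1275664800000";              // hypothetical epoch value in milliseconds
                double millis = Double.parseDouble(tempString);
                double seconds = millis / 1000.0;                 // NSTimeInterval-style value: seconds
                System.out.println(seconds);
            }
        }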


  • .ics Calendar File - Parsing Date Time - What is the time format?

    - by Josh
    I am coding in PHP, attempting to get the start/end dates and times for events. I am using the following regex for parsing out the information:

        $pattern='/(?P<StartDate>[0-9]{8})T(?P<StartTime>[0-9]{6}) .+(?P<EndDate>[0-9]{8})T(?P<EndTime>[0-9]{6})/';

    The sample event entry is here:

        BEGIN:VEVENT
        UID:34b09fd7-8e6e-4d56-86b0-445745b89d93
        ORGANIZER;CN=*********:mailto:*********
        DTSTART;TZID="(GMT-06.00) Central Time (US & Canada)":20100413T130000
        DTEND;TZID="(GMT-06.00) Central Time (US & Canada)":20100413T160000
        STATUS:CONFIRMED
        CLASS:PRIVATE
        X-MICROSOFT-CDO-INTENDEDSTATUS:BUSY
        TRANSP:OPAQUE
        X-MICROSOFT-DISALLOW-COUNTER:TRUE
        DTSTAMP:20100414T140711Z
        SEQUENCE:0
        END:VEVENT

    20100413T130000 and 20100413T160000 are the start and end points. The dates are straightforward, but how do I interpret the time part? This event starts at one and ends at four.
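
    For reference, the iCalendar basic date-time format is YYYYMMDD, a literal T, then HHMMSS, so 20100413T130000 is 13:00:00 (1 p.m.) and 20100413T160000 is 16:00:00 (4 p.m.), interpreted in the time zone named by the TZID parameter; a trailing Z (as on the DTSTAMP line) would mean UTC. A minimal parsing sketch, shown in Java rather than the asker's PHP:

        import java.time.LocalDateTime;
        import java.time.format.DateTimeFormatter;

        public class ICalTime {
            public static void main(String[] args) {
                // iCalendar local date-time: yyyyMMdd'T'HHmmss
                DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyyMMdd'T'HHmmss");
                LocalDateTime start = LocalDateTime.parse("20100413T130000", fmt);
                LocalDateTime end = LocalDateTime.parse("20100413T160000", fmt);
                System.out.println(start + " -> " + end);  // 2010-04-13T13:00 -> 2010-04-13T16:00
            }
        }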


  • mvc3 datatables and ajax-beginform

    - by MIkCode
    im trying to send and ajax request and returning the result into a new table i debugged the req and i can confirm that evry thing is good except the VIEW the end result is an empty table instead of one row one more weird thing is if i page source i can see all the table result(more than the one that suppose to) this is the view: @model Fnx.Esb.ServiceMonitor.ViewModel.MainModels @{ ViewBag.Title = "MainSearch"; } @Html.EditorForModel() @{ AjaxOptions ajaxOpts = new AjaxOptions { UpdateTargetId = "MainTable", InsertionMode = InsertionMode.Replace, Url = Url.Action("queryData", "MainSearch"), }; } @using (Ajax.BeginForm(ajaxOpts)) { <div class="container"> <form action="#" method="post"> <div id="mainSearch"> @Html.EditorFor(x => x.MainSearchModel) </div> <br /> <br /> <br /> <br /> <div id="advancedSearch"> <div class="accordion" id="accordion2"> <div class="accordion-group"> <div class="accordion-heading"> <a class="accordion-toggle" data-toggle="collapse" data-parent="#accordion2" href="#collapseOne"> Advanced Search </a> </div> <div id="collapseOne" class="accordion-body collapse in"> <div class="accordion-inner"> @Html.EditorFor(x => x.AdvanceSearchContainerModel) </div> </div> </div> </div> </div> <br /> <br /> <button type="submit" class="btn"> <i class="icon-search"></i> Search </button> <button type="reset" class="btn"> <i class="icon-trash"></i> clear </button> </form> <br /> <br /> <br /> <br /> <br /> <br /> <table id="MainTable" cellpadding="0" cellspacing="0" border="0" class="table table-striped table-bordered"> <thead> <tr> <th> serviceDuration </th> <th> status </th> <th> ESBLatency </th> <th> serviceName </th> <th> serviceId </th> <th> startTime </th> <th> endTime </th> <th> instanceID </th> </tr> </thead> <tbody> @foreach (var item in Model.MainTableModel) { <tr> <td> @Html.DisplayFor(modelItem => item.serviceDuration) </td> <td> @Html.DisplayFor(modelItem => item.status) </td> <td> @Html.DisplayFor(modelItem => item.ESBLatency) </td> <td> @Html.DisplayFor(modelItem => item.serviceName) </td> <td> @Html.DisplayFor(modelItem => item.serviceId) </td> <td> @Html.DisplayFor(modelItem => item.startTime) </td> <td> @Html.DisplayFor(modelItem => item.endTime) </td> <td> @Html.DisplayFor(modelItem => item.instanceID) </td> </tr> } </tbody> </table> </div> } the datatables: javascript options $('#MainTable').dataTable({ "sDom": "<'row'<'span6'l><'span6'f>r>t<'row'<'span6'i><'span6'p>>", "bDestroy": true }); thanks miki


  • SYSDATE - 1 error on pl/sql function

    - by ayo
    Hi curtisk/all, I have an issue: when I run the statements below, I get the following error:

        select 'EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME =>'''||name||'''||,OPTIONS=>DBMS_LOGMNR.NEW);' from v\$archived_log where name is not null;
        select 'EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME =>'''||name||'''||,OPTIONS=>DBMS_LOGMNR.ADDFILE);' from v\$archived_log where name is not null;
        EXECUTE DBMS_LOGMNR.START_LOGMNR( STARTTIME => SYSDATE - 1, ENDTIME => SYSDATE, OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG + DBMS_LOGMNR.CONTINUOUS_MINE + DBMS_LOGMNR.COMMITTED_DATA_ONLY + DBMS_LOGMNR.PRINT_PRETTY_SQL);

    Error:

        ERROR at line 1:
        ORA-01291: missing logfile
        ORA-06512: at "SYS.DBMS_LOGMNR", line 58
        ORA-06512: at line 1

    But I have added all the archived logs for several days back, and my sysdate is today. Kindly help out on this issue. Thanks. Regards, Ayo


  • Getting a value from a row at particular time

    - by Swetha Bindu
    I have a row in my database: starttime: 4/6/2012 2:00pm, endtime: 31/12/9999, name: "swetha", status: "open", ... When I update this row I change the starttime to the current time (getdate()) and have no issues. I am running a Windows service each day at 1am to modify a value in the row. I would like to know the status of my row at 4/6/2012 11:59 pm, when my service runs. There is no update happening at 4/6/2012 11:59 pm, and the last update may be at any time of the day; my requirement is to get the status value as it was at 4/6/2012 11:59 pm. I would like to have the query in SQL Server 2008. Can anyone please help me to find a solution?
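
    If each change closes out the old row (setting its endtime) and inserts a new version - which is what the 31/12/9999 endtime on the current row suggests, though that is an assumption about the schema - then the point-in-time lookup is just a range query on starttime/endtime. A minimal sketch of that query, shown here from Java/JDBC rather than plain T-SQL; the column names come from the question, while the table name and connection string are hypothetical:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.Timestamp;

        public class StatusAsOf {
            public static void main(String[] args) throws Exception {
                // Hypothetical connection string and table name.
                String url = "jdbc:sqlserver://localhost;databaseName=MyDb;integratedSecurity=true";
                Timestamp asOf = Timestamp.valueOf("2012-06-04 23:59:00");

                String sql = "SELECT status FROM RowHistory "
                           + "WHERE name = ? AND starttime <= ? AND endtime > ?";

                try (Connection con = DriverManager.getConnection(url);
                     PreparedStatement ps = con.prepareStatement(sql)) {
                    ps.setString(1, "swetha");
                    ps.setTimestamp(2, asOf);
                    ps.setTimestamp(3, asOf);
                    try (ResultSet rs = ps.executeQuery()) {
                        if (rs.next()) {
                            System.out.println("Status at " + asOf + ": " + rs.getString("status"));
                        }
                    }
                }
            }
        }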


  • How to use the Response from PHP in Objective-C?

    - by iMohammad
    I've managed to post to and get data from PHP through Objective-C. However, the response from the server looks like the following:

        ["ARC101","ARC112","ARC124","ARC203","ARC222","ARC251","ARC281","ARC305","ARC314","ARC344","ARC353","ARC363","ARC408","ARC416","ARC426","ARC482"]

    And some responses could be more complicated, like the following:

        [{"Code":"ARC101","Title":"Design Studio-I: Design Princi","Instructor":"Mike Cohen","Activity":"LAB","Days":"MW","Room":"B01","Bldg":"19","Section":"1","StartTime":"1310","EndTime":"1550","CallNo":"20438","Priority":null,"Open":"Open","HasAct":false,"CodeAct":null,"TitleAct":null,"InstructorAct":null]

    What would be your suggestion if I want to work with this data? For example, putting it into a table, etc. Thanks in advance!


  • problem in playing next song in the avaudioplayer

    - by Rajashekar
    Hello friends, my delegate method looks like this. After the first song is played it goes into this method and plays the second song; however, when the second song is done playing it stops. It does not go into the delegate method again. I need to play all the songs continuously. I am not sure why. Can someone help me?

        - (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)p successfully:(BOOL)flag
        {
            if (flag == NO)
                NSLog(@"Playback finished unsuccessfully");
            else {
                //[player stop];
                index++;
                NSLog(@"%d", index);
                path = [[NSBundle mainBundle] pathForResource:[songlist objectAtIndex:index] ofType:@"mp3"];
                [player initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
                [songlabel2 setTitle:[songlist objectAtIndex:index]];
                [endtime setText:[NSString stringWithFormat:@"%.2f", [player duration]/100]];
                [player play];
            }
        }


  • Date range advanced count calculation in TSQL

    - by cihata87
    I am working on a call center project and I have to calculate the calls in progress at the same time within specific time ranges. I have to write a procedure with parameters StartTime, EndTime and Interval. For example: Start Time: 11:00, End Time: 12:00, Interval: 20 minutes. The program should divide the 1-hour time range into 3 parts, and for each part count the calls which started and finished in that range OR which started and haven't finished yet. The output should look like this:

        11:00 - 11:20   15 calls at the same time (TimePeaks)
        11:20 - 11:40   21 calls ...
        11:40 - 12:00    8 calls ...

    Any suggestions on how to calculate this?
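
    A set-based T-SQL solution would typically generate the interval boundaries (for example from a numbers table) and join them against the calls, but the counting rule itself is easier to see in a small procedural sketch. The following Java sketch uses a hypothetical Call record standing in for a row of the calls table, and illustrative data:

        import java.time.LocalTime;
        import java.util.List;

        public class CallPeaks {
            // Hypothetical stand-in for a row from the calls table.
            record Call(LocalTime start, LocalTime end) {}

            public static void main(String[] args) {
                List<Call> calls = List.of(
                        new Call(LocalTime.of(11, 5), LocalTime.of(11, 30)),
                        new Call(LocalTime.of(11, 25), LocalTime.of(12, 10)),
                        new Call(LocalTime.of(11, 45), LocalTime.of(11, 55)));

                LocalTime windowStart = LocalTime.of(11, 0);
                LocalTime windowEnd = LocalTime.of(12, 0);
                int intervalMinutes = 20;

                for (LocalTime t = windowStart; t.isBefore(windowEnd); t = t.plusMinutes(intervalMinutes)) {
                    LocalTime bucketStart = t;
                    LocalTime bucketEnd = t.plusMinutes(intervalMinutes);
                    // A call counts for the bucket if it started before the bucket ends
                    // and had not already finished before the bucket started.
                    long count = calls.stream()
                            .filter(c -> c.start().isBefore(bucketEnd) && !c.end().isBefore(bucketStart))
                            .count();
                    System.out.println(bucketStart + " - " + bucketEnd + ": " + count + " calls");
                }
            }
        }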


  • Bacula virtual backup job doesn't run, no output?

    - by Zoredache
    I am trying to get virtual backups working, but when I try to run a virtual backup job it appears to get created, but then never seems to actually run. I have a full and a couple of incremental backups.

        status director
        JobId  Level   Files    Bytes    Status  Finished         Name
        ====================================================================
         1283  Full   10,565   1.963 G   OK      21-Dec-12 09:47  nms-Job
         1284  Incr      314   129.6 M   OK      21-Dec-12 09:49  nms-Job
         1285  Incr      230   147.2 M   OK      21-Dec-12 09:51  nms-Job
         1288  Incr      525   138.8 M   OK      21-Dec-12 11:25  nms-Job

    I attempt to start a job from bconsole like this:

        *run job=nms-Job level=VirtualFull
        Using Catalog "MySQL"
        Run Backup job
        JobName:  nms-Job
        Level:    VirtualFull
        Client:   nms-FileDaemon
        FileSet:  nms-FileSet
        Pool:     nms-pool (From Job resource)
        Storage:  File_d1 (From Pool resource)
        When:     2012-12-21 13:07:54
        Priority: 10
        OK to run? (yes/mod/no):
        Job queued. JobId=1291

    Then my new job just sits there, doing nothing. The JobStatus shows that the job was created, but it appears to never run. All the full and incremental backups are terminating normally.

        *llist jobid=1291
        JobId: 1,291
        Job: nms-Job.2012-12-21_13.07.56_07
        Name: nms-Job
        PurgedFiles: 0
        Type: B
        Level: F
        ClientId: 4
        Name: nms-FileDaemon
        JobStatus: C
        SchedTime: 2012-12-21 13:07:54
        StartTime: 2012-12-21 13:07:56
        EndTime: 0000-00-00 00:00:00
        RealEndTime: 0000-00-00 00:00:00
        JobTDate: 1,356,124,076
        VolSessionId: 0
        VolSessionTime: 0
        JobFiles: 0
        JobErrors: 0
        JobMissingFiles: 0
        PoolId: 19
        PooLname: nms-pool
        PriorJobId: 0
        FileSetId: 11
        FileSet: nms-FileSet

    I am getting very frustrated that this isn't working, mostly because it isn't giving me any error logs or output at all. I submit the job, and as far as I can tell nothing happens. Is there some status or debugging level that I can set to get useful information about why this isn't working? What can I do to make this work? I was originally running Bacula 5.0.2 on Debian Squeeze; out of frustration I upgraded to 5.2.6 from the backports repository, hoping that a new version might give me better results.


  • Questions related to writing your own file downloader using multiple threads java

    - by Shekhar
    Hello In my current company, i am doing a PoC on how we can write a file downloader utility. We have to use socket programming(TCP/IP) for downloading the files. One of the requirements of the client is that a file(which will be large in size) should be transfered in chunks for example if we have a file of 5Mb size then we can have 5 threads which transfer 1 Mb each. I have written a small application which downloads a file. You can download the eclipe project from http://www.fileflyer.com/view/QM1JSC0 A brief explanation of my classes FileSender.java This class provides the bytes of file. It has a method called sendBytesOfFile(long start,long end, long sequenceNo) which gives the number of bytes. import java.io.File; import java.io.IOException; import java.util.zip.CRC32; import org.apache.commons.io.FileUtils; public class FileSender { private static final String FILE_NAME = "C:\\shared\\test.pdf"; public ByteArrayWrapper sendBytesOfFile(long start,long end, long sequenceNo){ try { File file = new File(FILE_NAME); byte[] fileBytes = FileUtils.readFileToByteArray(file); System.out.println("Size of file is " +fileBytes.length); System.out.println(); System.out.println("Start "+start +" end "+end); byte[] bytes = getByteArray(fileBytes, start, end); ByteArrayWrapper wrapper = new ByteArrayWrapper(bytes, sequenceNo); return wrapper; } catch (IOException e) { throw new RuntimeException(e); } } private byte[] getByteArray(byte[] bytes, long start, long end){ long arrayLength = end-start; System.out.println("Start : "+start +" end : "+end + " Arraylength : "+arrayLength +" length of source array : "+bytes.length); byte[] arr = new byte[(int)arrayLength]; for(int i = (int)start, j =0; i < end;i++,j++){ arr[j] = bytes[i]; } return arr; } public static long fileSize(){ File file = new File(FILE_NAME); return file.length(); } } Second Class is FileReceiver.java - This class receives the file. Small Explanation what this file does This class finds the size of the file to be fetched from Sender Depending upon the size of the file it finds the start and end position till the bytes needs to be read. It starts n number of threads giving each thread start,end, sequence number and a list which all the threads share. Each thread reads the number of bytes and creates a ByteArrayWrapper. ByteArrayWrapper objects are added to the list Then i have while loop which basically make sure that all threads have done their work finally it sorts the list based on the sequence number. then the bytes are joined, and a complete byte array is formed which is converted to a file. 
Code of File Receiver package com.filedownloader; import java.io.File; import java.io.IOException; import java.util.ArrayList; import java.util.Collections; import java.util.Comparator; import java.util.List; import java.util.zip.CRC32; import org.apache.commons.io.FileUtils; public class FileReceiver { public static void main(String[] args) { FileReceiver receiver = new FileReceiver(); receiver.receiveFile(); } public void receiveFile(){ long startTime = System.currentTimeMillis(); long numberOfThreads = 10; long filesize = FileSender.fileSize(); System.out.println("File size received "+filesize); long start = filesize/numberOfThreads; List<ByteArrayWrapper> list = new ArrayList<ByteArrayWrapper>(); for(long threadCount =0; threadCount<numberOfThreads ;threadCount++){ FileDownloaderTask task = new FileDownloaderTask(threadCount*start,(threadCount+1)*start,threadCount,list); new Thread(task).start(); } while(list.size() != numberOfThreads){ // this is done so that all the threads should complete their work before processing further. //System.out.println("Waiting for threads to complete. List size "+list.size()); } if(list.size() == numberOfThreads){ System.out.println("All bytes received "+list); Collections.sort(list, new Comparator<ByteArrayWrapper>() { @Override public int compare(ByteArrayWrapper o1, ByteArrayWrapper o2) { long sequence1 = o1.getSequence(); long sequence2 = o2.getSequence(); if(sequence1 < sequence2){ return -1; }else if(sequence1 > sequence2){ return 1; } else{ return 0; } } }); byte[] totalBytes = list.get(0).getBytes(); byte[] firstArr = null; byte[] secondArr = null; for(int i = 1;i<list.size();i++){ firstArr = totalBytes; secondArr = list.get(i).getBytes(); totalBytes = concat(firstArr, secondArr); } System.out.println(totalBytes.length); convertToFile(totalBytes,"c:\\tmp\\test.pdf"); long endTime = System.currentTimeMillis(); System.out.println("Total time taken with "+numberOfThreads +" threads is "+(endTime-startTime)+" ms" ); } } private byte[] concat(byte[] A, byte[] B) { byte[] C= new byte[A.length+B.length]; System.arraycopy(A, 0, C, 0, A.length); System.arraycopy(B, 0, C, A.length, B.length); return C; } private void convertToFile(byte[] totalBytes,String name) { try { FileUtils.writeByteArrayToFile(new File(name), totalBytes); } catch (IOException e) { throw new RuntimeException(e); } } } Code of ByteArrayWrapper package com.filedownloader; import java.io.Serializable; public class ByteArrayWrapper implements Serializable{ private static final long serialVersionUID = 3499562855188457886L; private byte[] bytes; private long sequence; public ByteArrayWrapper(byte[] bytes, long sequenceNo) { this.bytes = bytes; this.sequence = sequenceNo; } public byte[] getBytes() { return bytes; } public long getSequence() { return sequence; } } Code of FileDownloaderTask import java.util.List; public class FileDownloaderTask implements Runnable { private List<ByteArrayWrapper> list; private long start; private long end; private long sequenceNo; public FileDownloaderTask(long start,long end,long sequenceNo,List<ByteArrayWrapper> list) { this.list = list; this.start = start; this.end = end; this.sequenceNo = sequenceNo; } @Override public void run() { ByteArrayWrapper wrapper = new FileSender().sendBytesOfFile(start, end, sequenceNo); list.add(wrapper); } } Questions related to this code 1) Does file downloading becomes fast when multiple threads is used? In this code i am not able to see the benefit. 2) How should i decide how many threads should i create ? 
3) Are there any open source libraries which do that? 4) The file which the file receiver receives is valid and not corrupted, but the checksum (I used FileUtils from commons-io) does not match. What's the problem? 5) This code gives an out-of-memory error when used with a large file (above 100 MB), because of the byte arrays which are created. How can I avoid that? I know this is very bad code, but I had to write it in one day :-). Please suggest any other good way to do this. Thanks Shekhar
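
    On question 5: the out-of-memory error comes from FileSender reading the whole file into a byte array for every chunk (and from the receiver concatenating full-size arrays). One way around it - a minimal sketch, not a drop-in replacement for the classes above - is to read only the requested byte range with RandomAccessFile:

        import java.io.IOException;
        import java.io.RandomAccessFile;

        public class RangeReader {
            // Read only the bytes in [start, end) of the file, instead of
            // loading the entire file into memory and slicing it afterwards.
            public static byte[] readRange(String path, long start, long end) throws IOException {
                try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
                    byte[] chunk = new byte[(int) (end - start)];
                    raf.seek(start);
                    raf.readFully(chunk);
                    return chunk;
                }
            }
        }

    The receiver could likewise write each chunk straight to its offset in the output file with another RandomAccessFile, which avoids holding all the chunks and the concatenated result in memory at once.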


  • Traditional IO vs memory-mapped

    - by Senne
    I'm trying to illustrate the difference in performance between traditional IO and memory mapped files in java to students. I found an example somewhere on internet but not everything is clear to me, I don't even think all steps are nececery. I read a lot about it here and there but I'm not convinced about a correct implementation of neither of them. The code I try to understand is: public class FileCopy{ public static void main(String args[]){ if (args.length < 1){ System.out.println(" Wrong usage!"); System.out.println(" Correct usage is : java FileCopy <large file with full path>"); System.exit(0); } String inFileName = args[0]; File inFile = new File(inFileName); if (inFile.exists() != true){ System.out.println(inFileName + " does not exist!"); System.exit(0); } try{ new FileCopy().memoryMappedCopy(inFileName, inFileName+".new" ); new FileCopy().customBufferedCopy(inFileName, inFileName+".new1"); }catch(FileNotFoundException fne){ fne.printStackTrace(); }catch(IOException ioe){ ioe.printStackTrace(); }catch (Exception e){ e.printStackTrace(); } } public void memoryMappedCopy(String fromFile, String toFile ) throws Exception{ long timeIn = new Date().getTime(); // read input file RandomAccessFile rafIn = new RandomAccessFile(fromFile, "rw"); FileChannel fcIn = rafIn.getChannel(); ByteBuffer byteBuffIn = fcIn.map(FileChannel.MapMode.READ_WRITE, 0,(int) fcIn.size()); fcIn.read(byteBuffIn); byteBuffIn.flip(); RandomAccessFile rafOut = new RandomAccessFile(toFile, "rw"); FileChannel fcOut = rafOut.getChannel(); ByteBuffer writeMap = fcOut.map(FileChannel.MapMode.READ_WRITE,0,(int) fcIn.size()); writeMap.put(byteBuffIn); long timeOut = new Date().getTime(); System.out.println("Memory mapped copy Time for a file of size :" + (int) fcIn.size() +" is "+(timeOut-timeIn)); fcOut.close(); fcIn.close(); } static final int CHUNK_SIZE = 100000; static final char[] inChars = new char[CHUNK_SIZE]; public static void customBufferedCopy(String fromFile, String toFile) throws IOException{ long timeIn = new Date().getTime(); Reader in = new FileReader(fromFile); Writer out = new FileWriter(toFile); while (true) { synchronized (inChars) { int amountRead = in.read(inChars); if (amountRead == -1) { break; } out.write(inChars, 0, amountRead); } } long timeOut = new Date().getTime(); System.out.println("Custom buffered copy Time for a file of size :" + (int) new File(fromFile).length() +" is "+(timeOut-timeIn)); in.close(); out.close(); } } When exactly is it nececary to use RandomAccessFile? Here it is used to read and write in the memoryMappedCopy, is it actually nececary just to copy a file at all? Or is it a part of memorry mapping? In customBufferedCopy, why is synchronized used here? 
I also found a different example that -should- test the performance between the 2: public class MappedIO { private static int numOfInts = 4000000; private static int numOfUbuffInts = 200000; private abstract static class Tester { private String name; public Tester(String name) { this.name = name; } public long runTest() { System.out.print(name + ": "); try { long startTime = System.currentTimeMillis(); test(); long endTime = System.currentTimeMillis(); return (endTime - startTime); } catch (IOException e) { throw new RuntimeException(e); } } public abstract void test() throws IOException; } private static Tester[] tests = { new Tester("Stream Write") { public void test() throws IOException { DataOutputStream dos = new DataOutputStream( new BufferedOutputStream( new FileOutputStream(new File("temp.tmp")))); for(int i = 0; i < numOfInts; i++) dos.writeInt(i); dos.close(); } }, new Tester("Mapped Write") { public void test() throws IOException { FileChannel fc = new RandomAccessFile("temp.tmp", "rw") .getChannel(); IntBuffer ib = fc.map( FileChannel.MapMode.READ_WRITE, 0, fc.size()) .asIntBuffer(); for(int i = 0; i < numOfInts; i++) ib.put(i); fc.close(); } }, new Tester("Stream Read") { public void test() throws IOException { DataInputStream dis = new DataInputStream( new BufferedInputStream( new FileInputStream("temp.tmp"))); for(int i = 0; i < numOfInts; i++) dis.readInt(); dis.close(); } }, new Tester("Mapped Read") { public void test() throws IOException { FileChannel fc = new FileInputStream( new File("temp.tmp")).getChannel(); IntBuffer ib = fc.map( FileChannel.MapMode.READ_ONLY, 0, fc.size()) .asIntBuffer(); while(ib.hasRemaining()) ib.get(); fc.close(); } }, new Tester("Stream Read/Write") { public void test() throws IOException { RandomAccessFile raf = new RandomAccessFile( new File("temp.tmp"), "rw"); raf.writeInt(1); for(int i = 0; i < numOfUbuffInts; i++) { raf.seek(raf.length() - 4); raf.writeInt(raf.readInt()); } raf.close(); } }, new Tester("Mapped Read/Write") { public void test() throws IOException { FileChannel fc = new RandomAccessFile( new File("temp.tmp"), "rw").getChannel(); IntBuffer ib = fc.map( FileChannel.MapMode.READ_WRITE, 0, fc.size()) .asIntBuffer(); ib.put(0); for(int i = 1; i < numOfUbuffInts; i++) ib.put(ib.get(i - 1)); fc.close(); } } }; public static void main(String[] args) { for(int i = 0; i < tests.length; i++) System.out.println(tests[i].runTest()); } } I more or less see whats going on, my output looks like this: Stream Write: 653 Mapped Write: 51 Stream Read: 651 Mapped Read: 40 Stream Read/Write: 14481 Mapped Read/Write: 6 What is makeing the Stream Read/Write so unbelievably long? And as a read/write test, to me it looks a bit pointless to read the same integer over and over (if I understand well what's going on in the Stream Read/Write) Wouldn't it be better to read int's from the previously written file and just read and write ints on the same place? Is there a better way to illustrate it? I've been breaking my head about a lot of these things for a while and I just can't get the whole picture..


  • 256 Windows Azure Worker Roles, Windows Kinect and a 90's Text-Based Ray-Tracer

    - by Alan Smith
    For a couple of years I have been demoing a simple render farm hosted in Windows Azure using worker roles and the Azure Storage service. At the start of the presentation I deploy an Azure application that uses 16 worker roles to render a 1,500 frame 3D ray-traced animation. At the end of the presentation, when the animation was complete, I would play the animation delete the Azure deployment. The standing joke with the audience was that it was that it was a “$2 demo”, as the compute charges for running the 16 instances for an hour was $1.92, factor in the bandwidth charges and it’s a couple of dollars. The point of the demo is that it highlights one of the great benefits of cloud computing, you pay for what you use, and if you need massive compute power for a short period of time using Windows Azure can work out very cost effective. The “$2 demo” was great for presenting at user groups and conferences in that it could be deployed to Azure, used to render an animation, and then removed in a one hour session. I have always had the idea of doing something a bit more impressive with the demo, and scaling it from a “$2 demo” to a “$30 demo”. The challenge was to create a visually appealing animation in high definition format and keep the demo time down to one hour.  This article will take a run through how I achieved this. Ray Tracing Ray tracing, a technique for generating high quality photorealistic images, gained popularity in the 90’s with companies like Pixar creating feature length computer animations, and also the emergence of shareware text-based ray tracers that could run on a home PC. In order to render a ray traced image, the ray of light that would pass from the view point must be tracked until it intersects with an object. At the intersection, the color, reflectiveness, transparency, and refractive index of the object are used to calculate if the ray will be reflected or refracted. Each pixel may require thousands of calculations to determine what color it will be in the rendered image. Pin-Board Toys Having very little artistic talent and a basic understanding of maths I decided to focus on an animation that could be modeled fairly easily and would look visually impressive. I’ve always liked the pin-board desktop toys that become popular in the 80’s and when I was working as a 3D animator back in the 90’s I always had the idea of creating a 3D ray-traced animation of a pin-board, but never found the energy to do it. Even if I had a go at it, the render time to produce an animation that would look respectable on a 486 would have been measured in months. PolyRay Back in 1995 I landed my first real job, after spending three years being a beach-ski-climbing-paragliding-bum, and was employed to create 3D ray-traced animations for a CD-ROM that school kids would use to learn physics. I had got into the strange and wonderful world of text-based ray tracing, and was using a shareware ray-tracer called PolyRay. PolyRay takes a text file describing a scene as input and, after a few hours processing on a 486, produced a high quality ray-traced image. The following is an example of a basic PolyRay scene file. 
background Midnight_Blue   static define matte surface { ambient 0.1 diffuse 0.7 } define matte_white texture { matte { color white } } define matte_black texture { matte { color dark_slate_gray } } define position_cylindrical 3 define lookup_sawtooth 1 define light_wood <0.6, 0.24, 0.1> define median_wood <0.3, 0.12, 0.03> define dark_wood <0.05, 0.01, 0.005>     define wooden texture { noise surface { ambient 0.2  diffuse 0.7  specular white, 0.5 microfacet Reitz 10 position_fn position_cylindrical position_scale 1  lookup_fn lookup_sawtooth octaves 1 turbulence 1 color_map( [0.0, 0.2, light_wood, light_wood] [0.2, 0.3, light_wood, median_wood] [0.3, 0.4, median_wood, light_wood] [0.4, 0.7, light_wood, light_wood] [0.7, 0.8, light_wood, median_wood] [0.8, 0.9, median_wood, light_wood] [0.9, 1.0, light_wood, dark_wood]) } } define glass texture { surface { ambient 0 diffuse 0 specular 0.2 reflection white, 0.1 transmission white, 1, 1.5 }} define shiny surface { ambient 0.1 diffuse 0.6 specular white, 0.6 microfacet Phong 7  } define steely_blue texture { shiny { color black } } define chrome texture { surface { color white ambient 0.0 diffuse 0.2 specular 0.4 microfacet Phong 10 reflection 0.8 } }   viewpoint {     from <4.000, -1.000, 1.000> at <0.000, 0.000, 0.000> up <0, 1, 0> angle 60     resolution 640, 480 aspect 1.6 image_format 0 }       light <-10, 30, 20> light <-10, 30, -20>   object { disc <0, -2, 0>, <0, 1, 0>, 30 wooden }   object { sphere <0.000, 0.000, 0.000>, 1.00 chrome } object { cylinder <0.000, 0.000, 0.000>, <0.000, 0.000, -4.000>, 0.50 chrome }   After setting up the background and defining colors and textures, the viewpoint is specified. The “camera” is located at a point in 3D space, and it looks towards another point. The angle, image resolution, and aspect ratio are specified. Two lights are present in the image at defined coordinates. The three objects in the image are a wooden disc to represent a table top, and a sphere and cylinder that intersect to form a pin that will be used for the pin board toy in the final animation. When the image is rendered, the following image is produced. The pins are modeled with a chrome surface, so they reflect the environment around them. Note that the scale of the pin shaft is not correct, this will be fixed later. Modeling the Pin Board The frame of the pin-board is made up of three boxes, and six cylinders, the front box is modeled using a clear, slightly reflective solid, with the same refractive index of glass. The other shapes are modeled as metal. object { box <-5.5, -1.5, 1>, <5.5, 5.5, 1.2> glass } object { box <-5.5, -1.5, -0.04>, <5.5, 5.5, -0.09> steely_blue } object { box <-5.5, -1.5, -0.52>, <5.5, 5.5, -0.59> steely_blue } object { cylinder <-5.2, -1.2, 1.4>, <-5.2, -1.2, -0.74>, 0.2 steely_blue } object { cylinder <5.2, -1.2, 1.4>, <5.2, -1.2, -0.74>, 0.2 steely_blue } object { cylinder <-5.2, 5.2, 1.4>, <-5.2, 5.2, -0.74>, 0.2 steely_blue } object { cylinder <5.2, 5.2, 1.4>, <5.2, 5.2, -0.74>, 0.2 steely_blue } object { cylinder <0, -1.2, 1.4>, <0, -1.2, -0.74>, 0.2 steely_blue } object { cylinder <0, 5.2, 1.4>, <0, 5.2, -0.74>, 0.2 steely_blue }   In order to create the matrix of pins that make up the pin board I used a basic console application with a few nested loops to create two intersecting matrixes of pins, which models the layout used in the pin boards. The resulting image is shown below. The pin board contains 11,481 pins, with the scene file containing 23,709 lines of code. 
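
    The console application that generated the two interleaved pin matrices isn't listed in the article. A rough sketch of the nested-loop idea - written here in Java, with illustrative spacing, radii and counts rather than the values actually used - might look like this:

        import java.io.PrintWriter;

        public class PinBoardGenerator {
            public static void main(String[] args) throws Exception {
                try (PrintWriter out = new PrintWriter("pins.pi")) {
                    double spacing = 0.14;                                    // illustrative pin spacing
                    for (int row = 0; row < 75; row++) {
                        for (int col = 0; col < 75; col++) {
                            double x = col * spacing - 5.2;
                            double y = row * spacing - 1.2;
                            emitPin(out, x, y, 0.0);                              // first matrix
                            emitPin(out, x + spacing / 2, y + spacing / 2, 0.0);  // interleaved second matrix
                        }
                    }
                }
            }

            // Each pin is a sphere for the head and a cylinder for the shaft,
            // as in the article's single-pin example.
            static void emitPin(PrintWriter out, double x, double y, double z) {
                out.printf("object { sphere <%.3f, %.3f, %.3f>, 0.06 chrome }%n", x, y, z);
                out.printf("object { cylinder <%.3f, %.3f, %.3f>, <%.3f, %.3f, %.3f>, 0.03 chrome }%n",
                        x, y, z, x, y, z - 0.6);
            }
        }

    Setting each pin's Z coordinate from the matching depth-image pixel, as described later in the article, then just means passing a per-pin offset instead of 0.0 for z.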
For the complete animation 2,000 scene files will be created, which is over 47 million lines of code. Each pin in the pin-board will slide out a specific distance when an object is pressed into the back of the board. This is easily modeled by setting the Z coordinate of the pin to a specific value. In order to set all of the pins in the pin-board to the correct position, a bitmap image can be used. The position of the pin can be set based on the color of the pixel at the appropriate position in the image. When the Windows Azure logo is used to set the Z coordinate of the pins, the following image is generated. The challenge now was to make a cool animation. The Azure Logo is fine, but it is static. Using a normal video to animate the pins would not work; the colors in the video would not be the same as the depth of the objects from the camera. In order to simulate the pin board accurately a series of frames from a depth camera could be used. Windows Kinect The Kenect controllers for the X-Box 360 and Windows feature a depth camera. The Kinect SDK for Windows provides a programming interface for Kenect, providing easy access for .NET developers to the Kinect sensors. The Kinect Explorer provided with the Kinect SDK is a great starting point for exploring Kinect from a developers perspective. Both the X-Box 360 Kinect and the Windows Kinect will work with the Kinect SDK, the Windows Kinect is required for commercial applications, but the X-Box Kinect can be used for hobby projects. The Windows Kinect has the advantage of providing a mode to allow depth capture with objects closer to the camera, which makes for a more accurate depth image for setting the pin positions. Creating a Depth Field Animation The depth field animation used to set the positions of the pin in the pin board was created using a modified version of the Kinect Explorer sample application. In order to simulate the pin board accurately, a small section of the depth range from the depth sensor will be used. Any part of the object in front of the depth range will result in a white pixel; anything behind the depth range will be black. Within the depth range the pixels in the image will be set to RGB values from 0,0,0 to 255,255,255. A screen shot of the modified Kinect Explorer application is shown below. The Kinect Explorer sample application was modified to include slider controls that are used to set the depth range that forms the image from the depth stream. This allows the fine tuning of the depth image that is required for simulating the position of the pins in the pin board. The Kinect Explorer was also modified to record a series of images from the depth camera and save them as a sequence JPEG files that will be used to animate the pins in the animation the Start and Stop buttons are used to start and stop the image recording. En example of one of the depth images is shown below. Once a series of 2,000 depth images has been captured, the task of creating the animation can begin. Rendering a Test Frame In order to test the creation of frames and get an approximation of the time required to render each frame a test frame was rendered on-premise using PolyRay. The output of the rendering process is shown below. The test frame contained 23,629 primitive shapes, most of which are the spheres and cylinders that are used for the 11,800 or so pins in the pin board. 
The 1280x720 image contains 921,600 pixels, but as anti-aliasing was used the number of rays that were calculated was 4,235,777, with 3,478,754,073 object boundaries checked. The test frame of the pin board with the depth field image applied is shown below. The tracing time for the test frame was 4 minutes 27 seconds, which means rendering the2,000 frames in the animation would take over 148 hours, or a little over 6 days. Although this is much faster that an old 486, waiting almost a week to see the results of an animation would make it challenging for animators to create, view, and refine their animations. It would be much better if the animation could be rendered in less than one hour. Windows Azure Worker Roles The cost of creating an on-premise render farm to render animations increases in proportion to the number of servers. The table below shows the cost of servers for creating a render farm, assuming a cost of $500 per server. Number of Servers Cost 1 $500 16 $8,000 256 $128,000   As well as the cost of the servers, there would be additional costs for networking, racks etc. Hosting an environment of 256 servers on-premise would require a server room with cooling, and some pretty hefty power cabling. The Windows Azure compute services provide worker roles, which are ideal for performing processor intensive compute tasks. With the scalability available in Windows Azure a job that takes 256 hours to complete could be perfumed using different numbers of worker roles. The time and cost of using 1, 16 or 256 worker roles is shown below. Number of Worker Roles Render Time Cost 1 256 hours $30.72 16 16 hours $30.72 256 1 hour $30.72   Using worker roles in Windows Azure provides the same cost for the 256 hour job, irrespective of the number of worker roles used. Provided the compute task can be broken down into many small units, and the worker role compute power can be used effectively, it makes sense to scale the application so that the task is completed quickly, making the results available in a timely fashion. The task of rendering 2,000 frames in an animation is one that can easily be broken down into 2,000 individual pieces, which can be performed by a number of worker roles. Creating a Render Farm in Windows Azure The architecture of the render farm is shown in the following diagram. The render farm is a hybrid application with the following components: ·         On-Premise o   Windows Kinect – Used combined with the Kinect Explorer to create a stream of depth images. o   Animation Creator – This application uses the depth images from the Kinect sensor to create scene description files for PolyRay. These files are then uploaded to the jobs blob container, and job messages added to the jobs queue. o   Process Monitor – This application queries the role instance lifecycle table and displays statistics about the render farm environment and render process. o   Image Downloader – This application polls the image queue and downloads the rendered animation files once they are complete. ·         Windows Azure o   Azure Storage – Queues and blobs are used for the scene description files and completed frames. A table is used to store the statistics about the rendering environment.   The architecture of each worker role is shown below.   The worker role is configured to use local storage, which provides file storage on the worker role instance that can be use by the applications to render the image and transform the format of the image. 
The service definition for the worker role with the local storage configuration highlighted is shown below. <?xml version="1.0" encoding="utf-8"?> <ServiceDefinition name="CloudRay" >   <WorkerRole name="CloudRayWorkerRole" vmsize="Small">     <Imports>     </Imports>     <ConfigurationSettings>       <Setting name="DataConnectionString" />     </ConfigurationSettings>     <LocalResources>       <LocalStorage name="RayFolder" cleanOnRoleRecycle="true" />     </LocalResources>   </WorkerRole> </ServiceDefinition>     The two executable programs, PolyRay.exe and DTA.exe are included in the Azure project, with Copy Always set as the property. PolyRay will take the scene description file and render it to a Truevision TGA file. As the TGA format has not seen much use since the mid 90’s it is converted to a JPG image using Dave's Targa Animator, another shareware application from the 90’s. Each worker roll will use the following process to render the animation frames. 1.       The worker process polls the job queue, if a job is available the scene description file is downloaded from blob storage to local storage. 2.       PolyRay.exe is started in a process with the appropriate command line arguments to render the image as a TGA file. 3.       DTA.exe is started in a process with the appropriate command line arguments convert the TGA file to a JPG file. 4.       The JPG file is uploaded from local storage to the images blob container. 5.       A message is placed on the images queue to indicate a new image is available for download. 6.       The job message is deleted from the job queue. 7.       The role instance lifecycle table is updated with statistics on the number of frames rendered by the worker role instance, and the CPU time used. The code for this is shown below. public override void Run() {     // Set environment variables     string polyRayPath = Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), PolyRayLocation);     string dtaPath = Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), DTALocation);       LocalResource rayStorage = RoleEnvironment.GetLocalResource("RayFolder");     string localStorageRootPath = rayStorage.RootPath;       JobQueue jobQueue = new JobQueue("renderjobs");     JobQueue downloadQueue = new JobQueue("renderimagedownloadjobs");     CloudRayBlob sceneBlob = new CloudRayBlob("scenes");     CloudRayBlob imageBlob = new CloudRayBlob("images");     RoleLifecycleDataSource roleLifecycleDataSource = new RoleLifecycleDataSource();       Frames = 0;       while (true)     {         // Get the render job from the queue         CloudQueueMessage jobMsg = jobQueue.Get();           if (jobMsg != null)         {             // Get the file details             string sceneFile = jobMsg.AsString;             string tgaFile = sceneFile.Replace(".pi", ".tga");             string jpgFile = sceneFile.Replace(".pi", ".jpg");               string sceneFilePath = Path.Combine(localStorageRootPath, sceneFile);             string tgaFilePath = Path.Combine(localStorageRootPath, tgaFile);             string jpgFilePath = Path.Combine(localStorageRootPath, jpgFile);               // Copy the scene file to local storage             sceneBlob.DownloadFile(sceneFilePath);               // Run the ray tracer.             
string polyrayArguments =                 string.Format("\"{0}\" -o \"{1}\" -a 2", sceneFilePath, tgaFilePath);             Process polyRayProcess = new Process();             polyRayProcess.StartInfo.FileName =                 Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), polyRayPath);             polyRayProcess.StartInfo.Arguments = polyrayArguments;             polyRayProcess.Start();             polyRayProcess.WaitForExit();               // Convert the image             string dtaArguments =                 string.Format(" {0} /FJ /P{1}", tgaFilePath, Path.GetDirectoryName (jpgFilePath));             Process dtaProcess = new Process();             dtaProcess.StartInfo.FileName =                 Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), dtaPath);             dtaProcess.StartInfo.Arguments = dtaArguments;             dtaProcess.Start();             dtaProcess.WaitForExit();               // Upload the image to blob storage             imageBlob.UploadFile(jpgFilePath);               // Add a download job.             downloadQueue.Add(jpgFile);               // Delete the render job message             jobQueue.Delete(jobMsg);               Frames++;         }         else         {             Thread.Sleep(1000);         }           // Log the worker role activity.         roleLifecycleDataSource.Alive             ("CloudRayWorker", RoleLifecycleDataSource.RoleLifecycleId, Frames);     } }     Monitoring Worker Role Instance Lifecycle In order to get more accurate statistics about the lifecycle of the worker role instances used to render the animation data was tracked in an Azure storage table. The following class was used to track the worker role lifecycles in Azure storage.   public class RoleLifecycle : TableServiceEntity {     public string ServerName { get; set; }     public string Status { get; set; }     public DateTime StartTime { get; set; }     public DateTime EndTime { get; set; }     public long SecondsRunning { get; set; }     public DateTime LastActiveTime { get; set; }     public int Frames { get; set; }     public string Comment { get; set; }       public RoleLifecycle()     {     }       public RoleLifecycle(string roleName)     {         PartitionKey = roleName;         RowKey = Utils.GetAscendingRowKey();         Status = "Started";         StartTime = DateTime.UtcNow;         LastActiveTime = StartTime;         EndTime = StartTime;         SecondsRunning = 0;         Frames = 0;     } }     A new instance of this class is created and added to the storage table when the role starts. It is then updated each time the worker renders a frame to record the total number of frames rendered and the total processing time. These statistics are used be the monitoring application to determine the effectiveness of use of resources in the render farm. Rendering the Animation The Azure solution was deployed to Windows Azure with the service configuration set to 16 worker role instances. This allows for the application to be tested in the cloud environment, and the performance of the application determined. When I demo the application at conferences and user groups I often start with 16 instances, and then scale up the application to the full 256 instances. The configuration to run 16 instances is shown below. 
<?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="CloudRay" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">   <Role name="CloudRayWorkerRole">     <Instances count="16" />     <ConfigurationSettings>       <Setting name="DataConnectionString"         value="DefaultEndpointsProtocol=https;AccountName=cloudraydata;AccountKey=..." />     </ConfigurationSettings>   </Role> </ServiceConfiguration>     About six minutes after deploying the application the first worker roles become active and start to render the first frames of the animation. The CloudRay Monitor application displays an icon for each worker role instance, with a number indicating the number of frames that the worker role has rendered. The statistics on the left show the number of active worker roles and statistics about the render process. The render time is the time since the first worker role became active; the CPU time is the total amount of processing time used by all worker role instances to render the frames.   Five minutes after the first worker role became active the last of the 16 worker roles activated. By this time the first seven worker roles had each rendered one frame of the animation.   With 16 worker roles u and running it can be seen that one hour and 45 minutes CPU time has been used to render 32 frames with a render time of just under 10 minutes.     At this rate it would take over 10 hours to render the 2,000 frames of the full animation. In order to complete the animation in under an hour more processing power will be required. Scaling the render farm from 16 instances to 256 instances is easy using the new management portal. The slider is set to 256 instances, and the configuration saved. We do not need to re-deploy the application, and the 16 instances that are up and running will not be affected. Alternatively, the configuration file for the Azure service could be modified to specify 256 instances.   <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="CloudRay" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">   <Role name="CloudRayWorkerRole">     <Instances count="256" />     <ConfigurationSettings>       <Setting name="DataConnectionString"         value="DefaultEndpointsProtocol=https;AccountName=cloudraydata;AccountKey=..." />     </ConfigurationSettings>   </Role> </ServiceConfiguration>     Six minutes after the new configuration has been applied 75 new worker roles have activated and are processing their first frames.   Five minutes later the full configuration of 256 worker roles is up and running. We can see that the average rate of frame rendering has increased from 3 to 12 frames per minute, and that over 17 hours of CPU time has been utilized in 23 minutes. In this test the time to provision 140 worker roles was about 11 minutes, which works out at about one every five seconds.   We are now half way through the rendering, with 1,000 frames complete. This has utilized just under three days of CPU time in a little over 35 minutes.   The animation is now complete, with 2,000 frames rendered in a little over 52 minutes. The CPU time used by the 256 worker roles is 6 days, 7 hours and 22 minutes with an average frame rate of 38 frames per minute. The rendering of the last 1,000 frames took 16 minutes 27 seconds, which works out at a rendering rate of 60 frames per minute. 
The frame counts in the server instances indicate that the use of a queue to distribute the workload has been very effective in spreading the load across the 256 worker role instances. The 16 instances that were deployed first have rendered between 11 and 13 frames each, whilst the 240 instances that were added when the application was scaled have rendered between 6 and 9 frames each.

Completed Animation

I’ve uploaded the completed animation to YouTube; a low resolution preview is shown below.

Pin Board Animation Created using Windows Kinect and 256 Windows Azure Worker Roles

The animation can be viewed in 1280x720 resolution at the following link: http://www.youtube.com/watch?v=n5jy6bvSxWc

Effective Use of Resources

According to the CloudRay monitor statistics the animation took 6 days, 7 hours and 22 minutes of CPU time to render, which works out at 152 hours of compute time, rounded up to the nearest hour. As usage of the worker role instances is billed for the full hour, it may have been possible to render the animation using fewer than 256 worker roles. When deciding the optimal usage of resources, the time required to provision and start the worker roles must also be considered. In the demo I started with 16 worker roles, and then scaled the application to 256 worker roles. It would have been more optimal to start the application with maybe 200 worker roles and utilize the full hour that I was being billed for. This would, however, have prevented showing the ease of scalability of the application. The new management portal displays the CPU usage across the worker roles in the deployment. The average CPU usage across all instances is 93.27%, with over 99% used when all the instances are up and running. This shows that the worker role resources are being used very effectively.

Grid Computing Scenarios

Although I am using this scenario for a hobby project, there are many scenarios where a large amount of compute power is required for a short period of time. Windows Azure provides a great platform for developing these types of grid computing applications, and can work out to be very cost effective.

·  Windows Azure can provide massive compute power, on demand, in a matter of minutes.
·  The use of queues to manage the load balancing of jobs between role instances is a simple and effective solution.
·  Using a cloud-computing platform like Windows Azure allows proof-of-concept scenarios to be tested and evaluated on a very low budget.
·  No charges for inbound data transfer make the uploading of large data sets to Windows Azure Storage services cost effective. (Transaction charges still apply.)

Tips for using Windows Azure for Grid Computing Scenarios

I found a render farm to be a fairly simple scenario to implement using Windows Azure. I was impressed by the ease of scalability that Azure provides, and by the short time that the application took to scale from 16 to 256 worker role instances. In this case it was around 13 minutes; in other tests it took between 10 and 20 minutes. The following tips may be useful when implementing a grid computing project in Windows Azure.

·  Using an Azure Storage queue to load-balance the units of work across multiple worker roles is simple and very effective. The design I have used in this scenario could easily scale to many thousands of worker role instances.
·  Windows Azure accounts are typically limited to 20 cores. If you need to use more than this, a call to support and a credit card check will be required.
·  Be aware of how the billing model works. You will be charged for worker role instances for the full clock hour in which the instance is deployed. Schedule the workload to start just after the clock hour has started.
·  Monitor the utilization of the resources you are provisioning, and ensure that you are not paying for worker roles that are idle (a minimal aggregation sketch follows this list).
·  If you are deploying third party applications to worker roles, you may well run into licensing issues. Purchasing software licenses on a per-processor basis when using hundreds of processors for a short time period would not be cost effective.
·  Third party software may also require installation onto the worker roles, which can be accomplished using start-up tasks. Bear in mind that adding a startup task and a possible re-boot will add to the time required for the worker role instance to start and activate. An alternative may be to use a prepared VM and use VM roles.
·  Consider using the Windows Azure Autoscaling Application Block (WASABi) to autoscale the worker roles in your application. When using a large number of worker roles, the utilization must be carefully monitored; if the scaling algorithms are not optimal it could get very expensive!
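As a concrete illustration of the monitoring tip above, the sketch below aggregates a set of RoleLifecycle entries (the class shown earlier) once they have been read from the storage table. The helper is mine and assumes the entries have already been materialised into memory; it treats SecondsRunning as the total processing time recorded for an instance.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    static class RenderFarmStats
    {
        // Summarise worker role activity from RoleLifecycle rows read from table storage.
        public static void Summarise(IEnumerable<RoleLifecycle> lifecycles)
        {
            var rows = lifecycles.ToList();

            int totalFrames = rows.Sum(r => r.Frames);
            TimeSpan totalProcessing = TimeSpan.FromSeconds(rows.Sum(r => r.SecondsRunning));
            int idleInstances = rows.Count(r => r.Frames == 0); // billed, but rendering nothing

            Console.WriteLine("Instances: {0} ({1} idle)", rows.Count, idleInstances);
            Console.WriteLine("Frames rendered: {0}", totalFrames);
            Console.WriteLine("Total processing time: {0:F1} hours", totalProcessing.TotalHours);
        }
    }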

    Read the article

  • Windows Phone appointment task

    - by Dennis Vroegop
    Originally posted on: http://geekswithblogs.net/dvroegop/archive/2014/08/10/windows-phone-appointment-task.aspx I am currently working on a new version of my AgeInDays app for Windows Phone. This app calculates how old you are in days (or weeks, depending on your preferences). The inspiration for this app came from my father, who once told me he proposed to my mother when she was 1000 weeks old. That left me wondering: how old in weeks or days am I? And being the geek I am, I wrote an app for it. If you have a Windows Phone, you can find it at http://www.windowsphone.com/en-in/store/app/age-in-days/7ed03603-0e00-4214-ad04-ce56773e5dab A new version of the app was published quite quickly, adding the possibility to mark a date in your agenda when you would have reached a certain age. Of course the logic behind this is extremely simple. Just take a DateTime, populate it with the given date from the DatePicker, then call AddDays(numDays) and voila, you have the date. Now all I had to do was implement a way to store this in the user's calendar so he would get a reminder when that date occurred. Luckily, the Windows Phone SDK makes that extremely simple:

    public void PublishTask(DateTime occuranceDate, string message)
    {
        var task = new SaveAppointmentTask()
        {
            StartTime = occuranceDate,
            EndTime = occuranceDate,
            Subject = message,
            Location = string.Empty,
            IsAllDayEvent = true,
            Reminder = Reminder.None,
            AppointmentStatus = AppointmentStatus.Free
        };

        task.Show();
    }

    And that's it. Whenever I call the PublishTask method an appointment will be made and put in the calendar. Well, not exactly: a template will be made for that appointment and the user will see that template, giving him the option to either discard or save the reminder. The user can also make changes before submitting this to the calendar: it would be useful to be able to change the text in the agenda, and that's exactly what this allows you to do. Now, at the bottom of the screen you will see the option "Occurs". This tiny field is what this post is about. You cannot set it from the code. I want to be able to have repeating items in my agenda. Say, for instance, you're counting down to a certain date; I want to be able to give you that option as well. However, I cannot. The field "Occurs" is not part of the Task you create in code. Of course, you could create a whole series of events yourself. Have the "Occurs" field in your own user interface and make all the appointments. But that's not the same. First, the system doesn't recognize them as part of a series. That means if you want to change the text later on in one of the occurrences, it will not ask you if you want to open this one or the whole series. More importantly, however, the user has to acknowledge each and every single occurrence and save it into the agenda. Now, I understand why they implemented the system in such a way that the user has to approve an entry. You don't want apps to automatically fill your agenda with messages such as "Remember to pay for my app!". But why not include the "Occurs" option? The user can still opt out if they see this happening. I hope an update will fix this soon. But for now: you just have to count down to your birthday yourself. My app won't support this.
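    For completeness, here is a minimal sketch of the date calculation described above feeding into the PublishTask method from the post. The helper name, the numDays parameter and the way the birth date is obtained are my assumptions, not code from the app:

    // Work out the date on which the user reaches a given age in days and
    // hand it to PublishTask, which shows the appointment template.
    private void ScheduleMilestone(DateTime birthDate, int numDays)
    {
        DateTime milestone = birthDate.AddDays(numDays);
        PublishTask(milestone, string.Format("Today you are {0} days old!", numDays));
    }

    // e.g. ScheduleMilestone(selectedBirthDate, 7000);  // 1000 weeks old, as in the proposal story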

    Read the article

  • When does a Tumbling Window Start in StreamInsight

    Whilst getting some courseware ready I was playing around writing some code, and I decided to very simply show when a window starts and ends when you ask for a TumblingWindow of n time units in StreamInsight.  I thought this was going to be a two-second thing, but what I found is something I haven't yet seen documented anywhere until now.   All this code is written in C# and will slot straight into my favourite quick-win dev tool, LinqPad.

    Let's first create a sample dataset:

    var EnumerableCollection = new []
    {
        new {id = 1, StartTime = DateTime.Parse("2010-10-01 12:00:00 PM").ToLocalTime()},
        new {id = 2, StartTime = DateTime.Parse("2010-10-01 12:20:00 PM").ToLocalTime()},
        new {id = 3, StartTime = DateTime.Parse("2010-10-01 12:30:00 PM").ToLocalTime()},
        new {id = 4, StartTime = DateTime.Parse("2010-10-01 12:40:00 PM").ToLocalTime()},
        new {id = 5, StartTime = DateTime.Parse("2010-10-01 12:50:00 PM").ToLocalTime()},
        new {id = 6, StartTime = DateTime.Parse("2010-10-01 01:00:00 PM").ToLocalTime()},
        new {id = 7, StartTime = DateTime.Parse("2010-10-01 01:10:00 PM").ToLocalTime()},
        new {id = 8, StartTime = DateTime.Parse("2010-10-01 02:00:00 PM").ToLocalTime()},
        new {id = 9, StartTime = DateTime.Parse("2010-10-01 03:20:00 PM").ToLocalTime()},
        new {id = 10, StartTime = DateTime.Parse("2010-10-01 03:30:00 PM").ToLocalTime()},
        new {id = 11, StartTime = DateTime.Parse("2010-10-01 04:40:00 PM").ToLocalTime()},
        new {id = 12, StartTime = DateTime.Parse("2010-10-01 04:50:00 PM").ToLocalTime()},
        new {id = 13, StartTime = DateTime.Parse("2010-10-01 05:00:00 PM").ToLocalTime()},
        new {id = 14, StartTime = DateTime.Parse("2010-10-01 05:10:00 PM").ToLocalTime()}
    };

    Now let's create a stream of point events:

    var inputStream = EnumerableCollection
        .ToPointStream(Application, evt => PointEvent.CreateInsert(evt.StartTime, evt),
            AdvanceTimeSettings.StrictlyIncreasingStartTime);

    Now we can create our windows over the stream.  The first window we will create is a one hour tumbling window.  We'll count the events in the window, but what we do here is not the point; the point is our window edges.

    var windowedStream = from win in inputStream.TumblingWindow(TimeSpan.FromHours(1), HoppingWindowOutputPolicy.ClipToWindowEnd)
                         select new {CountOfEntries = win.Count()};

    Now we can have a look at what we get.  I am only going to show the first non-CTI event, as that is enough to demonstrate what is going on.

    windowedStream.ToIntervalEnumerable().First(e => e.EventKind == EventKind.Insert).Dump("First Row from Windowed Stream");

    The results are below:

    EventKind   Insert
    StartTime   01/10/2010 12:00
    EndTime     01/10/2010 13:00
    Payload     { CountOfEntries = 5 }
                CountOfEntries   5

    Now this makes sense, and this is the window width quite often specified in examples.  So what happens if I change the windowing code to

    var windowedStream = from win in inputStream.TumblingWindow(TimeSpan.FromHours(5), HoppingWindowOutputPolicy.ClipToWindowEnd)
                         select new {CountOfEntries = win.Count()};

    Now where does your window start?  What about

    var windowedStream = from win in inputStream.TumblingWindow(TimeSpan.FromMinutes(13), HoppingWindowOutputPolicy.ClipToWindowEnd)
                         select new {CountOfEntries = win.Count()};

    Well, for the first example your window will start at 01/10/2010 10:00:00, and for the second example it will start at 01/10/2010 11:55:00. Surprised?   Here is the reason why, and thanks to the StreamInsight team for listening.   Windows start at TimeSpan.MinValue.
Windows of the size you specified in your code are then created end to end from that point onwards.  If a window contains no events, it is not produced by the engine in the output.  This is why window start times can fall before the first event is created.
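To make the alignment concrete, here is a minimal sketch (my code, not the engine's) that computes which tumbling window contains a given event when windows are laid end to end from TimeSpan.MinValue as described above. BigInteger is used only because the distance from TimeSpan.MinValue does not fit in a long:

    using System;
    using System.Numerics;

    static class WindowAlignment
    {
        // Start of the tumbling window of the given size that contains eventTime,
        // assuming windows are aligned to TimeSpan.MinValue.
        public static DateTime WindowStart(DateTime eventTime, TimeSpan windowSize)
        {
            BigInteger origin = TimeSpan.MinValue.Ticks;
            BigInteger size = windowSize.Ticks;
            BigInteger ticksIntoWindow = (eventTime.Ticks - origin) % size;
            return new DateTime(eventTime.Ticks - (long)ticksIntoWindow, eventTime.Kind);
        }
    }

    // For example, WindowStart(new DateTime(2010, 10, 1, 12, 0, 0), TimeSpan.FromMinutes(13))
    // returns a time several minutes before noon rather than an obvious "round" boundary.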

    Read the article

  • XtraGrid Suite - is there a way to add a button or hyperlink to a cell?

    - by calico-cat
    I'm working with the XtraGrid Suite made by DevExpress. I can't find any sort of functionality to do this, but I'm curious if you can add a button or hyperlink to a grid cell. Context: I've got an Events list. Each Event has a Time, Start/End, and a Category (Utility and Maintenance). There can be Start events and Stop events. Having done my analysis of the problem, I've decided that having a StartTime and EndTime for each event would not work. So if an event starts, I'd record the current time to the Event object and set it as a 'Start' event. I'd like to add a "Stop" button/hyperlink to a cell in that row. If the user wishes to log an End event, the event type, etc. would be copied to a new Event with the type 'Stop' and the button would disappear. I hope this makes sense. EDIT: Aaronaught's answer is actually better than what I was originally asking for (a button), so I've updated the question. That way, anyone looking to put a hyperlink in a cell can benefit from his example : )

    Read the article

  • Notification CeSetUserNotificationEx with custom sound

    - by inTagger
    Hail all! I want to display a notification and play a custom sound on my Windows Mobile 5/6 device. I have tried something like the code below, but my custom sound does not play, though the message is displayed with the standard sound. If I edit the Wave key in [HKEY_CURRENT_USER\ControlPanel\Notifications{15F11F90-8A5F-454c-89FC-BA9B7AAB0CAD}] to the sound file I need, then it plays okay. But why are there a NotificationAction.Sound flag and a UserNotification.Sound property? They don't work. Vibration and the LED don't work either if I use those flags. (You can obtain the full project sources from http://dl.dropbox.com/u/1758206/Code/Thunder.zip) var trigger = new UserNotificationTrigger { StartTime = DateTime.Now + TimeSpan.FromSeconds(1), Type = NotificationType.ClassicTime }; var userNotification = new UserNotification { Sound = @"\Windows\Alarm1.wma", Text = "Hail from Penza, Russia!", Action = NotificationAction.Dialog | NotificationAction.Sound, Title = string.Empty, MaxSound = 16384 }; NotificationTools.SetUserNotification(0, trigger, userNotification); UserNotificationTrigger.cs: using System; using System.Runtime.InteropServices; namespace Thunder.Lib.ThunderMethod1 { /// <summary> /// Specifies the type of notification. /// </summary> public enum NotificationType { /// <summary> /// Equivalent to using the SetUserNotification function. /// The standard command line is supplied. /// </summary> ClassicTime = 4, /// <summary> /// System event notification. /// </summary> Event = 1, /// <summary> /// Time-based notification that is active for the time period between StartTime and EndTime. /// </summary> Period = 3, /// <summary> /// Time-based notification. /// </summary> Time = 2 } /// <summary> /// System Event Flags /// </summary> public enum NotificationEvent { None, TimeChange, SyncEnd, OnACPower, OffACPower, NetConnect, NetDisconnect, DeviceChange, IRDiscovered, RS232Detected, RestoreEnd, Wakeup, TimeZoneChange, MachineNameChange, RndisFNDetected, InternetProxyChange } /// <summary> /// Defines what event activates a notification. /// </summary> [StructLayout(LayoutKind.Sequential)] public class UserNotificationTrigger { internal int dwSize = 52; private int dwType; private int dwEvent; [MarshalAs(UnmanagedType.LPWStr)] private string lpszApplication = string.Empty; [MarshalAs(UnmanagedType.LPWStr)] private string lpszArguments; internal SYSTEMTIME stStartTime; internal SYSTEMTIME stEndTime; /// <summary> /// Specifies the type of notification. /// </summary> public NotificationType Type { get { return (NotificationType) dwType; } set { dwType = (int) value; } } /// <summary> /// Specifies the type of event should Type = Event. /// </summary> public NotificationEvent Event { get { return (NotificationEvent) dwEvent; } set { dwEvent = (int) value; } } /// <summary> /// Name of the application to execute. /// </summary> public string Application { get { return lpszApplication; } set { lpszApplication = value; } } /// <summary> /// Command line (without the application name). /// </summary> public string Arguments { get { return lpszArguments; } set { lpszArguments = value; } } /// <summary> /// Specifies the beginning of the notification period. /// </summary> public DateTime StartTime { get { return stStartTime.ToDateTime(); } set { stStartTime = SYSTEMTIME.FromDateTime(value); } } /// <summary> /// Specifies the end of the notification period.
/// </summary> public DateTime EndTime { get { return stEndTime.ToDateTime(); } set { stEndTime = SYSTEMTIME.FromDateTime(value); } } } } UserNotification.cs: using System.Runtime.InteropServices; namespace Thunder.Lib.ThunderMethod1 { /// <summary> /// Contains information used for a user notification. /// </summary> [StructLayout(LayoutKind.Sequential)] public class UserNotification { private int ActionFlags; [MarshalAs(UnmanagedType.LPWStr)] private string pwszDialogTitle; [MarshalAs(UnmanagedType.LPWStr)] private string pwszDialogText; [MarshalAs(UnmanagedType.LPWStr)] private string pwszSound; private int nMaxSound; private int dwReserved; /// <summary> /// Any combination of the <see cref="T:Thunder.Lib.NotificationAction" /> members. /// </summary> /// <value>Flags which specifies the action(s) to be taken when the notification is triggered.</value> /// <remarks>Flags not valid on a given hardware platform will be ignored.</remarks> public NotificationAction Action { get { return (NotificationAction) ActionFlags; } set { ActionFlags = (int) value; } } /// <summary> /// Required if NotificationAction.Dialog is set, ignored otherwise /// </summary> public string Title { get { return pwszDialogTitle; } set { pwszDialogTitle = value; } } /// <summary> /// Required if NotificationAction.Dialog is set, ignored otherwise. /// </summary> public string Text { get { return pwszDialogText; } set { pwszDialogText = value; } } /// <summary> /// Sound string as supplied to PlaySound. /// </summary> public string Sound { get { return pwszSound; } set { pwszSound = value; } } public int MaxSound { get { return nMaxSound; } set { nMaxSound = value; } } } } NativeMethods.cs: using System; using System.Runtime.InteropServices; namespace Thunder.Lib.ThunderMethod1 { [StructLayout(LayoutKind.Sequential)] public struct SYSTEMTIME { public short wYear; public short wMonth; public short wDayOfWeek; public short wDay; public short wHour; public short wMinute; public short wSecond; public short wMillisecond; public static SYSTEMTIME FromDateTime(DateTime dt) { return new SYSTEMTIME { wYear = (short) dt.Year, wMonth = (short) dt.Month, wDayOfWeek = (short) dt.DayOfWeek, wDay = (short) dt.Day, wHour = (short) dt.Hour, wMinute = (short) dt.Minute, wSecond = (short) dt.Second, wMillisecond = (short) dt.Millisecond }; } public DateTime ToDateTime() { if ((((wYear == 0) && (wMonth == 0)) && ((wDay == 0) && (wHour == 0))) && ((wMinute == 0) && (wSecond == 0))) return DateTime.MinValue; return new DateTime(wYear, wMonth, wDay, wHour, wMinute, wSecond, wMillisecond); } } /// <summary> /// Specifies the action to take when a notification event occurs. /// </summary> [Flags] public enum NotificationAction { /// <summary> /// Displays the user notification dialog box. /// </summary> Dialog = 4, /// <summary> /// Flashes the LED. /// </summary> Led = 1, /// <summary> /// Dialog box z-order flag. /// Set if the notification dialog box should come up behind the password. /// </summary> Private = 32, /// <summary> /// Repeats the sound for 10–15 seconds. /// </summary> Repeat = 16, /// <summary> /// Plays the sound specified. /// </summary> Sound = 8, /// <summary> /// Vibrates the device. 
/// </summary> Vibrate = 2 } internal class NativeMethods { [DllImport("coredll.dll", CallingConvention = CallingConvention.Winapi, CharSet = CharSet.Unicode, SetLastError = true)] internal static extern int CeSetUserNotificationEx(int hNotification, UserNotificationTrigger lpTrigger, UserNotification lpUserNotification); } } NotificationTools.cs: using System.ComponentModel; using System.Runtime.InteropServices; namespace Thunder.Lib.ThunderMethod1 { public static class NotificationTools { /// <summary> /// This function modifies an existing user notification. /// </summary> /// <param name="handle">Handle of the Notification to be modified</param> /// <param name="trigger">A UserNotificationTrigger that defines what event activates a notification.</param> /// <param name="notification">A UserNotification that defines how the system should respond when a notification occurs.</param> /// <returns>Handle to the notification event if successful.</returns> public static int SetUserNotification(int handle, UserNotificationTrigger trigger, UserNotification notification) { int num = NativeMethods.CeSetUserNotificationEx(handle, trigger, notification); if (num == 0) throw new Win32Exception(Marshal.GetLastWin32Error(), "Error setting UserNotification"); return num; } } }

    Read the article

  • Can a call to WaitHandle.SignalAndWait be ignored for performance profiling purposes?

    - by Dan Tao
    I just downloaded the trial version of ANTS Performance Profiler from Red Gate and am investigating some of my team's code. Immediately I notice that there's a particular section of code that ANTS is reporting as eating up to 99% CPU time. I am completely unfamiliar with ANTS or performance profiling in general (that is, aside from self-profiling using what I'm sure are extremely crude and frowned-upon methods such as double timeToComplete = (endTime - startTime).TotalSeconds), so I'm still fiddling around with the application and figuring out how it's used. But I did call the developer responsible for the code in question and his immediate reaction was "Yeah, that doesn't surprise me that it says that; but that code calls SignalAndWait [which I could see for myself, thanks to ANTS], which doesn't use any CPU, it just sits there waiting for something to do." He advised me to simply ignore that code and look for anything ELSE I could find. My question: is it true that SignalAndWait requires NO CPU overhead (and if so, how is this possible?), and is it reasonable that a performance profiler would view it as taking up 99% CPU time? I find this particularly curious because, if it's at 99%, that would suggest that our application is often idle, wouldn't it? And yet its performance has become rather sluggish lately. Like I said, I really am just a beginner when it comes to this tool, and I don't know anything about the WaitHandle class. So ANY information to help me to understand what's going on here would be appreciated.
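    The distinction being drawn here is between wall-clock time and CPU time: a thread blocked in SignalAndWait is descheduled, so it accumulates elapsed time but almost no processor time, and profilers differ in which of the two they report. A minimal sketch, unrelated to the profiled code base, that makes the difference visible:

    using System;
    using System.Diagnostics;
    using System.Threading;

    class WaitVersusCpu
    {
        static void Main()
        {
            var toSignal = new ManualResetEvent(false);
            var never = new ManualResetEvent(false);   // nothing ever signals this one

            var process = Process.GetCurrentProcess();
            TimeSpan cpuBefore = process.TotalProcessorTime;
            var clock = Stopwatch.StartNew();

            // Signal the first handle and block on the second until the five second timeout expires.
            WaitHandle.SignalAndWait(toSignal, never, TimeSpan.FromSeconds(5), false);

            clock.Stop();
            process.Refresh();

            // Wall-clock time is roughly five seconds; CPU time consumed while blocked is near zero.
            Console.WriteLine("Elapsed: {0:F1} s", clock.Elapsed.TotalSeconds);
            Console.WriteLine("CPU:     {0:F3} s", (process.TotalProcessorTime - cpuBefore).TotalSeconds);
        }
    }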

    Read the article

  • slow SQL command

    - by Retrocoder
    I need to take some data from one table (and expand some XML on the way) and put it in another table. As the source table can have thousands of records, which caused a timeout, I decided to do it in batches of 100 records. The code is run on a schedule, so doing it in batches works OK for the customer. If I have, say, 200 records in the source database the sproc runs very fast, but if there are thousands it takes several minutes. I'm guessing that the "TOP 100" only takes the top 100 after it has gone through all the records. I need to change the whole code and sproc at some point as it doesn't scale, but for now is there a quick fix to make this run quicker?

    INSERT INTO [deviceManager].[TransactionLogStores]
    SELECT TOP 100
        [EventId],
        [message].value('(/interface/mac)[1]', 'nvarchar(100)') AS mac,
        [message].value('(/interface/device) [1]', 'nvarchar(100)') AS device_type,
        [message].value('(/interface/id) [1]', 'nvarchar(100)') AS device_id,
        [message].value('substring(string((/interface/id)[1]), 1, 6)', 'nvarchar(100)') AS store_id,
        [message].value('(/interface/terminal/unit)[1]', 'nvarchar(100)') AS unit,
        [message].value('(/interface/terminal/trans/event)[1]', 'nvarchar(100)') AS event_id,
        [message].value('(/interface/terminal/trans/data)[1]', 'nvarchar(100)') AS event_data,
        [message].value('substring(string((/interface/terminal/trans/data)[1]), 9, 11)', 'nvarchar(100)') AS badge,
        [message].value('(/interface/terminal/trans/time)[1]', 'nvarchar(100)') AS terminal_time,
        MessageRecievedAt_UTC AS db_time
    FROM [deviceManager].[TransactionLog]
    WHERE EventId > @EventId
    --WHERE MessageRecievedAt_UTC > @StartTime AND MessageRecievedAt_UTC < @EndTime
    ORDER BY terminal_time DESC

    Read the article

  • Datatemplate binding

    - by Lasse O
    How can I achieve something like this:

    <ListView Name="OverviewTitlesListView" ItemsSource="{Binding OverviewTitlesCollection}">
        <ListView.View>
            <GridView>
                <GridViewColumn Header="Index" Width="60" DisplayMemberBinding="{Binding TitleIndex}"/>
                <GridViewColumn Header="Start Time" Width="100" DisplayMemberBinding="{Binding StartTime}"/>
                <GridViewColumn Header="End Time" Width="100" DisplayMemberBinding="{Binding EndTime}"/>
                <GridViewColumn Header="Title Text" Width="550" DisplayMemberBinding="{Binding Text}"/>
                <GridViewColumn Header="Approved" Width="80">
                    <GridViewColumn.CellTemplate>
                        <DataTemplate>
                            <TextBlock Name="Test"/>
                            <DataTemplate.Triggers>
                                <Trigger Property="{Binding IsApproved}" Value="true">
                                    <Setter TargetName="Test" Property="Text" Value="Approved"/>
                                </Trigger>
                                <Trigger Property="{Binding IsApproved}" Value="false">
                                    <Setter TargetName="Test" Property="Text" Value="Not Approved"/>
                                </Trigger>
                            </DataTemplate.Triggers>
                        </DataTemplate>
                    </GridViewColumn.CellTemplate>
                </GridViewColumn>
            </GridView>
        </ListView.View>
    </ListView>

    When the IsApproved property changes on my object in OverviewTitlesCollection, I want to control the text of the TextBlock. How can I control this with triggers in my DataTemplate?
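    As an alternative sketch (not necessarily the approach the ListView above has to take), the same effect can be had without triggers by binding Text through a value converter; the converter below is my own illustration:

    using System;
    using System.Globalization;
    using System.Windows.Data;

    // Maps the IsApproved flag onto the text shown in the "Approved" column.
    public class ApprovedTextConverter : IValueConverter
    {
        public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
        {
            return (value is bool && (bool)value) ? "Approved" : "Not Approved";
        }

        public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
        {
            throw new NotSupportedException();
        }
    }

    With the converter declared as a resource, the TextBlock becomes <TextBlock Text="{Binding IsApproved, Converter={StaticResource ApprovedTextConverter}}"/>. Staying with triggers is also possible: inside DataTemplate.Triggers a DataTrigger (which takes a Binding) is used where the snippet above uses Trigger (which expects a dependency property).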

    Read the article

  • gridview column popup window

    - by peter
    I want to implement an AJAX hover menu, and I have a GridView (GridView1) like this:

    <asp:GridView ID="GridView1" OnRowCommand="ScheduleGridView_RowCommand" runat="server"
        AutoGenerateColumns="False" Height="60px" Style="text-align: center" Width="869px"
        EnableViewState="False">
        <Columns>
            <asp:BoundField HeaderText="Topic" DataField="Topic" />
            <asp:BoundField DataField="Moderator" HeaderText="Moderator" />
            <asp:BoundField DataField="Expert" HeaderText="Expert" />
            <asp:BoundField DataField="StartTime" HeaderText="Start">
                <HeaderStyle Width="175px" />
            </asp:BoundField>
            <asp:BoundField DataField="EndTime" HeaderText="End">
                <HeaderStyle Width="175px" />
            </asp:BoundField>
            <asp:TemplateField HeaderText="Join" ShowHeader="False">
                <ItemTemplate>
                    <asp:Button ID="JoinBT" runat="server" CommandName="Join" Text="Join" Width="52px" />
                </ItemTemplate>
                <HeaderStyle Height="15px" />
            </asp:TemplateField>
        </Columns>
    </asp:GridView>

    So I registered <%@ Register Assembly="AjaxControlToolkit" Namespace="AjaxControlToolkit" TagPrefix="asp" %> and added the hover menu markup to the GridView columns, but I am getting a fixed Edit/Delete link in a new column rather than a hover menu. Can anyone tell me the solution to get the hover menu?

    Read the article

< Previous Page | 1 2 3 4 5 6  | Next Page >