Search Results

Search found 23545 results on 942 pages for 'parallel task library'.


  • Updating a script currently being run by Task Scheduler on Windows

    - by orangechicken
    I have a scheduled task that runs, on a (ahem) schedule, a script that updates a local git repo. The script itself is a file in that same local git repo. Currently, what I'm seeing is that the script runs, git complains that permission to write to the file is denied, and this actually results in the script being deleted! The next time the scheduled task runs, the script file is missing. How can I ensure that when I pull changes to this script from the repo, the file is actually updated?

    Read the article

  • Win Server 2008: Task Scheduler runs programs twice or late

    - by SomeName
    Hi, I need to restart a service every day. I have logon hours restricted at 3:00 am, and the server will log out existing TS connections. I have two tasks scheduled:
    "Daily At 3:20 am every day" / "start a program" / "c:\windows\system32\sc.exe stop myservice"
    "Daily At 3:22 am every day" / "start a program" / "c:\windows\system32\sc.exe start myservice"
    I came in today to notice that the service wasn't running. I've been digging in the logs and found these entries.
    For the stop task, history: a) 3:29:35 am: Action Completed (sc result code 0); b) 3:20:00 am: Action Completed (sc result code 0).
    For the start task, history: a) 3:29:35 am: Action Completed (sc result code ERROR_SERVICE_ALREADY_RUNNING 1056 (0x420)); b) 3:22:01 am: Action Completed (sc result code 0).
    Checking the event logs shows me: a) 3:29:35 am, Application log, source myservice, "The service was stopped"; b) 3:29:25 am, System log, source Service Control Manager, "The myservice service entered the stopped state".
    So, what would have caused both tasks to run at 3:29 am? Why don't I see a message from the SCM saying that the service entered the running state? Is this the preferred way to do this? Thanks!

    Read the article

  • Tracking Administrator account for Domain Controller

    - by Param
    Have you ever set up a Task Scheduler event notification via email for a password change or a failed logon attempt on the Administrator account? OK, let me elaborate. As we know, the Administrator / domain admin / enterprise admin account is very important, so I want to keep track of the following events: A) I must receive an email whenever the administrator account's password is changed, with date, time and IP address. B) I must receive an email notification whenever Administrator logs in successfully or unsuccessfully, with date, time and IP address. I am thinking of doing this with Task Scheduler event notifications. Have you ever done it this way? Thanks & Regards, Param

    Read the article

  • Windows Task Scheduler

    - by Zulakis
    I am trying to deploy an auto-starting program with Administrator privileges on our XP SP1 machines. For this, I am using the Windows Task Scheduler. Since most of our machines get deployed with a PXE imaging system, the task fails because the Administrator user entered is, for example, r126/Administrator. If I only enter Administrator, it automatically changes to machinename/Administrator. Since the machine names are automatically changed by the imaging system, the tasks fail to run. Any ideas on how to fix that?

    Read the article

  • Parallel.For System.OutOfMemoryException

    - by Martin Neal
    We have a fairly simple program that's used for creating backups. I'm attempting to parallelize it but am getting an OutOfMemoryException inside an AggregateException. Some of the source folders are quite large, and the program doesn't crash until about 40 minutes after it starts. I don't know where to start looking, so the code below is a near-exact dump of all the code, sans directory structure and exception-logging code. Any advice as to where to start looking?

    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Threading.Tasks;

    namespace SelfBackup
    {
        class Program
        {
            static readonly string[] saSrc = {
                "\\src\\dir1\\",
                //...
                "\\src\\dirN\\", //this folder is over 6 GB
            };
            static readonly string[] saDest = {
                "\\dest\\dir1\\",
                //...
                "\\dest\\dirN\\",
            };

            static void Main(string[] args)
            {
                Parallel.For(0, saDest.Length, i =>
                {
                    string sDest = saDest[i];
                    try
                    {
                        if (Directory.Exists(sDest))
                        {
                            //Delete directory first so old stuff gets cleaned up
                            Directory.Delete(sDest, true);
                        }
                        //recursive function
                        clsCopyDirectory.copyDirectory(saSrc[i], sDest);
                    }
                    catch (Exception e)
                    {
                        //standard error logging
                        CL.EmailError();
                    }
                });
            }
        }
    }

    ///////////////////////////////////////

    using System.IO;
    using System.Threading.Tasks;

    namespace SelfBackup
    {
        static class clsCopyDirectory
        {
            static public void copyDirectory(string Src, string Dst)
            {
                Directory.CreateDirectory(Dst);

                /* Copy all the files in the folder. If and when .NET 4.0 is installed,
                   change Directory.GetFiles to Directory.EnumerateFiles for slightly
                   better performance. */
                Parallel.ForEach<string>(Directory.GetFiles(Src), file =>
                {
                    /* An exception thrown here may be arbitrarily deep into this
                       recursive function; there's also a good chance that if one copy
                       fails here, so too will other files in the same directory, so we
                       don't want to spam out hundreds of error e-mails, but we don't
                       want to abort altogether either. Instead, the best solution is
                       probably to throw back up to the original caller of copyDirectory
                       and move on to the next Src/Dst pair by not catching any possible
                       exception here. */
                    File.Copy(file,                                        //src
                              Path.Combine(Dst, Path.GetFileName(file)),   //dest
                              true);                                       //bool overwrite
                });

                //Call this function again for every directory in the folder.
                Parallel.ForEach(Directory.GetDirectories(Src), dir =>
                {
                    copyDirectory(dir, Path.Combine(Dst, Path.GetFileName(dir)));
                });
            }
        }
    }
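
    A likely place to start looking is the nested parallelism: every recursive copyDirectory call starts its own Parallel.ForEach, and Directory.GetFiles materializes a full listing per folder, so on a 6 GB tree the queued work items and arrays can pile up. What follows is a minimal sketch, not the poster's program, of capping concurrency with ParallelOptions.MaxDegreeOfParallelism and using the streaming Directory.EnumerateFiles/EnumerateDirectories calls from .NET 4.0; the class name and the cap of 4 are illustrative only.

    using System.IO;
    using System.Threading.Tasks;

    static class ThrottledCopy
    {
        // Shared cap so nested loops can't multiply the number of concurrent copies.
        static readonly ParallelOptions Throttle = new ParallelOptions
        {
            MaxDegreeOfParallelism = 4   // arbitrary example value; tune for the machine
        };

        public static void CopyDirectory(string src, string dst)
        {
            Directory.CreateDirectory(dst);

            // EnumerateFiles streams entries instead of building one big array per folder.
            Parallel.ForEach(Directory.EnumerateFiles(src), Throttle, file =>
                File.Copy(file, Path.Combine(dst, Path.GetFileName(file)), true));

            Parallel.ForEach(Directory.EnumerateDirectories(src), Throttle, dir =>
                CopyDirectory(dir, Path.Combine(dst, Path.GetFileName(dir))));
        }
    }

    Called in place of clsCopyDirectory.copyDirectory from the same outer Parallel.For, this keeps the total number of in-flight copies near the cap instead of growing at each level of recursion.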

    Read the article

  • NHibernate, the Parallel Framework, and SQL Server

    - by andy
    Hey guys, we have a loop that:
    1. Loops over several thousand XML files. Altogether we're parsing millions of "user" nodes.
    2. In each iteration, parses a "user" XML fragment and does custom deserialization.
    3. Finally, in each iteration, sends the object to NHibernate for saving. We use: .SaveOrUpdateAndFlush(user);
    This is a lengthy process, and we thought it would be a perfect candidate for trying out the .NET 4.0 parallel libraries, so we wrapped the loop in a Parallel.ForEach(). After doing this, we started getting "random" timeout exceptions from SQL Server and finally, after leaving it running all night, unhandled OutOfMemory exceptions. I haven't done deep debugging on this yet, but what do you guys think? Is this simply a limitation of SQL Server, or could it be our NHibernate setup, or what? cheers andy
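
    A common culprit in this shape of code is sharing one NHibernate ISession across the parallel iterations: ISession is not thread-safe, and its first-level cache keeps a reference to every saved entity until it is cleared or disposed, which fits both the timeouts and the eventual OutOfMemory. Below is a minimal sketch of one alternative, opening a short-lived session per item; userFiles, ParseUser and the placeholder entity are invented names standing in for the poster's own parsing code.

    using System.Collections.Generic;
    using System.Threading.Tasks;
    using NHibernate;   // ISessionFactory / ISession

    static class ParallelImport
    {
        public static void SaveAll(IEnumerable<string> userFiles, ISessionFactory sessionFactory)
        {
            Parallel.ForEach(userFiles, file =>
            {
                var user = ParseUser(file);                        // hypothetical deserialization step
                using (var session = sessionFactory.OpenSession()) // one short-lived session per item
                using (var tx = session.BeginTransaction())
                {
                    session.SaveOrUpdate(user);
                    tx.Commit();   // flushes, then the session and its cache are released
                }
            });
        }

        static object ParseUser(string file) { return new object(); } // placeholder only
    }

    Batching several items per session would cut round trips further, but even one session per item avoids unbounded cache growth and cross-thread session use.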

    Read the article

  • When implementing a Microsoft.Build.Utilities.Task, how do I get access to the various environmental variables of the build?

    - by Simon
    When implementing a Microsoft.Build.Utilities.Task, how do I get access to the various environmental variables of the build? For example, "TargetPath". I know I can pass it in as part of the task XML: <MyTask TargetPath="$(TargetPath)" /> But I don't want to force the consumer of the task to have to do that if I can access the variable in code. http://msdn.microsoft.com/en-us/library/microsoft.build.utilities.task.aspx
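
    For reference, the usual plumbing is still to surface the value as a task property and let MSBuild inject it, then hide the repetition from consumers by wrapping the invocation once in a shared .targets file they import. A minimal sketch follows, assuming the task only needs TargetPath; the Task base class, [Required] attribute and Log helper are standard Microsoft.Build types, while the task name itself is illustrative.

    using Microsoft.Build.Framework;
    using Microsoft.Build.Utilities;

    public class MyTask : Task
    {
        // MSBuild sets this from <MyTask TargetPath="$(TargetPath)" /> in the wrapping .targets file.
        [Required]
        public string TargetPath { get; set; }

        public override bool Execute()
        {
            Log.LogMessage(MessageImportance.Normal, "TargetPath is {0}", TargetPath);
            return true;
        }
    }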

    Read the article

  • Using Parallel Extensions In Web Applications

    - by Greg
    I'd like to hear some opinions as to what role, if any, parallel computing approaches, including the potential use of the Parallel Extensions (the June CTP, for example), have in web applications. What scenarios does this approach fit, and which does it not fit? My understanding of how exactly IIS and web browsers thread tasks is fairly limited, and I would appreciate some insight from someone who understands it well. I'm mostly curious to know whether the way IIS and web browsers work limits the ROI of creating threaded and/or asynchronous tasks in web applications in general. Thanks in advance.

    Read the article

  • Mock assertions on objects inside Parallel.ForEach?

    - by jacko
    Any idea how we can assert a mock object was called when it is being accessed inside Parallel.ForEach via a closure? I assume that because each invocation is on a different thread, Rhino Mocks loses track of the object? Pseudocode:
    var someStub = MockRepository.GenerateStub()
    Parallel.ForEach(collectionOfInts, anInt => someStub.DoSomething(anInt))
    someStub.AssertWasCalled(s => s.DoSomething, Repeat.Five.Times)
    This test returns an expectation violation: it expects the stub to be called 5 times, but it was actually called 0 times. Any ideas how we can tell the lambdas to keep track of the thread-local stub object?
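
    If the mocking framework's call recording turns out not to be thread-safe under Parallel.ForEach, one low-tech cross-check is a hand-rolled fake that counts calls with Interlocked and is asserted only after the loop has joined. A minimal sketch under that assumption; IDoer and the five-element collection are invented for illustration and are not the poster's types.

    using System;
    using System.Linq;
    using System.Threading;
    using System.Threading.Tasks;

    public interface IDoer { void DoSomething(int value); }   // hypothetical interface under test

    public class CountingDoer : IDoer
    {
        private int calls;
        public int Calls { get { return calls; } }
        public void DoSomething(int value) { Interlocked.Increment(ref calls); } // thread-safe count
    }

    class Demo
    {
        static void Main()
        {
            var fake = new CountingDoer();
            var collectionOfInts = Enumerable.Range(1, 5).ToArray();

            Parallel.ForEach(collectionOfInts, anInt => fake.DoSomething(anInt));

            // Parallel.ForEach blocks until all iterations finish, so the count is stable here.
            Console.WriteLine(fake.Calls == 5 ? "called 5 times" : "unexpected count");
        }
    }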

    Read the article

  • Parallel.For: maintain input list order in the output list

    - by romeozor
    I'd like some input on keeping the order of a list during heavy-duty operations that I decided to try to do in a parallel manner to see if it boosts performance. (It did!) I came up with a solution, but since this was my first attempt at anything parallel, I'd like someone to slap my hands if I did something very stupid. There's a query that returns a list of card owners, sorted by name, then by date of birth. This needs to be rendered in a table on a web page (ASP.NET WebForms). The original coder decided he would construct the table cell by cell (TableCell), add the cells to rows (TableRow), then add each row to the table. So no GridView, allegedly because its performance is bad, but performance was very poor regardless :). The database query returns in no time; most of the time is spent looping through the results and adding table cells, etc. I made the following method to maintain the original order of the list:

    private TableRow[] ComposeRows(List<CardHolder> queryResult)
    {
        int queryElementsCount = queryResult.Count();
        // array with the query's size
        var rowArray = new TableRow[queryElementsCount];

        Parallel.For(0, queryElementsCount, i =>
        {
            var row = new TableRow();
            var cell = new TableCell();

            // various operations, including simple ones such as:
            cell.Text = queryResult[i].Name;
            row.Cells.Add(cell);

            // here I'm adding the current item at its original index
            // to maintain order in the output list
            rowArray[i] = row;
        });

        return rowArray;
    }

    So as you can see, because I'm returning a very different type of data (List<CardHolder> -> TableRow[]), I can't simply omit the ordering from the original query and do it after the operations. I also thought it would be a good idea to Dispose() the objects at the end of each loop, because the query can return a huge list and letting cell and row objects pile up on the heap could impact performance. (?) How badly did I do? Does anyone have a better solution in case mine is flawed?
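
    Writing each result into rowArray[i] by index, as the method above does, does preserve the source order. An alternative with the same guarantee is PLINQ's AsOrdered; here is a minimal sketch, assuming CardHolder exposes a Name property as in the question (the stub class below is only a stand-in for the real entity).

    using System.Collections.Generic;
    using System.Linq;
    using System.Web.UI.WebControls;   // TableRow / TableCell

    public class CardHolder { public string Name { get; set; } }   // stand-in for the poster's entity

    public static class OrderedCompose
    {
        public static TableRow[] ComposeRowsPlinq(List<CardHolder> queryResult)
        {
            return queryResult
                .AsParallel()
                .AsOrdered()              // PLINQ keeps results in source order
                .Select(holder =>
                {
                    var row = new TableRow();
                    row.Cells.Add(new TableCell { Text = holder.Name });
                    return row;
                })
                .ToArray();
        }
    }

    The Select body mirrors the cell/row construction from the question; the ordering guarantee comes from AsOrdered, so no explicit index bookkeeping is needed.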

    Read the article

  • Parallel.For(): Update variable outside of loop

    - by TSS
    I'm just looking into the new .NET 4.0 features. With that, I'm attempting a simple calculation using Parallel.For and a normal for(x;x;x) loop. However, I'm getting different results about 50% of the time.

    long sum = 0;
    Parallel.For(1, 10000, y =>
    {
        sum += y;
    });
    Console.WriteLine(sum.ToString());

    sum = 0;
    for (int y = 1; y < 10000; y++)
    {
        sum += y;
    }
    Console.WriteLine(sum.ToString());

    My guess is that the threads are trying to update "sum" at the same time. Is there an obvious way around it?
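
    The guess is right: sum += y is a read-modify-write on shared state, so concurrent iterations can overwrite each other's updates. A minimal sketch of one fix, making each update atomic with Interlocked.Add:

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    class Program
    {
        static void Main()
        {
            long sum = 0;
            Parallel.For(1, 10000, y =>
            {
                Interlocked.Add(ref sum, y);   // atomic update instead of the racy sum += y
            });
            Console.WriteLine(sum);            // 49995000 on every run
        }
    }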

    Read the article

  • Parallel.For(): Different results for simple addition

    - by TSS
    I'm just looking into the new .NET 4.0 features. With that, I'm attempting a simple calculation using Parallel.For and a normal for(x;x;x) loop. However, I'm getting different results about 50% of the time (with no code change).

    long sum = 0;
    Parallel.For(1, 10000, y =>
    {
        sum += y;
    });
    Console.WriteLine(sum.ToString());

    sum = 0;
    for (int y = 1; y < 10000; y++)
    {
        sum += y;
    }
    Console.WriteLine(sum.ToString());

    Surely I'm missing something simple or don't completely understand the concept. My guess is that the threads are trying to update "sum" at the same time. If so, is there an obvious way around it?
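
    Besides an atomic add, Parallel.For has an overload that gives each worker its own running subtotal and synchronizes only once per worker when the subtotals are combined; a minimal sketch of that approach:

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    class Program
    {
        static void Main()
        {
            long sum = 0;
            Parallel.For(
                1, 10000,
                () => 0L,                                        // per-worker initial subtotal
                (y, loopState, subtotal) => subtotal + y,        // no shared state touched here
                subtotal => Interlocked.Add(ref sum, subtotal)); // combine once per worker
            Console.WriteLine(sum);                              // 49995000 every time
        }
    }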

    Read the article

  • Windows Media Player 12 Library import keeps dying

    - by duckworth
    I cannot get WMP 12 to import my library. I have searched around various forums and tried all the common solutions, like disabling media sharing, deleting my %LOCALAPPDATA%\Microsoft\Media Player directory and reimporting, etc., but nothing works. I have even removed the Media features from Windows setup and re-added them. I have a large mp3 collection shared on the network from another Windows box. I add the folder (tried as a mapped drive and as a UNC path) and it begins importing. About 30 minutes into the import (when CurrentDatabase_372.wmdb hits just under 400 MB), WMP stops importing, all of the icons in WMP turn to red x's, and my library is gone. I close and reopen WMP 12, the library is empty, CurrentDatabase_372.wmdb is small again, and it starts importing again. Rinse, lather, repeat. I am going nuts, as WMP 11 on Vista handles this same setup perfectly. I am at my wits' end on what else to try. I am running a legit Windows 7 Ultimate x64 RTM install. Here is a screenshot of what WMP 12 looks like when the import dies: Any other ideas? Edit: OK, I just confirmed this is definitely a problem not specific to my computer or configuration. I just did a clean installation of Windows 7 Ultimate x86 on an old test machine, opened WMP 12, added the same network folder of mp3s, and it crashed about an hour into the import with the same appearance as the screenshot I posted above, and the library disappears. So the problem has to be one of several things: the large size of the library, the fact that the library is on the network, or a specific file or files causing the player to crash.

    Read the article
