Search Results

Search found 10161 results on 407 pages for 'task flow'.


  • Syntax of passing lambda

    - by Astara
    Right now, I'm working on refactoring a program that calls its parts by polling into a more event-driven structure. I've created sched and task classes, with sched to become a base class of the current main loop. The tasks will be created for each meter so they can be called off of that instead of being polled. Each of the events main calls is a type of meter that gathers info and displays it. When the program is coming up, all enabled meters get 'constructed' by a main sub. In that sub, I want to store off the "this" pointer associated with the meter, as well as the common name for the action routine:

        void MeterMaker::Meter_n_Task (Meter * newmeter,) {
            push(newmeter); // handle non-timed draw events
            Task t = new Task(now() + 0.5L);
            t.period = {0, 1U};
            t.work_meter = newmeter;
            t.work = [&newmeter](){newmeter.checkevent();};  // <<-- attempt at lambda
            t.flags = T_Repeat;
            t.enable_task();
            _xos->sched_insert(t);
        }

    A sample call to it: Meter_n_Task(new CPUMeter(_xos, "CPU ")); I've made the scheduler a base class of the main routine (that handles the loop), and I've tried several variations to get the task class to be a base of the meter class, but I keep running into roadblocks. It's a lot like "whack-a-mole" -- pound in something to fix one place, and a new problem pops out elsewhere. Part of the problem is that the sched.h file that is trying to hold the Task Q includes the Task header file. The task file wants to refer to the most "base" Meter class. The meter class pulls in the main class of the parent, as it passes a copy of the parent to the children so they can access the draw routines in the parent. Two references in the task file are for the 'this' pointer of the meter and the meter's update sub (to be called via this): void *this_data = NULL; void (*this_func)() = NULL; Note -- I didn't really want to store these in the class, as I wanted to use a lambda in that meter-and-task routine above to store a routine+context that would be used to call the meter's action routine. I couldn't figure out the syntax. But I am running into other syntax problems trying to store the pointers, such as:

        g++: COMPILE lsched.cc
        In file included from meter.h:13:0, from ltask.h:17, from lsched.h:13, from lsched.cc:13:
        xosview.h:30:47: error: expected class-name before ‘{’ token
        class XOSView : public XWin, public Scheduler {

    Like above, where it asks for a class name right where the class name "Scheduler" is. Huh? That IS a class name. I keep going in circles with things that don't make sense. Ideally I'd get the lambda to work right in the Meter_n_Task routine at the top. I wanted to store only one pointer in the 'Task' class: a pointer to my lambda that would have already captured the "this" value, but I couldn't get that syntax to work at all when I tried to store it into a var in the 'Task' class. This project, FWIW, is my teething project on the new C++ (of course it's simple! ;-)). I've made quite a bit of progress in other areas of the code, but this lambda syntax has me stumped; it's at times like these that I appreciate the ease of this type of operation in Perl. Sigh. Not sure of the best way to ask for help here, as this isn't a simple question, but I thought I'd try! ;-) Too bad I can't attach files to this question.

    Read the article

  • TFS API Change WorkItem CreatedDate And ChangedDate To Historic Dates

    - by Tarun Arora
    There may be times when you need to modify the value of the fields “System.CreatedDate” and “System.ChangedDate” on a work item. Richard Hundhausen has a great blog post with ample reasons why you may (or may not) need to set the values of these fields to historic dates. In this blog post I’ll show you how to:
    - Create a PBI work item linked to a Task work item by pre-setting the value of the field ‘System.ChangedDate’ to a historic date
    - Change the value of the field ‘System.CreatedDate’ to a historic date
    - Simulate the historic burn down of a task type work item in a sprint
    - Explain the impact of updating the values of the fields CreatedDate and ChangedDate on the Sprint burn down chart
    Rules of Play
    1. You need to be a member of the Project Collection Service Accounts group.
    2. You need to use ‘WorkItemStoreFlags.BypassRules’ when you instantiate the WorkItemStore service:
        // Instantiate Work Item Store with the BypassRules flag
        _wis = new WorkItemStore(_tfs, WorkItemStoreFlags.BypassRules);
    3. You cannot set the ChangedDate to a value less than the changed date of the previous revision, or greater than the current date.
    Walkthrough
    The walkthrough contains 5 parts:
    00 – Required References
    01 – Connect to TFS Programmatically
    02 – Create a Work Item Programmatically
    03 – Set the values of the fields ‘System.ChangedDate’ and ‘System.CreatedDate’ to historic dates
    04 – Results of our experiment
    Let’s get started.
    00 – Required References
    Microsoft.TeamFoundation.dll
    Microsoft.TeamFoundation.Client.dll
    Microsoft.TeamFoundation.Common.dll
    Microsoft.TeamFoundation.WorkItemTracking.Client.dll
    01 – Connect to TFS Programmatically
    I have an in-depth blog post on how to connect to TFS programmatically, in case you are interested. However, the code snippet below will enable you to connect to TFS using the Team Project Picker.

        // Services I need access to globally
        private static TfsTeamProjectCollection _tfs;
        private static ProjectInfo _selectedTeamProject;
        private static WorkItemStore _wis;

        // Connect to TFS Using Team Project Picker
        public static bool ConnectToTfs()
        {
            var isSelected = false;
            // The user is allowed to select only one project
            var tfsPp = new TeamProjectPicker(TeamProjectPickerMode.SingleProject, false);
            tfsPp.ShowDialog();
            // The TFS project collection
            _tfs = tfsPp.SelectedTeamProjectCollection;
            if (tfsPp.SelectedProjects.Any())
            {
                // The selected Team Project
                _selectedTeamProject = tfsPp.SelectedProjects[0];
                isSelected = true;
            }
            return isSelected;
        }

    02 – Create a Work Item Programmatically
    In the code snippet below I create a Product Backlog Item and a Task type work item and then link them together as parent and child. Note – you will have to set the ChangedDate to a historic date when you create the work item. Remember, if you try to set the ChangedDate to a value earlier than its last assigned value, you will receive the following exception: TF26212: Team Foundation Server could not save your changes. There may be problems with the work item type definition. Try again or contact your Team Foundation Server administrator. If you notice below, I have added a few seconds each time I modified the ‘ChangedDate’, just to avoid running into the exception listed above.
// Create Linked Work Items and return Ids private static List<int> CreateWorkItemsProgrammatically() { // Instantiate Work Item Store with the ByPassRules flag _wis = new WorkItemStore(_tfs, WorkItemStoreFlags.BypassRules); // List of work items to return var listOfWorkItems = new List<int>(); // Create a new Product Backlog Item var p = new WorkItem(_wis.Projects[_selectedTeamProject.Name].WorkItemTypes["Product Backlog Item"]); p.Title = "This is a new PBI"; p.Description = "Description"; p.IterationPath = string.Format("{0}\\Release 1\\Sprint 1", _selectedTeamProject.Name); p.AreaPath = _selectedTeamProject.Name; p["Effort"] = 10; // Just double checking that ByPassRules is set to true if (_wis.BypassRules) { p.Fields["System.ChangedDate"].Value = Convert.ToDateTime("2012-01-01"); } if (p.Validate().Count == 0) { p.Save(); listOfWorkItems.Add(p.Id); } else { Console.WriteLine(">> Following exception(s) encountered during work item save: "); foreach (var e in p.Validate()) { Console.WriteLine(" - '{0}' ", e); } } var t = new WorkItem(_wis.Projects[_selectedTeamProject.Name].WorkItemTypes["Task"]); t.Title = "This is a task"; t.Description = "Task Description"; t.IterationPath = string.Format("{0}\\Release 1\\Sprint 1", _selectedTeamProject.Name); t.AreaPath = _selectedTeamProject.Name; t["Remaining Work"] = 10; if (_wis.BypassRules) { t.Fields["System.ChangedDate"].Value = Convert.ToDateTime("2012-01-01"); } if (t.Validate().Count == 0) { t.Save(); listOfWorkItems.Add(t.Id); } else { Console.WriteLine(">> Following exception(s) encountered during work item save: "); foreach (var e in t.Validate()) { Console.WriteLine(" - '{0}' ", e); } } var linkTypEnd = _wis.WorkItemLinkTypes.LinkTypeEnds["Child"]; p.Links.Add(new WorkItemLink(linkTypEnd, t.Id) {ChangedDate = Convert.ToDateTime("2012-01-01").AddSeconds(20)}); if (_wis.BypassRules) { p.Fields["System.ChangedDate"].Value = Convert.ToDateTime("2012-01-01").AddSeconds(20); } if (p.Validate().Count == 0) { p.Save(); } else { Console.WriteLine(">> Following exception(s) encountered during work item save: "); foreach (var e in p.Validate()) { Console.WriteLine(" - '{0}' ", e); } } return listOfWorkItems; } 03 – Set the value of “Created Date” and Change the value of “Changed Date” to Historic Dates The CreatedDate can only be changed after a work item has been created. If you try and set the CreatedDate to a historic date at the time of creation of a work item, it will not work. 
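    To make the required ordering concrete, here is a minimal recap sketch condensed from the larger snippets in this walkthrough. It assumes the same _wis and _selectedTeamProject fields shown earlier, with _wis instantiated using WorkItemStoreFlags.BypassRules; the title value is just a placeholder. The point is simply: pre-set the ChangedDate at creation time, save, and only then reset the CreatedDate.

        // Condensed recap sketch (assumes _wis was created with WorkItemStoreFlags.BypassRules)
        var wi = new WorkItem(_wis.Projects[_selectedTeamProject.Name].WorkItemTypes["Task"]);
        wi.Title = "Historic task";  // placeholder title
        // ChangedDate may be pre-set to a historic date at creation time (BypassRules only)
        wi.Fields["System.ChangedDate"].Value = Convert.ToDateTime("2012-01-01");
        wi.Save();
        // CreatedDate can only be reset once the work item exists; advance ChangedDate slightly
        // so it is not earlier than the previous revision's value
        wi.Fields["System.ChangedDate"].Value = Convert.ToDateTime("2012-01-01").AddSeconds(10);
        wi.Fields["System.CreatedDate"].Value = Convert.ToDateTime("2012-01-01").AddSeconds(10);
        wi.Save();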
// Lets do a work item effort burn down simulation by updating the ChangedDate & CreatedDate to historic Values private static void WorkItemChangeSimulation(IEnumerable<int> listOfWorkItems) { foreach (var id in listOfWorkItems) { var wi = _wis.GetWorkItem(id); switch (wi.Type.Name) { case "ProductBacklogItem": if (wi.State.ToLower() == "new") wi.State = "Approved"; // Advance the changed date by few seconds wi.Fields["System.ChangedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(10); // Set the CreatedDate to Changed Date wi.Fields["System.CreatedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(10); wi.Save(); break; case "Task": // Advance the changed date by few seconds wi.Fields["System.ChangedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(10); // Set the CreatedDate to Changed date wi.Fields["System.CreatedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(10); wi.Save(); break; } } // A mock sprint start date var sprintStart = DateTime.Today.AddDays(-5); // A mock sprint end date var sprintEnd = DateTime.Today.AddDays(5); // What is the total Sprint duration var totalSprintDuration = (sprintEnd - sprintStart).Days; // How much of the sprint have we already covered var noOfDaysIntoSprint = (DateTime.Today - sprintStart).Days; // Get the effort assigned to our tasks var totalEffortRemaining = QueryTaskTotalEfforRemaining(listOfWorkItems); // Defining how much effort to burn every day decimal dailyBurnRate = totalEffortRemaining / totalSprintDuration < 1 ? 1 : totalEffortRemaining / totalSprintDuration; // we have just created one task var totalNoOfTasks = 1; var simulation = sprintStart; var currentDate = DateTime.Today.Date; // Carry on till effort has been burned down from sprint start to today while (simulation.Date != currentDate.Date) { var dailyBurnRate1 = dailyBurnRate; // A fixed amount needs to be burned down each day while (dailyBurnRate1 > 0) { // burn down bit by bit from all unfinished task type work items foreach (var id in listOfWorkItems) { var wi = _wis.GetWorkItem(id); var isDirty = false; // Set the status to in progress if (wi.State.ToLower() == "to do") { wi.State = "In Progress"; isDirty = true; } // Ensure that there is enough effort remaining in tasks to burn down the daily burn rate if (QueryTaskTotalEfforRemaining(listOfWorkItems) > dailyBurnRate1) { // If there is less than 1 unit of effort left in the task, burn it all if (Convert.ToDecimal(wi["Remaining Work"]) <= 1) { wi["Remaining Work"] = 0; dailyBurnRate1 = dailyBurnRate1 - Convert.ToDecimal(wi["Remaining Work"]); isDirty = true; } else { // How much to burn from each task? var toBurn = (dailyBurnRate / totalNoOfTasks) < 1 ? 
1 : (dailyBurnRate / totalNoOfTasks); // Check that the task has enough effort to allow burnForTask effort if (Convert.ToDecimal(wi["Remaining Work"]) >= toBurn) { wi["Remaining Work"] = Convert.ToDecimal(wi["Remaining Work"]) - toBurn; dailyBurnRate1 = dailyBurnRate1 - toBurn; isDirty = true; } else { wi["Remaining Work"] = 0; dailyBurnRate1 = dailyBurnRate1 - Convert.ToDecimal(wi["Remaining Work"]); isDirty = true; } } } else { dailyBurnRate1 = 0; } if (isDirty) { if (Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).Date == simulation.Date) { wi.Fields["System.ChangedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(20); } else { wi.Fields["System.ChangedDate"].Value = simulation.AddSeconds(20); } wi.Save(); } } } // Increase date by 1 to perform daily burn down by day simulation = Convert.ToDateTime(simulation).AddDays(1); } } // Get the Total effort remaining in the current sprint private static decimal QueryTaskTotalEfforRemaining(List<int> listOfWorkItems) { var unfinishedWorkInCurrentSprint = _wis.GetQueryDefinition( new Guid(QueryAndGuid.FirstOrDefault(c => c.Key == "Unfinished Work").Value)); var parameters = new Dictionary<string, object> { { "project", _selectedTeamProject.Name } }; var q = new Query(_wis, unfinishedWorkInCurrentSprint.QueryText, parameters); var results = q.RunLinkQuery(); var wis = new List<WorkItem>(); foreach (var result in results) { var _wi = _wis.GetWorkItem(result.TargetId); if (_wi.Type.Name == "Task" && listOfWorkItems.Contains(_wi.Id)) wis.Add(_wi); } return wis.Sum(r => Convert.ToDecimal(r["Remaining Work"])); }   04 – The Results If you are still reading, the results are beautiful! Image 1 – Create work item with Changed Date pre-set to historic date Image 2 – Set the CreatedDate to historic date (Same as the ChangedDate) Image 3 – Simulate of effort burn down on a task via the TFS API   Image 4 – The history of changes on the Task. So, essentially this task has burned 1 hour per day Sprint Burn Down Chart – What’s not possible? The Sprint burn down chart is calculated from the System.AuthorizedDate and not the System.ChangedDate/System.CreatedDate. So, though you can change the System.ChangedDate and System.CreatedDate to historic dates you will not be able to synthesize the sprint burn down chart. Image 1 – By changing the Created Date and Changed Date to ‘18/Oct/2012’ you would have expected the burn down to have been impacted, but it won’t be, because the sprint burn down chart uses the value of field ‘System.AuthorizedDate’ to calculate the unfinished work points. The AsOf queries that are used to calculate the unfinished work points use the value of the field ‘System.AuthorizedDate’. Image 2 – Using the above code I burned down 1 hour effort per day over 5 days from the task work item, I would have expected the sprint burn down to show a constant burn down, instead the burn down shows the effort exhausted on the 24th itself. Simply because the burn down is calculated using the ‘System.AuthorizedDate’. Now you would ask… “Can I change the value of the field System.AuthorizedDate to a historic date” Unfortunately that’s not possible! 
    You will run into the exception ValidationException – “TF26194: The value for field ‘Authorized Date’ cannot be changed.”
    Conclusion
    - You need to be a member of the Project Collection Service Accounts group in order to set the fields ‘System.ChangedDate’ and ‘System.CreatedDate’ to historic dates.
    - You need to instantiate the WorkItemStore using the flag WorkItemStoreFlags.BypassRules.
    - The System.ChangedDate needs to be set to a historic date at the time of work item creation. You cannot reset the ChangedDate to a date earlier than the existing ChangedDate, and you cannot reset the ChangedDate to a date greater than the current date time.
    - The System.CreatedDate can only be reset after a work item has been created. You cannot set the CreatedDate at the time of work item creation. The CreatedDate cannot be greater than the current date. You can, however, reset the CreatedDate to a date earlier than the existing value.
    - You will not be able to synthesize the Sprint burn down chart by changing the values of System.ChangedDate and System.CreatedDate to historic dates, since the burn down chart uses AsOf queries to calculate the unfinished work points, and these internally use the System.AuthorizedDate and NOT the System.ChangedDate & System.CreatedDate.
    - System.AuthorizedDate cannot be set to a historic date using the TFS API.
    Read other posts on using the TFS API here… Enjoy!

    Read the article

  • rake migration aborted: could not find table 'roles'

    - by user464180
    I just inherited code that I'm attempting to run the migrations for but I keep getting a rake aborted error. I've come across others that have what appears to be similar issues, but most involved Heroku and I'm trying to run this locally (to start.) I've tried troubleshooting using both PostgreSQL and SQLite, and both produce the same issue. The table "roles" referenced is the second migration called, so I'm having a hard time figuring out what is causing it to not get built. Any and all assistance is greatly appreciated. Thanks in advance. Here's the roles migration: class CreateRoles < ActiveRecord::Migration def change create_table :roles do |t| t.string :name t.timestamps end end end Here is the trace for SQLite: ** Invoke db:migrate (first_time) ** Invoke environment (first_time) ** Execute environment rake aborted! Could not find table 'roles' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/sqlite_adapter.rb:470:in `table_structure' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/sqlite_adapter.rb:351:in `columns' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/schema_cache.rb:12:in `block in initialize' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:228:in `yield' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:228:in `default' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:228:in `columns' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:248:in `column_names' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:261:in `column_methods_hash' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/dynamic_matchers.rb:69:in `all_attributes_exists?' 
/Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/dynamic_matchers.rb:27:in `method_missing' /Users/sa/Documents/AptanaWorkspace/recprototype/config/initializ ers/constants.rb:1:in `<top (required)>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:245:in `load' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:245:in `block in load' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:236:in `load_dependency' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:245:in `load' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/engi ne.rb:588:in `block (2 levels) in <class:Engine>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/engi ne.rb:587:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/engi ne.rb:587:in `block in <class:Engine>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:30:in `instance_exec' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:30:in `run' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:55:in `block in run_initializers' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:54:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:54:in `run_initializers' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/appl ication.rb:136:in `initialize!' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/rail tie/configurable.rb:30:in `method_missing' /Users/sa/Documents/AptanaWorkspace/recprototype/config/environme nt.rb:5:in `<top (required)>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:251:in `require' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:251:in `block in require' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:236:in `load_dependency' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:251:in `require' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/appl ication.rb:103:in `require_environment!' 
/Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/appl ication.rb:292:in `block (2 levels) in initialize_tasks' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :205:in `call' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :205:in `block in execute' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :200:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :200:in `execute' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :158:in `block in invoke_with_call_chain' /Users/sa/.rvm/rubies/ruby-1.9.2-p318/lib/ruby/1.9.1/monitor.rb:201:in `mon_synchronize' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :151:in `invoke_with_call_chain' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :176:in `block in invoke_prerequisites' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :174:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :174:in `invoke_prerequisites' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :157:in `block in invoke_with_call_chain' /Users/sa/.rvm/rubies/ruby-1.9.2-p318/lib/ruby/1.9.1/monitor.rb:201:in `mon_synchronize' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :151:in `invoke_with_call_chain' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :144:in `invoke' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:116:in `invoke_task' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:94:in `block (2 levels) in top_level' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:94:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:94:in `block in top_level' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:133:in `standard_exception_handling' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:88:in `top_level' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:66:in `block in run' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:133:in `standard_exception_handling' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:63:in `run' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/bin/rake:33:in ` <top (required)>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/bin/rake:19:in `load' /Users/sa/.rvm/gems/ruby-1.9.2-p318/bin/rake:19:in `<main>' Tasks: TOP => db:migrate => environment Here is the trace for PostgreSQL: ** Invoke db:migrate (first_time) ** Invoke environment (first_time) ** Execute environment rake aborted! 
PG::Error: ERROR: relation "roles" does not exist LINE 4: WHERE a.attrelid = '"roles"'::regclass ^ : SELECT a.attname, format_type(a.atttypid, a.atttypmod), d.adsrc, a .attnotnull FROM pg_attribute a LEFT JOIN pg_attrdef d ON a.attrelid = d.adrelid AND a.attnum = d.adnum WHERE a.attrelid = '"roles"'::regclass AND a.attnum > 0 AND NOT a.attisdropped ORDER BY a.attnum /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/postgresql_adapter.rb:1106:in `async_exec' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/postgresql_adapter.rb:1106:in `exec_no_cache' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/postgresql_adapter.rb:650:in `block in exec_query' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/abstract_adapter.rb:280:in `block in log' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/notifications/instrumenter.rb:20:in `instrument' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/abstract_adapter.rb:275:in `log' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/postgresql_adapter.rb:649:in `exec_query' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/postgresql_adapter.rb:1231:in `column_definitions' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/postgresql_adapter.rb:845:in `columns' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/connection_adapters/schema_cache.rb:12:in `block in initialize' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:228:in `yield' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:228:in `default' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:228:in `columns' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:248:in `column_names' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/model_schema.rb:261:in `column_methods_hash' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/dynamic_matchers.rb:69:in `all_attributes_exists?' 
/Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activerecord-3.2.1/lib/active _record/dynamic_matchers.rb:27:in `method_missing' /Users/sa/Documents/AptanaWorkspace/recprototype/config/initializ ers/constants.rb:1:in `<top (required)>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:245:in `load' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:245:in `block in load' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:236:in `load_dependency' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:245:in `load' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/engi ne.rb:588:in `block (2 levels) in <class:Engine>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/engi ne.rb:587:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/engi ne.rb:587:in `block in <class:Engine>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:30:in `instance_exec' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:30:in `run' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:55:in `block in run_initializers' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:54:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/init ializable.rb:54:in `run_initializers' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/appl ication.rb:136:in `initialize!' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/rail tie/configurable.rb:30:in `method_missing' /Users/sa/Documents/AptanaWorkspace/recprototype/config/environme nt.rb:5:in `<top (required)>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:251:in `require' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:251:in `block in require' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:236:in `load_dependency' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/activesupport-3.2.1/lib/activ e_support/dependencies.rb:251:in `require' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/appl ication.rb:103:in `require_environment!' 
/Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/railties-3.2.1/lib/rails/appl ication.rb:292:in `block (2 levels) in initialize_tasks' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :205:in `call' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :205:in `block in execute' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :200:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :200:in `execute' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :158:in `block in invoke_with_call_chain' /Users/sa/.rvm/rubies/ruby-1.9.2-p318/lib/ruby/1.9.1/monitor.rb:201:in `mon_synchronize' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :151:in `invoke_with_call_chain' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :176:in `block in invoke_prerequisites' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :174:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :174:in `invoke_prerequisites' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :157:in `block in invoke_with_call_chain' /Users/sa/.rvm/rubies/ruby-1.9.2-p318/lib/ruby/1.9.1/monitor.rb:201:in `mon_synchronize' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :151:in `invoke_with_call_chain' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/task.rb :144:in `invoke' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:116:in `invoke_task' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:94:in `block (2 levels) in top_level' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:94:in `each' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:94:in `block in top_level' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:133:in `standard_exception_handling' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:88:in `top_level' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:66:in `block in run' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:133:in `standard_exception_handling' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/lib/rake/applica tion.rb:63:in `run' /Users/sa/.rvm/gems/ruby-1.9.2-p318/gems/rake-0.9.2.2/bin/rake:33:in ` <top (required)>' /Users/sa/.rvm/gems/ruby-1.9.2-p318/bin/rake:19:in `load' /Users/sa/.rvm/gems/ruby-1.9.2-p318/bin/rake:19:in `<main>' Tasks: TOP => db:migrate => environment

    Read the article

  • Ant MXMLC task with arbitrary list of source/lib paths?

    - by sascha
    Does anyone know of a way to use the mxmlc task of the Flex Ant tasks with a user-definable list of source path or library paths? The idea is that the user can define an arbitrary list of source paths and/or library (swc) paths into an Ant properties file and the build file takes these values and evaluates them for use in the mxmlc task. Just wondering if there are any tricks (maybe utilizing filtering/string replacing) to get this working?

    Read the article

  • Non Document Centric SharePoint Workflow

    - by Dan Revell
    SharePoint workflows are document centric in that the item the workflow runs on has to be a concrete thing, be it a document or just a list item. The workflow itself is task based, so stuff a user has to do. Now I can put any sort of code in these tasks that I want to, and even put complex InfoPath forms in for the user to perform the task. This has been fine on all my previous workflows. But what if I want the tasks to be actual official forms themselves? The item that the workflow runs on is just some abstract concept like an event. An example could be that an accident has happened. There isn't an accident form, but a whole set of forms that need to be completed by different people. Task forms aren't really a nice way to go, because they lock all the forms into the task list. You can only access the forms by not deleting the tasks when complete and going to the workflow summary and following the task links to the InfoPath forms, or by going straight to the tasks list and filtering on particular "accidents". These are official documents, so ideally there would be a library for each type of document and the workflow would orchestrate the completion of the right forms. It would mean each task would have to create a new blank form and then link the user to that form. The user would go complete the form but then have to go back to the task form and click "yes, I've completed it" before the workflow could progress -- that is, short of having the workflow monitor the forms library for some completion trigger. But then it all gets messy with the user experience: from clicking the link in the task email, to opening the InfoPath task form, to clicking the link in the subsequent InfoPath library form, and then returning through these forms on completion. It just gets messy trying to retrofit this non-document-centric sort of workflow into SharePoint. I would really appreciate any input on what might be the best way to do this:
    - Store the forms as task forms.
    - Store the forms as library forms and create/link them from the task forms.
    - Store the forms as different InfoPath views and use a forms library; the workflow would set variables that progress the view the InfoPath form shows.
    - Use the same form template for both task forms and a forms library, and when a task form is complete, copy the XML into the forms library to have an official record outside of the workflow.
    Thanks

    Read the article

  • Troubles with list "dropdowns" and which list item gets the dropdown

    - by Andrew
    I'm working on a project for an MMO "guild" that gives members of the guild randomly generated tasks for the game. They can "block" three tasks from being assigned. The lists will look something like this: <ul> <li class="blocked">Task that is blocked</li> <li class="blocked-open">Click to block a task</li> <li class="blocked-open">Click to block a task</li> </ul> The blocked-open class means they haven't chosen a task to block yet. The blocked task means they've already blocked a task. When they click the list item, I want this to appear: <ul class="tasks-dropdown no-display"> <li><h1>Click a Task to Block</h1></li> <ul class="task-dropdown-inner"> <?php //output all tasks foreach($tasks as $task) { echo '<li class="blocked-option"><span id="'.$task.'">'.$task.'</span></li>'; } ?> <br class="clear" /> </ul> </ul> I don't quite know how, when the user clicks the .blocked-open line-item, to show that dropdown under only the one they clicked. My jQuery looked like this before I became confused. $("li.blocked-open").click(function() { $("ul.no-display").slideToggle("900"); }); $(".blocked-option span").click(function() { var task = $(this).attr('id'); alert("You have blocked: " + task); location.reload(true); }); I tested it by putting the dropdown under a line item in the code, and it worked fine, but when I have more than one dropdown in the code, clicking on one line item toggles all the dropdowns. I'm not sure what to do. :-p.

    Read the article

  • How to assign Application Icon that will display in Task bar?

    - by viky
    I am working on a WPF desktop application. Whenever I run my application, it shows me a window and an associated tab in the taskbar (normal Windows feature). My problem is that the tab uses Windows' icon for an unknown file type. I tried the Icon property of Window, and the icon gets assigned, but the problem remains: when I run the application, the taskbar tab initially displays the icon for an unknown file type, and only when window load completes does it change to the assigned icon. I want my icon there from the beginning. Any help?

    Read the article

  • asp.net Web server control with child controls, event not firing

    - by bleeeah
    I have a simple web control (TaskList) that can have children (Task) which inherit from LinkButton, that can be added declaratively or programatically. This works ok, but I can't get the onclick event of a Task to be fired in my code behind. The code .. [ToolboxData("<{0}:TaskList runat=\"server\"> </{0}:TaskList>")] [ParseChildren(true)] [PersistChildren(false)] public class TaskList : System.Web.UI.Control { //[DefaultProperty("Text")] public TaskList() {} private List<Task> _taskList = new List<Task>(); private string _taskHeading = ""; public string Heading { get { return this._taskHeading; } set { this._taskHeading = value; } } [NotifyParentProperty(true)] [PersistenceMode(PersistenceMode.InnerProperty)] [DesignerSerializationVisibility(DesignerSerializationVisibility.Content)] public List<Task> Tasks { get { return this._taskList; } set { this._taskList = value; } } protected override void CreateChildControls() { foreach (Task task in this._taskList) this.Controls.Add(task); base.CreateChildControls(); } protected override void Render(HtmlTextWriter writer) { writer.Write("<h2>" + this._taskHeading + "</h2>"); writer.Write("<div class='tasks_container'>"); writer.Write("<div class='tasks_list'>"); writer.Write("<ul>"); foreach (Task task in this._taskList) { writer.Write("<li>"); task.RenderControl(writer); writer.Write("</li>"); } writer.Write("</ul>"); writer.Write("</div>"); writer.Write("</div>"); } } public class Task : LinkButton { private string _key = ""; public string Key { get { return this._key; } set { this._key = value; } } } Markup: <rf:TaskList runat="server" ID="tskList" Heading="Tasks"> <Tasks> <rf:Task Key="ba" ID="L1" Text="Helllo" OnClick="task1_Click" runat="server" /> </Tasks> </rf:TaskList> The Onclick event task1_Click never fires when clicked (although a postback occurs).
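    One common cause of "postback happens but the click event never fires" in composite controls like this is that the child controls are not part of the control tree, under a naming container, early enough in the page life cycle for ASP.NET to route the postback event to them. The sketch below shows the usual pattern for that situation, applied to the class names from the question: derive from CompositeControl (which implements INamingContainer) and force child-control creation in OnInit. This is only an illustrative sketch under that assumption, not a confirmed diagnosis of this exact code; it reuses the question's Task class (the LinkButton subclass) unchanged.

        using System;
        using System.Collections.Generic;
        using System.Web.UI;
        using System.Web.UI.WebControls;

        // Hedged sketch: the usual composite-control event pattern, using the question's class names.
        [ParseChildren(true)]
        [PersistChildren(false)]
        public class TaskList : CompositeControl   // CompositeControl implements INamingContainer
        {
            private readonly List<Task> _taskList = new List<Task>();

            public string Heading { get; set; }

            [PersistenceMode(PersistenceMode.InnerProperty)]
            public List<Task> Tasks { get { return _taskList; } }

            protected override void OnInit(EventArgs e)
            {
                base.OnInit(e);
                EnsureChildControls();   // make sure children exist before postback events are raised
            }

            protected override void CreateChildControls()
            {
                Controls.Clear();
                foreach (Task task in _taskList)
                    Controls.Add(task);  // children must live in the control tree for their events to wire up
                base.CreateChildControls();
            }

            protected override void RenderContents(HtmlTextWriter writer)
            {
                writer.Write("<h2>" + Heading + "</h2><ul>");
                foreach (Task task in _taskList)
                {
                    writer.Write("<li>");
                    task.RenderControl(writer);
                    writer.Write("</li>");
                }
                writer.Write("</ul>");
            }
        }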

    Read the article

  • Scrum in 5 Minutes

    - by Stephen.Walther
    The goal of this blog entry is to explain the basic concepts of Scrum in less than five minutes. You learn how Scrum can help a team of developers to successfully complete a complex software project. Product Backlog and the Product Owner Imagine that you are part of a team which needs to create a new website – for example, an e-commerce website. You have an overwhelming amount of work to do. You need to build (or possibly buy) a shopping cart, install an SSL certificate, create a product catalog, create a Facebook page, and at least a hundred other things that you have not thought of yet. According to Scrum, the first thing you should do is create a list. Place the highest priority items at the top of the list and the lower priority items lower in the list. For example, creating the shopping cart and buying the domain name might be high priority items and creating a Facebook page might be a lower priority item. In Scrum, this list is called the Product Backlog. How do you prioritize the items in the Product Backlog? Different stakeholders in the project might have different priorities. Gary, your division VP, thinks that it is crucial that the e-commerce site has a mobile app. Sally, your direct manager, thinks taking advantage of new HTML5 features is much more important. Multiple people are pulling you in different directions. According to Scrum, it is important that you always designate one person, and only one person, as the Product Owner. The Product Owner is the person who decides what items should be added to the Product Backlog and the priority of the items in the Product Backlog. The Product Owner could be the customer who is paying the bills, the project manager who is responsible for delivering the project, or a customer representative. The critical point is that the Product Owner must always be a single person and that single person has absolute authority over the Product Backlog. Sprints and the Sprint Backlog So now the developer team has a prioritized list of items and they can start work. The team starts implementing the first item in the Backlog — the shopping cart — and the team is making good progress. Unfortunately, however, half-way through the work of implementing the shopping cart, the Product Owner changes his mind. The Product Owner decides that it is much more important to create the product catalog before the shopping cart. With some frustration, the team switches their developmental efforts to focus on implementing the product catalog. However, part way through completing this work, once again the Product Owner changes his mind about the highest priority item. Getting work done when priorities are constantly shifting is frustrating for the developer team and it results in lower productivity. At the same time, however, the Product Owner needs to have absolute authority over the priority of the items which need to get done. Scrum solves this conflict with the concept of Sprints. In Scrum, a developer team works in Sprints. At the beginning of a Sprint the developers and the Product Owner agree on the items from the backlog which they will complete during the Sprint. This subset of items from the Product Backlog becomes the Sprint Backlog. During the Sprint, the Product Owner is not allowed to change the items in the Sprint Backlog. In other words, the Product Owner cannot shift priorities on the developer team during the Sprint. Different teams use Sprints of different lengths such as one month Sprints, two-week Sprints, and one week Sprints. 
For high-stress, time critical projects, teams typically choose shorter sprints such as one week sprints. For more mature projects, longer one month sprints might be more appropriate. A team can pick whatever Sprint length makes sense for them just as long as the team is consistent. You should pick a Sprint length and stick with it. Daily Scrum During a Sprint, the developer team needs to have meetings to coordinate their work on completing the items in the Sprint Backlog. For example, the team needs to discuss who is working on what and whether any blocking issues have been discovered. Developers hate meetings (well, sane developers hate meetings). Meetings take developers away from their work of actually implementing stuff as opposed to talking about implementing stuff. However, a developer team which never has meetings and never coordinates their work also has problems. For example, Fred might get stuck on a programming problem for days and never reach out for help even though Tom (who sits in the cubicle next to him) has already solved the very same problem. Or, both Ted and Fred might have started working on the same item from the Sprint Backlog at the same time. In Scrum, these conflicting needs – limiting meetings but enabling team coordination – are resolved with the idea of the Daily Scrum. The Daily Scrum is a meeting for coordinating the work of the developer team which happens once a day. To keep the meeting short, each developer answers only the following three questions: 1. What have you done since yesterday? 2. What do you plan to do today? 3. Any impediments in your way? During the Daily Scrum, developers are not allowed to talk about issues with their cat, do demos of their latest work, or tell heroic stories of programming problems overcome. The meeting must be kept short — typically about 15 minutes. Issues which come up during the Daily Scrum should be discussed in separate meetings which do not involve the whole developer team. Stories and Tasks Items in the Product or Sprint Backlog – such as building a shopping cart or creating a Facebook page – are often referred to as User Stories or Stories. The Stories are created by the Product Owner and should represent some business need. Unlike the Product Owner, the developer team needs to think about how a Story should be implemented. At the beginning of a Sprint, the developer team takes the Stories from the Sprint Backlog and breaks the stories into tasks. For example, the developer team might take the Create a Shopping Cart story and break it into the following tasks: · Enable users to add and remote items from shopping cart · Persist the shopping cart to database between visits · Redirect user to checkout page when Checkout button is clicked During the Daily Scrum, members of the developer team volunteer to complete the tasks required to implement the next Story in the Sprint Backlog. When a developer talks about what he did yesterday or plans to do tomorrow then the developer should be referring to a task. Stories are owned by the Product Owner and a story is all about business value. In contrast, the tasks are owned by the developer team and a task is all about implementation details. A story might take several days or weeks to complete. A task is something which a developer can complete in less than a day. Some teams get lazy about breaking stories into tasks. 
Neglecting to break stories into tasks can lead to “Never Ending Stories” If you don’t break a story into tasks, then you can’t know how much of a story has actually been completed because you don’t have a clear idea about the implementation steps required to complete the story. Scrumboard During the Daily Scrum, the developer team uses a Scrumboard to coordinate their work. A Scrumboard contains a list of the stories for the current Sprint, the tasks associated with each Story, and the state of each task. The developer team uses the Scrumboard so everyone on the team can see, at a glance, what everyone is working on. As a developer works on a task, the task moves from state to state and the state of the task is updated on the Scrumboard. Common task states are ToDo, In Progress, and Done. Some teams include additional task states such as Needs Review or Needs Testing. Some teams use a physical Scrumboard. In that case, you use index cards to represent the stories and the tasks and you tack the index cards onto a physical board. Using a physical Scrumboard has several disadvantages. A physical Scrumboard does not work well with a distributed team – for example, it is hard to share the same physical Scrumboard between Boston and Seattle. Also, generating reports from a physical Scrumboard is more difficult than generating reports from an online Scrumboard. Estimating Stories and Tasks Stakeholders in a project, the people investing in a project, need to have an idea of how a project is progressing and when the project will be completed. For example, if you are investing in creating an e-commerce site, you need to know when the site can be launched. It is not enough to just say that “the project will be done when it is done” because the stakeholders almost certainly have a limited budget to devote to the project. The people investing in the project cannot determine the business value of the project unless they can have an estimate of how long it will take to complete the project. Developers hate to give estimates. The reason that developers hate to give estimates is that the estimates are almost always completely made up. For example, you really don’t know how long it takes to build a shopping cart until you finish building a shopping cart, and at that point, the estimate is no longer useful. The problem is that writing code is much more like Finding a Cure for Cancer than Building a Brick Wall. Building a brick wall is very straightforward. After you learn how to add one brick to a wall, you understand everything that is involved in adding a brick to a wall. There is no additional research required and no surprises. If, on the other hand, I assembled a team of scientists and asked them to find a cure for cancer, and estimate exactly how long it will take, they would have no idea. The problem is that there are too many unknowns. I don’t know how to cure cancer, I need to do a lot of research here, so I cannot even begin to estimate how long it will take. So developers hate to provide estimates, but the Product Owner and other product stakeholders, have a legitimate need for estimates. Scrum resolves this conflict by using the idea of Story Points. Different teams use different units to represent Story Points. For example, some teams use shirt sizes such as Small, Medium, Large, and X-Large. Some teams prefer to use Coffee Cup sizes such as Tall, Short, and Grande. Finally, some teams like to use numbers from the Fibonacci series. 
These alternative units are converted into a Story Point value. Regardless of the type of unit which you use to represent Story Points, the goal is the same. Instead of attempting to estimate a Story in hours (which is doomed to failure), you use a much less fine-grained measure of work. A developer team is much more likely to be able to estimate that a Story is Small or X-Large than the exact number of hours required to complete the story. So you can think of Story Points as a compromise between the needs of the Product Owner and the developer team. When a Sprint starts, the developer team devotes more time to thinking about the Stories in a Sprint and the developer team breaks the Stories into Tasks. In Scrum, you estimate the work required to complete a Story by using Story Points and you estimate the work required to complete a task by using hours. The difference between Stories and Tasks is that you don’t create a task until you are just about ready to start working on a task. A task is something that you should be able to create within a day, so you have a much better chance of providing an accurate estimate of the work required to complete a task than a story. Burndown Charts In Scrum, you use Burndown charts to represent the remaining work on a project. You use Release Burndown charts to represent the overall remaining work for a project and you use Sprint Burndown charts to represent the overall remaining work for a particular Sprint. You create a Release Burndown chart by calculating the remaining number of uncompleted Story Points for the entire Product Backlog every day. The vertical axis represents Story Points and the horizontal axis represents time. A Sprint Burndown chart is similar to a Release Burndown chart, but it focuses on the remaining work for a particular Sprint. There are two different types of Sprint Burndown charts. You can either represent the remaining work in a Sprint with Story Points or with task hours (the following image, taken from Wikipedia, uses hours). When each Product Backlog Story is completed, the Release Burndown chart slopes down. When each Story or task is completed, the Sprint Burndown chart slopes down. Burndown charts typically do not always slope down over time. As new work is added to the Product Backlog, the Release Burndown chart slopes up. If new tasks are discovered during a Sprint, the Sprint Burndown chart will also slope up. The purpose of a Burndown chart is to give you a way to track team progress over time. If, halfway through a Sprint, the Sprint Burndown chart is still climbing a hill then you know that you are in trouble. Team Velocity Stakeholders in a project always want more work done faster. For example, the Product Owner for the e-commerce site wants the website to launch before tomorrow. Developers tend to be overly optimistic. Rarely do developers acknowledge the physical limitations of reality. So Project stakeholders and the developer team often collude to delude themselves about how much work can be done and how quickly. Too many software projects begin in a state of optimism and end in frustration as deadlines zoom by. In Scrum, this problem is overcome by calculating a number called the Team Velocity. The Team Velocity is a measure of the average number of Story Points which a team has completed in previous Sprints. 
Knowing the Team Velocity is important during the Sprint Planning meeting when the Product Owner and the developer team work together to determine the number of stories which can be completed in the next Sprint. If you know the Team Velocity then you can avoid committing to do more work than the team has been able to accomplish in the past, and your team is much more likely to complete all of the work required for the next Sprint. Scrum Master There are three roles in Scrum: the Product Owner, the developer team, and the Scrum Master. I’v e already discussed the Product Owner. The Product Owner is the one and only person who maintains the Product Backlog and prioritizes the stories. I’ve also described the role of the developer team. The members of the developer team do the work of implementing the stories by breaking the stories into tasks. The final role, which I have not discussed, is the role of the Scrum Master. The Scrum Master is responsible for ensuring that the team is following the Scrum process. For example, the Scrum Master is responsible for making sure that there is a Daily Scrum meeting and that everyone answers the standard three questions. The Scrum Master is also responsible for removing (non-technical) impediments which the team might encounter. For example, if the team cannot start work until everyone installs the latest version of Microsoft Visual Studio then the Scrum Master has the responsibility of working with management to get the latest version of Visual Studio as quickly as possible. The Scrum Master can be a member of the developer team. Furthermore, different people can take on the role of the Scrum Master over time. The Scrum Master, however, cannot be the same person as the Product Owner. Using SonicAgile SonicAgile (SonicAgile.com) is an online tool which you can use to manage your projects using Scrum. You can use the SonicAgile Product Backlog to create a prioritized list of stories. You can estimate the size of the Stories using different Story Point units such as Shirt Sizes and Coffee Cup sizes. You can use SonicAgile during the Sprint Planning meeting to select the Stories that you want to complete during a particular Sprint. You can configure Sprints to be any length of time. SonicAgile calculates Team Velocity automatically and displays a warning when you add too many stories to a Sprint. In other words, it warns you when it thinks you are overcommitting in a Sprint. SonicAgile also includes a Scrumboard which displays the list of Stories selected for a Sprint and the tasks associated with each story. You can drag tasks from one task state to another. Finally, SonicAgile enables you to generate Release Burndown and Sprint Burndown charts. You can use these charts to view the progress of your team. To learn more about SonicAgile, visit SonicAgile.com. Summary In this post, I described many of the basic concepts of Scrum. You learned how a Product Owner uses a Product Backlog to create a prioritized list of tasks. I explained why work is completed in Sprints so the developer team can be more productive. I also explained how a developer team uses the daily scrum to coordinate their work. You learned how the developer team uses a Scrumboard to see, at a glance, who is working on what and the state of each task. I also discussed Burndown charts. You learned how you can use both Release and Sprint Burndown charts to track team progress in completing a project. 
Finally, I described the crucial role of the Scrum Master – the person who is responsible for ensuring that the rules of Scrum are being followed. My goal was not to describe all of the concepts of Scrum. This post was intended to be an introductory overview. For a comprehensive explanation of Scrum, I recommend reading Ken Schwaber’s book Agile Project Management with Scrum: http://www.amazon.com/Agile-Project-Management-Microsoft-Professional/dp/073561993X/ref=la_B001H6ODMC_1_1?ie=UTF8&qid=1345224000&sr=1-1

    Read the article

  • The curious case of SOA Human tasks' automatic completion

    - by Kavitha Srinivasan
    A large south-Asian insurance industry customer using Oracle BPM and SOA ran into this. I have survived this ordeal previously myself but didn't think to blog it then. However, it seems like a good idea to share this knowledge with this reader community, and so here goes... Symptom: A human task (in a SOA/BPEL/BPM process) completes automatically while it should have been assigned to a proper user. There are no stack traces and no related exceptions in the logs. Why: The product is designed to treat human tasks that don't have assignees as eligible for completion, and hence no warning/error messages are recorded in the logs. Usecase variant: A variant of this usecase, where an assignee doesn't exist in the repository, is treated as a recoverable error. One can find this in the 'pending recovery' instances in EM and reactivate the task by changing the assignees in the BPM workspace as a process owner/administrator. But back to the usecase when tasks get completed automatically... When: This happens when the users/groups assigned to a task are 'empty' or null. This has been seen only on tasks whose assignees are derived from an assignment expression - i.e. at runtime an XPath is used to determine who to assign the task to. (This should not happen if task assignees are populated via swim-lane roles.) How to detect this in EM: For instances that are auto-completed thus, one will notice in the Audit Trail of such instances that the 'outcome' of the task is empty. The 'acquired by' element will also show as empty/null. Enabling the oracle.soa.services.workflow.* logger in EM should print more verbose messages about this. How to fix this: The application code needs two fixes. Input to HT: the XSLT/XPath used to set the task 'assignee' and the process itself should be enhanced to handle nulls better. For example: if no data is found, set assignees to an alternate value, force default assignees, etc. Output from HT: additionally, in the application code, check that the 'outcome' of the HT is not null. If null, route the task to be performed again after setting the assignee correctly. Beginning with PS4FP, one should be able to use 'grab' to route back to the task to fire again. Hope this helps.

    Read the article

  • MSSQL: Copying data from one database to another

    - by DigiMortal
    I have a database that has data imported from another server using the import and export wizard of SQL Server Management Studio. There is also an empty database with the same tables, but it also has primary keys, foreign keys and indexes. How to get data from the first database to the other? Here is the description of my crusade. And believe me – it is not a nice one.
Bugs in import and export wizard
There are some awful bugs in the import and export wizard that make data imports and exports possible only in a very limited manner: the wizard is not able to analyze foreign keys, and the wizard always wants to create tables, whatever you say in the settings. The result is a faulty and useless package. Now let's go step by step and make things work in our scenario.
Database
There are two databases. Let's name them like this:
PLAIN – contains data imported from the remote server (no indexes, no keys, no nothing, just plain dumb data)
CORRECT – empty database with the same structure as the remote database (indexes, keys and everything else but no data)
Our goal is to get data from PLAIN to CORRECT.
1. Create import and export package
At this point we will create the faulty SSIS package using SQL Server Management Studio. Run the import and export wizard and let it create an SSIS package that reads data from CORRECT and writes it to, let's say, CORRECT-2. Make sure you enable identity insert. Make sure there are no views selected. Make sure you don't let the package create tables (you can miss this step because it wants to create tables anyway). Save the package to SSIS.
2. Modify import and export package
Now let's clean up the package and remove all the faulty crap. Connect SQL Server Management Studio to the SSIS instance. Select the package you just saved and export it to your hard disc. Run Business Intelligence Studio. Create a new SSIS project (DON'T MISS THIS STEP). Add the package from disc as an existing item to the project and open it. Move to the Control Flow page and do one of the following:
- Remove all preparation SQL-tasks and connect the Data Flow tasks.
- Modify all preparation SQL-tasks so the existence of tables is checked before a table is created (yes, you have to do it manually).
Add a new Execute-SQL task as the first task in the control flow: open the task properties, assign the destination connection as the connection to use, and insert the following SQL as the command:
EXEC sp_MSForEachTable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'
GO
EXEC sp_MSForEachTable 'DELETE FROM ?'
GO
Save the task. Add a new Execute-SQL task as the last task in the control flow: open the task properties, assign the destination connection as the connection to use, and insert the following SQL as the command:
EXEC sp_MSForEachTable 'ALTER TABLE ? CHECK CONSTRAINT ALL'
GO
Save the task. Now connect the first Execute-SQL task with the first Data Flow task, and the last Data Flow task with the second Execute-SQL task. Now move to the Package Explorer tab and change the connections under the Connection Managers folder. Make the source connection use database PLAIN. Make the destination connection use database CORRECT. Save the package and rebuild the project. Update the package using SQL Server Management Studio. Some hints: make sure you take the package from the solution folder because it is saved there now. Don't overwrite the existing package. Use a numeric suffix and let Management Studio create a new version of the package. Now you are done with your package. Run it to test it and clean out all the errors you find.
TRUNCATE vs DELETE
You can see that I used DELETE FROM instead of TRUNCATE. Why?
Because TRUNCATE has some nasty limits (taken from MSDN): “You cannot use TRUNCATE TABLE on a table referenced by a FOREIGN KEY constraint; instead, use DELETE statement without a WHERE clause. Because TRUNCATE TABLE is not logged, it cannot activate a trigger. TRUNCATE TABLE may not be used on tables participating in an indexed view.” As I am not sure what tables you have and how they are used, I have provided here a solution that should work for all scenarios. If you need better performance, then in some cases you can use TRUNCATE TABLE instead of DELETE.
Conclusion
My conclusion is bitter this time, although I am a very positive guy. It is A.D. 2010 and still we have to write stupid hacks for simple things. Simple tools that existed before are long gone and we have to live with mysterious bloatware that is our only choice when using the default tools. If you take a look at the length of this posting and the count of steps I had to do for one easy thing, you should treat it as a signal that something has gone wrong in recent years. Although I got my job done, I would still be happier if the out-of-the-box tools were more intelligent one day.
References
T-SQL Trick for Deleting All Data in Your Database (Mauro Cardarelli)
TRUNCATE TABLE (MSDN Library)
Error Handling in SQL 2000 – a Background (Erland Sommarskog)
Disable/Enable Foreign Key and Check constraints in SQL Server (Decipher)

    Read the article

  • Running Teamsite User Admin tool IWUSERADM.exe from ASP.NET

    - by Narendra Tiwari
    It has really been a head-scratching task for me. I've tried many options but nothing worked. Finally I found a workaround on Google to achieve this with the TaskScheduler library.
PROBLEM: When we run the Teamsite user administration command line tool IWUSERADM.exe through ASP.NET, it gives the following error: Application popup: cmd.exe - Application Error : The application failed to initialize properly (0xc0000142). Click on OK to terminate the application.
CAUSE: No specific cause; it seems to be a bug, supposed to be resolved with this Microsoft patch http://support.microsoft.com/kb/960266, and there is nothing related to a permission issue; my web application is impersonated with an administrator account. Of course, running a bat file from an admin account is a potential security threat, but for this scenario let's confine our discussion to running the command line tool.
RESOLUTION: I have not tried this patch as I am not permitted to run this patch on the server. Below are the steps to achieve the requirement.
1/ Create a batch file which runs IWUSERADM.exe:
echo - Add Teamsite User
CD E:\Appli\GN00\iw-home\bin
iwuseradm add-user %1
2/ Temporarily create a scheduled task and run the .bat file as a scheduled task from ASP.NET code using TaskScheduler http://www.codeproject.com/KB/cs/tsnewlib.aspx.
3/ Here is the function:
private int AddTeamsiteUser(string strBatchFilePath, string strUser)
{
    // Get a ScheduledTasks object for the local computer.
    ScheduledTasks st = new ScheduledTasks();
    // Create a task.
    Task t;
    try
    {
        t = st.CreateTask("~AddTeamsiteUser");
    }
    catch
    {
        throw new Exception("Schedule Task ~AddTeamsiteUser already exists.");
    }
    // Tell the task what to run and how to identify it.
    t.ApplicationName = strBatchFilePath;
    t.Parameters = strUser;
    t.Comment = "Adding user to Teamsite Application";
    // Set the account under which the task should run.
    t.SetAccountInformation(yourLogin, yourPassword);
    t.Save();
    t.Run();
    Thread.Sleep(2000); // for sync issue
    // Remove the scheduled task.
    st.DeleteTask("~AddTeamsiteUser");
    return t.ExitCode;
}
Below are a few resources related to the above scenario:
- Task Scheduler Class Library for .NET: http://www.codeproject.com/KB/cs/tsnewlib.aspx
- Run a .BAT file from ASP.NET: http://codebetter.com/blogs/brendan.tompkins/archive/2004/05/13/13484.aspx
- TaskScheduler Class: http://msdn.microsoft.com/en-us/library/system.threading.tasks.taskscheduler.aspx
- Application Hangs while running iwuseradm.exe through ASP.NET: http://bytes.com/topic/asp-net/answers/733098-system-diagnostics-process-hangs

    Read the article

  • How to do end task similar to that in Windows?

    - by Rohit Bansal
    Sometimes Ubuntu freezes and I have no other option than to power off the system directly. Is there some remedy or a way like 'Ctrl + Alt + Del' in Windows? That would be very helpful. I need a way out, other than directly powering off my system, in times of gross failure. Shutting down this way always creates the fear of crashing my system or losing some data, which is unacceptable.

    Read the article

  • In this context with views in a tree, which class should perform the task?

    - by Jhonny 8
    Imagine that I have this context: a main view containing a table containing some cells, each of them with its own controller and view files. In the main view, I have an object "Person" with 3 different IDs. Depending on certain conditions (let's say, the time of day), I have to choose one of them and display it in the cell. My question is: should the main view pass the whole object to the table, and the table to the cell, so that the cell calculates the ID to be shown? Or should the main view calculate this parameter and send the result to the table, which passes it on to the cell? This is a question focused on OO design: which of these approaches is more suitable in an OO design, and why?
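    For illustration only (this sketch is not from the original question, and all class and method names here are hypothetical), a minimal Java sketch of the second approach, where the owner of the "which ID applies now" rule computes the value and the cell only renders what it is given:

    import java.time.LocalTime;

    // Illustrative types; the original question does not name these classes.
    class Person {
        private final String morningId, afternoonId, eveningId;
        Person(String m, String a, String e) { morningId = m; afternoonId = a; eveningId = e; }

        // The selection rule lives in one place, next to the data it needs.
        String displayId(LocalTime now) {
            if (now.isBefore(LocalTime.NOON)) return morningId;
            if (now.isBefore(LocalTime.of(18, 0))) return afternoonId;
            return eveningId;
        }
    }

    class Cell {
        // The cell knows nothing about the rule; it just renders the value it receives.
        void render(String id) { System.out.println("Cell shows: " + id); }
    }

    public class TreeViewSketch {
        public static void main(String[] args) {
            Person person = new Person("ID-1", "ID-2", "ID-3");
            new Cell().render(person.displayId(LocalTime.now()));
        }
    }

    One common argument for this direction is that the views stay dumb and the selection rule is testable in a single place; the first approach can be equally valid when the cell is the natural owner of the display logic.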

    Read the article

  • Why is purchasing Microsoft licences such a daunting task? [closed]

    - by John Nevermore
    I've spent 2 frustrating days jumping through hoops and browsing through different local e-shops for VS (Visual Studio) 2010 Pro and WHS (Windows Home Server) FPP 2011 licenses. I found jack - or to be more precise, the closest I found in my country was WHS OEM 2011 licenses, after multiple emails sent to individuals found on the Microsoft partners page. Question being: why is it so difficult to get your hands on Microsoft licenses as an individual? Sure, you can get the latest end-user operating systems from most shops, but when it comes to development tools or server software you are left dry. And companies that do sell licenses most of the time don't even put up pricing or a self-service environment for buying the licenses; you need to have a hawk's eye for that shiny little Microsoft partner logo and spam through a bunch of emails, not knowing if you can count on them to get the license or not. Sure, I could whip out my credit card and buy the VS 2010 license on the online Microsoft Shop. Well whippideegoddamndoo, they sell that, but they don't sell WHS 11 licenses. Why does a company make it so hard to buy their products? Let's not even talk about the licensing itself being a pain.

    Read the article

  • The answer to the unfathomable question: what is the meaning of error value 2147943645?

    - by Jim Lahman
    I scheduled a task to perform a Windows backup of a single disk on my server. When I tested it, the task ran successfully – no problems, no errors; just as I expected. However, when the task ran as scheduled, it failed with error value 2147943645. I wondered: was this the answer to life, the universe and everything in it? No. That is 42. After doing some research and reviewing the task configuration, I realized that the task will only run if the user is logged on. So, this was the answer! I have to configure the task to run whether the user is logged on or not. Otherwise, I'll get that nasty error value.
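    If you are curious where that opaque number comes from, here is a small decoding sketch (Java, purely illustrative and not part of the original post): the value is an HRESULT whose low 16 bits are a Win32 error code, and 2147943645 decodes to 0x800704DD, i.e. Win32 error 1245 (ERROR_NOT_LOGGED_ON), which matches the "run only when user is logged on" setting described above.

    public class HResultDecode {
        public static void main(String[] args) {
            long hresult = 2147943645L;               // value reported by Task Scheduler
            long win32Code = hresult & 0xFFFF;        // low 16 bits: the Win32 error code
            long facility = (hresult >> 16) & 0x1FFF; // facility 7 means FACILITY_WIN32
            System.out.printf("HRESULT 0x%08X, facility %d, Win32 error %d%n",
                    hresult, facility, win32Code);
            // Prints: HRESULT 0x800704DD, facility 7, Win32 error 1245
            // Win32 error 1245 is ERROR_NOT_LOGGED_ON.
        }
    }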

    Read the article

  • How to deal with a fellow programmer who likes to delegate tasks, with a lack of any support from the boss [closed]

    - by Rudy
    I have a problem with my fellow programmer. We are currently working together on a small project that needs to be shipped every 2 weeks. She has a tendency to ask for help with every issue that she is facing, whether it's a compile error, an algorithm problem or even a sync/merge issue that she caused herself. She does not even bother to check Google or try to find out by herself. I can be asked to help her 5-10 times a day. Every day her husband keeps calling (4-6 times a day), and most of the code that has been delivered by her is actually incorrect. Today she framed me for sending the wrong delivery product. She went home after lunch on the delivery day without telling the PM and the other team members, and the code she committed does not work at all. It's not even tested. I had no choice but to roll back her code and clean it up just to be able to run the product. I have warned her about her defective code for almost 3 iterations. She said that when she is not around I should be able to test her module for her. I snapped and yelled that I am not her slave and reported it directly to my boss. However, my boss is not a person who manages or cares about software quality. The most important thing to my boss is delivery of the product, whether it is tested or not. He has even asked us to deliver something not even tested by QA to the client, on the next day. Most of our suggestions are not followed by him. He even asked me to apologize to her because I snapped. I am tired of the whole situation. This kind of thing keeps repeating. I do have savings to be able to survive for 6 months and the idea of resigning keeps haunting me. There is nothing else that can be learned in my current job and I have been in better environments than this. What should I do about the situation?

    Read the article

  • What is the task of a coach in ACM programming contests?

    - by Layla
    At the university where I am working, they have decided to participate in the ACM regionals for the first time, and they would like to appoint me as a coach. I have never been in that situation before and have not found much information about it, so what is the real work of a coach in those contests? Sometimes I have found experienced programmers acting as coaches, but others are just people without such good programming skills; so what is it all about?

    Read the article

  • Is this a violation of the Liskov Substitution Principle?

    - by Paul T Davies
    Say we have a list of Task entities, and a ProjectTask subtype. Tasks can be closed at any time, except ProjectTasks, which cannot be closed once they have a status of Started. The UI should ensure the option to close a started ProjectTask is never available, but some safeguards are present in the domain:
public class Task
{
    public Status Status { get; set; }
    public virtual void Close()
    {
        Status = Status.Closed;
    }
}

public class ProjectTask : Task
{
    public override void Close()
    {
        if (Status == Status.Started)
            throw new Exception("Cannot close a started Project Task");
        base.Close();
    }
}
Now when calling Close() on a Task, there is a chance the call will fail if it is a ProjectTask with the Started status, when it wouldn't if it was a base Task. But these are the business requirements. It should fail. Can this be regarded as a violation?

    Read the article

  • More than one unique key for HashMap problem (Java)

    - by Alex Cheng
    This question is a continuation of this thread: In short, to solve my problem I want to use Map<Set<String>, String>. However, after I sorted my data entries in Excel and removed the unnecessary parameters, the following came out:
flow content ==> content content
flow content ==> content depth distance
flow content ==> content depth within
flow content ==> content depth within distance
flow content ==> content within
flow content ==> content within distance
If that is the case, the same key appears more than once, so it is no longer unique for the HashMap. How do I get around this... anyone have any idea? I was thinking of maybe Map<Set<String>, List<String>> so that I can do something like: Set<flow content>, List<'content content', 'content depth distance', 'content depth within', ..., 'content within distance'>. But because I am parsing the entries line by line, I can't figure out how to store the values of the same repeated key (flow content) into the same list and add it to the map. Anyone have a rough idea of how this can be done in Java? Thanks in advance.
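    A minimal sketch of the rough logic described above (illustrative only; the entry data is made up and the key tokens are assumed to be split on whitespace): look up the list for the key, create it the first time the key is seen, and append every value that shares that key.

    import java.util.*;

    public class GroupValuesByKey {
        public static void main(String[] args) {
            // Hypothetical parsed lines: key tokens on the left, value on the right.
            String[][] lines = {
                {"flow content", "content content"},
                {"flow content", "content depth distance"},
                {"flow content", "content within distance"},
            };

            Map<Set<String>, List<String>> map = new HashMap<Set<String>, List<String>>();
            for (String[] line : lines) {
                Set<String> key = new HashSet<String>(Arrays.asList(line[0].split("\\s+")));
                List<String> values = map.get(key);
                if (values == null) {               // first time this key is seen
                    values = new ArrayList<String>();
                    map.put(key, values);
                }
                values.add(line[1]);                // repeated keys append to the same list
            }
            System.out.println(map);
            // Prints something like:
            // {[flow, content]=[content content, content depth distance, content within distance]}
        }
    }

    On Java 8 or later, the null check can be replaced with map.computeIfAbsent(key, k -> new ArrayList<>()).add(line[1]).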

    Read the article

< Previous Page | 45 46 47 48 49 50 51 52 53 54 55 56  | Next Page >