Search Results

Search found 8816 results on 353 pages for 'upcoming events'.


  • Could I be going crazy with Event Handlers? Am I going the "wrong way" with my design?

    - by sensae
    I guess I've decided that I really like event handlers. I may be suffering a bit from analysis paralysis, but I'm concerned about making my design unwieldy or running into some other unforeseen consequence of my design decisions. My game engine currently does basic sprite-based rendering with a panning overhead camera. My design looks a bit like this:

    SceneHandler: Contains a list of classes that implement the SceneListener interface (currently only Sprites). Calls render() once per tick, and sends onCameraUpdate() messages to SceneListeners.

    InputHandler: Polls the input once per tick, and sends a simple onKeyPressed message to InputListeners. I have a Camera InputListener which holds a SceneHandler instance and triggers updateCamera() events based on what the input is.

    AgentHandler: Calls default actions on any Agents (AI) once per tick, and will check a stack for any new events that are registered, dispatching them to specific Agents as needed.

    So I have basic sprite objects that can move around a scene and use rudimentary steering behaviors to travel. I've gotten onto collision detection, and this is where I'm not sure the direction my design is going is good. Is it a good practice to have many small event handlers? I imagine going the way I am that I'd have to implement some kind of CollisionHandler. Would I be better off with a more consolidated EntityHandler which handles AI, collision updates, and other entity interactions in one class? Or will I be fine just implementing many different event-handling subsystems which pass messages to each other based on what kind of event it is? Should I write an EntityHandler which is simply responsible for coordinating all these sub event handlers?

    I realize that in some cases, such as my InputHandler and SceneHandler, those are very specific types of events. A large portion of my game code won't care about input, and a large portion won't care about updates that happen purely in the rendering of the scene. Thus I feel my isolation of those systems is justified. However, I'm asking this question specifically about game-logic-type events.
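    A bare-bones sketch of the consolidated option the question weighs (one EntityHandler coordinating small sub-handlers); aside from the handler names taken from the question, every type and body here is invented for illustration:

    using System;
    using System.Collections.Generic;

    // Sub-handlers stay small and focused, but a single EntityHandler
    // owns the tick order and routes game-logic events between them.
    record CollisionEvent(int EntityA, int EntityB);

    class CollisionHandler
    {
        // Stand-in for real detection against scene geometry.
        public IEnumerable<CollisionEvent> Detect() =>
            new[] { new CollisionEvent(1, 2) };
    }

    class AgentHandler
    {
        public void Tick() { /* default AI actions */ }

        public void Dispatch(CollisionEvent e) =>
            Console.WriteLine($"Agents react to collision {e.EntityA}/{e.EntityB}");
    }

    class EntityHandler
    {
        private readonly CollisionHandler _collisions = new CollisionHandler();
        private readonly AgentHandler _agents = new AgentHandler();

        public void Tick()
        {
            _agents.Tick();
            foreach (var hit in _collisions.Detect())
                _agents.Dispatch(hit); // game-logic events never leave this subsystem
        }
    }

    class Program
    {
        static void Main() => new EntityHandler().Tick();
    }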


  • Sampling SQL server batch activity

    - by extended_events
    Recently I was troubleshooting a performance issue on an internal tracking workload and needed to collect some very low-level events over a period of 3-4 hours. During analysis of the data I found that a common pattern I was using was to find a batch with a duration that was longer than average and follow all the events it produced. This pattern got me thinking that I was discarding a substantial amount of event data that had been collected, and that it would be great to be able to reduce the collection overhead on the server if I could still get all activity from some batches.

    In the past I've used a sampling technique based on the counter predicate to build a baseline of overall activity (see Mike's post here). This isn't exactly what I want, though, as there would certainly be events from a particular batch that wouldn't pass the predicate. What I need is a way to identify streams of work and select, say, one in ten of them to watch, and SQL Server provides just such a mechanism: session_id. Session_id is a server-assigned integer that is bound to a connection at login and lasts until logout. So by combining the session_id predicate source and the divides_by_uint64 predicate comparator we can limit collection, and still get all the events in batches for investigation.

    CREATE EVENT SESSION session_10_percent ON SERVER
    ADD EVENT sqlserver.sql_statement_starting
        (WHERE (package0.divides_by_uint64(sqlserver.session_id,10))),
    ADD EVENT sqlos.wait_info
        (WHERE (package0.divides_by_uint64(sqlserver.session_id,10))),
    ADD EVENT sqlos.wait_info_external
        (WHERE (package0.divides_by_uint64(sqlserver.session_id,10))),
    ADD EVENT sqlserver.sql_statement_completed
        (WHERE (package0.divides_by_uint64(sqlserver.session_id,10)))
    ADD TARGET package0.ring_buffer
    WITH (MAX_DISPATCH_LATENCY=30 SECONDS, TRACK_CAUSALITY=ON)
    GO

    There we go; event collection is reduced while still providing enough information to find the root of the problem. By the way, the performance issue turned out to be an IO issue, and the session definition above was more than enough to show long waits on PAGEIOLATCH*.
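    Assuming nothing beyond the session definition above, starting the session and pulling the sampled events back out of the ring buffer for analysis looks roughly like this (the DMVs are standard in SQL Server 2008 and later):

    -- Start the sampled session
    ALTER EVENT SESSION session_10_percent ON SERVER STATE = START;
    GO

    -- Read the captured events back out of the ring buffer target as XML
    SELECT CAST(t.target_data AS xml) AS captured_events
    FROM sys.dm_xe_sessions AS s
    JOIN sys.dm_xe_session_targets AS t
        ON s.address = t.event_session_address
    WHERE s.name = 'session_10_percent'
      AND t.target_name = 'ring_buffer';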


  • Create a Monitoring Server for SQL Server with PowerShell

    At some point you are going to need a notification system for a range of events that occur on your servers. Laerte Junior shows how you can set up temporary or permanent alerts for any WMI events, giving you a system that fits your server environment perfectly.
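    The article builds a reusable framework around this idea, so the following is not Laerte's code; but the core of a temporary WMI alert can be sketched with the stock Register-WmiEvent cmdlet (the query and source identifier here are illustrative):

    # Watch for any Windows service entering the 'Stopped' state, polling every
    # 5 seconds, and react in the -Action block; the subscription lives until
    # it is unregistered.
    Register-WmiEvent -SourceIdentifier ServiceStopped `
        -Query "SELECT * FROM __InstanceModificationEvent WITHIN 5 WHERE TargetInstance ISA 'Win32_Service' AND TargetInstance.State = 'Stopped'" `
        -Action { Write-Host ("Service stopped: " + $Event.SourceEventArgs.NewEvent.TargetInstance.Name) }

    # Tear the temporary alert down when done:
    Unregister-Event -SourceIdentifier ServiceStopped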


  • Do you know a good web CMS to manage a sports team?

    - by benjamin42
    I'm looking for a web-based CMS that enables me to manage a sports team. I need the following features (** means the feature is required; otherwise it's optional):
    - Calendar **
    - Scheduled events (synced with the calendar, RSS feed); it would be great if I could schedule a weekly recurring event too, so that I don't have to schedule it by hand each week **
    - Announcements (same RSS feed as events) **
    - A place where I can put some documentation and rules **
    - Keeping track of matches and scores
    - Photo and video gallery
    Any technology for the CMS is probably fine, though I would prefer an SQLite-based CMS.


  • New SQL Monitor Custom Metric: Database Autogrowth

    This metric for Red Gate SQL Monitor measures the number of database autogrowth events (data file or log file) in the last hour. Frequent autogrowth events cause disk fragmentation and usually indicate that a database's autogrowth settings need to be changed.
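    SQL Monitor custom metrics are just T-SQL queries returning a single value; one plausible way to implement this one (not necessarily Red Gate's exact query) is to read autogrow events from the default trace:

    -- Autogrowth events in the last hour, read from the default trace.
    -- EventClass 92 = Data File Auto Grow, 93 = Log File Auto Grow.
    DECLARE @trace_path nvarchar(260);
    SELECT @trace_path = [path] FROM sys.traces WHERE is_default = 1;

    SELECT COUNT(*) AS autogrow_events_last_hour
    FROM sys.fn_trace_gettable(@trace_path, DEFAULT)
    WHERE EventClass IN (92, 93)
      AND StartTime > DATEADD(HOUR, -1, GETDATE());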


  • Silverlight Recruiting Application Part 5 - Jobs Module / View

    Now we start getting into a more code-heavy portion of this series. Thankfully, this means the groundwork is mostly set, and after adding the modules we will have a complete application that can be provided with full source. The Jobs module has two concerns: adding and maintaining jobs that can then be broadcast out to the website. How jobs are displayed on the site will be handled by our admin system (which will just poll from this common database), so we aren't too concerned with that, but rather with getting the information into the system and allowing the back-end administration/HR users to keep things up to date. Since there is a fair bit of information that we want to display, we're going to move editing to a separate view so we can get all that information in an easy-to-use spot. With all the files created for this module, the project looks something like this:

    And now... on to the code.

    XAML for the Job Posting View

    All we really need for the Job Posting View is a RadGridView and a few buttons. This will let us both show records and perform operations on them without much hassle. That XAML is going to look something like this:

    01.<Grid x:Name="LayoutRoot"
    02.Background="White">
    03.<Grid.RowDefinitions>
    04.<RowDefinition Height="30" />
    05.<RowDefinition />
    06.</Grid.RowDefinitions>
    07.<StackPanel Orientation="Horizontal">
    08.<Button x:Name="xAddRecordButton"
    09.Content="Add Job"
    10.Width="120"
    11.cal:Click.Command="{Binding AddRecord}"
    12.telerik:StyleManager.Theme="Windows7" />
    13.<Button x:Name="xEditRecordButton"
    14.Content="Edit Job"
    15.Width="120"
    16.cal:Click.Command="{Binding EditRecord}"
    17.telerik:StyleManager.Theme="Windows7" />
    18.</StackPanel>
    19.<telerikGrid:RadGridView x:Name="xJobsGrid"
    20.Grid.Row="1"
    21.IsReadOnly="True"
    22.AutoGenerateColumns="False"
    23.ColumnWidth="*"
    24.RowDetailsVisibilityMode="VisibleWhenSelected"
    25.ItemsSource="{Binding MyJobs}"
    26.SelectedItem="{Binding SelectedJob, Mode=TwoWay}"
    27.command:SelectedItemChangedEventClass.Command="{Binding SelectedItemChanged}">
    28.<telerikGrid:RadGridView.Columns>
    29.<telerikGrid:GridViewDataColumn Header="Job Title"
    30.DataMemberBinding="{Binding JobTitle}"
    31.UniqueName="JobTitle" />
    32.<telerikGrid:GridViewDataColumn Header="Location"
    33.DataMemberBinding="{Binding Location}"
    34.UniqueName="Location" />
    35.<telerikGrid:GridViewDataColumn Header="Resume Required"
    36.DataMemberBinding="{Binding NeedsResume}"
    37.UniqueName="NeedsResume" />
    38.<telerikGrid:GridViewDataColumn Header="CV Required"
    39.DataMemberBinding="{Binding NeedsCV}"
    40.UniqueName="NeedsCV" />
    41.<telerikGrid:GridViewDataColumn Header="Overview Required"
    42.DataMemberBinding="{Binding NeedsOverview}"
    43.UniqueName="NeedsOverview" />
    44.<telerikGrid:GridViewDataColumn Header="Active"
    45.DataMemberBinding="{Binding IsActive}"
    46.UniqueName="IsActive" />
    47.</telerikGrid:RadGridView.Columns>
    48.</telerikGrid:RadGridView>
    49.</Grid>

    I'll explain what's happening here by line numbers:
    Lines 11 and 16: Using the same type of click commands as we saw in the Menu module, we tie the button clicks to delegate commands in the viewmodel.
    Line 25: The source for the jobs will be a collection in the viewmodel.
    Line 26: We also bind the selected item to a public property from the viewmodel for use in code.
    Line 27: We've turned the event into a command so we can handle it via code in the viewmodel.
    So those first three probably make sense to you as far as Silverlight/WPF binding magic is concerned, but for line 27... This actually comes from something I read on Damien Schenkelman's blog back in the day for creating an attached behavior from any event. So, any time you see me using command:Whatever.Command, the backing for it is actually something like this:

    SelectedItemChangedEventBehavior.cs:

    01.public class SelectedItemChangedEventBehavior : CommandBehaviorBase<Telerik.Windows.Controls.DataControl>
    02.{
    03.public SelectedItemChangedEventBehavior(DataControl element)
    04.: base(element)
    05.{
    06.element.SelectionChanged += new EventHandler<SelectionChangeEventArgs>(element_SelectionChanged);
    07.}
    08.void element_SelectionChanged(object sender, SelectionChangeEventArgs e)
    09.{
    10.// We'll only ever allow single selection, so will only need item index 0
    11.base.CommandParameter = e.AddedItems[0];
    12.base.ExecuteCommand();
    13.}
    14.}

    SelectedItemChangedEventClass.cs:

    01.public class SelectedItemChangedEventClass
    02.{
    03.#region The Command Stuff
    04.public static ICommand GetCommand(DependencyObject obj)
    05.{
    06.return (ICommand)obj.GetValue(CommandProperty);
    07.}
    08.public static void SetCommand(DependencyObject obj, ICommand value)
    09.{
    10.obj.SetValue(CommandProperty, value);
    11.}
    12.public static readonly DependencyProperty CommandProperty =
    13.DependencyProperty.RegisterAttached("Command", typeof(ICommand),
    14.typeof(SelectedItemChangedEventClass), new PropertyMetadata(OnSetCommandCallback));
    15.public static void OnSetCommandCallback(DependencyObject dependencyObject, DependencyPropertyChangedEventArgs e)
    16.{
    17.DataControl element = dependencyObject as DataControl;
    18.if (element != null)
    19.{
    20.SelectedItemChangedEventBehavior behavior = GetOrCreateBehavior(element);
    21.behavior.Command = e.NewValue as ICommand;
    22.}
    23.}
    24.#endregion
    25.public static SelectedItemChangedEventBehavior GetOrCreateBehavior(DataControl element)
    26.{
    27.SelectedItemChangedEventBehavior behavior = element.GetValue(SelectedItemChangedEventBehaviorProperty) as SelectedItemChangedEventBehavior;
    28.if (behavior == null)
    29.{
    30.behavior = new SelectedItemChangedEventBehavior(element);
    31.element.SetValue(SelectedItemChangedEventBehaviorProperty, behavior);
    32.}
    33.return behavior;
    34.}
    35.public static SelectedItemChangedEventBehavior GetSelectedItemChangedEventBehavior(DependencyObject obj)
    36.{
    37.return (SelectedItemChangedEventBehavior)obj.GetValue(SelectedItemChangedEventBehaviorProperty);
    38.}
    39.public static void SetSelectedItemChangedEventBehavior(DependencyObject obj, SelectedItemChangedEventBehavior value)
    40.{
    41.obj.SetValue(SelectedItemChangedEventBehaviorProperty, value);
    42.}
    43.public static readonly DependencyProperty SelectedItemChangedEventBehaviorProperty =
    44.DependencyProperty.RegisterAttached("SelectedItemChangedEventBehavior",
    45.typeof(SelectedItemChangedEventBehavior), typeof(SelectedItemChangedEventClass), null);
    46.}

    These end up looking very similar from command to command, but in a nutshell you create a command based on any event, determine what its parameter will be, then execute. It attaches via XAML and ties to a DelegateCommand in the viewmodel, so you get the full event experience (since some controls get a bit event-rich for added functionality). Simple enough, right?
    Viewmodel for the Job Posting View

    The viewmodel needs to handle all events going back and forth, maintain interactions with the data we are using, and both publish and subscribe to events. Rather than breaking this into tons of little pieces, I'll give you a view of the entire viewmodel and then hit the important points line by line:

    001.public class JobPostingViewModel : ViewModelBase
    002.{
    003.private readonly IEventAggregator eventAggregator;
    004.private readonly IRegionManager regionManager;
    005.public DelegateCommand<object> AddRecord { get; set; }
    006.public DelegateCommand<object> EditRecord { get; set; }
    007.public DelegateCommand<object> SelectedItemChanged { get; set; }
    008.public RecruitingContext context;
    009.private QueryableCollectionView _myJobs;
    010.public QueryableCollectionView MyJobs
    011.{
    012.get { return _myJobs; }
    013.}
    014.private QueryableCollectionView _selectionJobActionHistory;
    015.public QueryableCollectionView SelectedJobActionHistory
    016.{
    017.get { return _selectionJobActionHistory; }
    018.}
    019.private JobPosting _selectedJob;
    020.public JobPosting SelectedJob
    021.{
    022.get { return _selectedJob; }
    023.set
    024.{
    025.if (value != _selectedJob)
    026.{
    027._selectedJob = value;
    028.NotifyChanged("SelectedJob");
    029.}
    030.}
    031.}
    032.public SubscriptionToken editToken = new SubscriptionToken();
    033.public SubscriptionToken addToken = new SubscriptionToken();
    034.public JobPostingViewModel(IEventAggregator eventAgg, IRegionManager regionmanager)
    035.{
    036.// set Unity items
    037.this.eventAggregator = eventAgg;
    038.this.regionManager = regionmanager;
    039.// load our context
    040.context = new RecruitingContext();
    041.this._myJobs = new QueryableCollectionView(context.JobPostings);
    042.context.Load(context.GetJobPostingsQuery());
    043.// set command events
    044.this.AddRecord = new DelegateCommand<object>(this.AddNewRecord);
    045.this.EditRecord = new DelegateCommand<object>(this.EditExistingRecord);
    046.this.SelectedItemChanged = new DelegateCommand<object>(this.SelectedRecordChanged);
    047.SetSubscriptions();
    048.}
    049.#region DelegateCommands from View
    050.public void AddNewRecord(object obj)
    051.{
    052.this.eventAggregator.GetEvent<AddJobEvent>().Publish(true);
    053.}
    054.public void EditExistingRecord(object obj)
    055.{
    056.if (_selectedJob == null)
    057.{
    058.this.eventAggregator.GetEvent<NotifyUserEvent>().Publish("No job selected.");
    059.}
    060.else
    061.{
    062.this._myJobs.EditItem(this._selectedJob);
    063.this.eventAggregator.GetEvent<EditJobEvent>().Publish(this._selectedJob);
    064.}
    065.}
    066.public void SelectedRecordChanged(object obj)
    067.{
    068.if (obj.GetType() == typeof(ActionHistory))
    069.{
    070.// event bubbles up so we don't catch items from the ActionHistory grid
    071.}
    072.else
    073.{
    074.JobPosting job = obj as JobPosting;
    075.GrabHistory(job.PostingID);
    076.}
    077.}
    078.#endregion
    079.#region Subscription Declaration and Events
    080.public void SetSubscriptions()
    081.{
    082.EditJobCompleteEvent editComplete = eventAggregator.GetEvent<EditJobCompleteEvent>();
    083.if (editToken != null)
    084.editComplete.Unsubscribe(editToken);
    085.editToken = editComplete.Subscribe(this.EditCompleteEventHandler);
    086.AddJobCompleteEvent addComplete = eventAggregator.GetEvent<AddJobCompleteEvent>();
    087.if (addToken != null)
    088.addComplete.Unsubscribe(addToken);
    089.addToken = addComplete.Subscribe(this.AddCompleteEventHandler);
    090.}
    091.public void EditCompleteEventHandler(bool complete)
    092.{
    093.if (complete)
    094.{
    095.JobPosting thisJob = _myJobs.CurrentEditItem as JobPosting;
    096.this._myJobs.CommitEdit();
    097.this.context.SubmitChanges((s) =>
    098.{
    099.ActionHistory myAction = new ActionHistory();
    100.myAction.PostingID = thisJob.PostingID;
    101.myAction.Description = String.Format("Job '{0}' has been edited by {1}", thisJob.JobTitle, "default user");
    102.myAction.TimeStamp = DateTime.Now;
    103.eventAggregator.GetEvent<AddActionEvent>().Publish(myAction);
    104.}
    105., null);
    106.}
    107.else
    108.{
    109.this._myJobs.CancelEdit();
    110.}
    111.this.MakeMeActive(this.regionManager, "MainRegion", "JobPostingsView");
    112.}
    113.public void AddCompleteEventHandler(JobPosting job)
    114.{
    115.if (job == null)
    116.{
    117.// do nothing, new job add cancelled
    118.}
    119.else
    120.{
    121.this.context.JobPostings.Add(job);
    122.this.context.SubmitChanges((s) =>
    123.{
    124.ActionHistory myAction = new ActionHistory();
    125.myAction.PostingID = job.PostingID;
    126.myAction.Description = String.Format("Job '{0}' has been added by {1}", job.JobTitle, "default user");
    127.myAction.TimeStamp = DateTime.Now;
    128.eventAggregator.GetEvent<AddActionEvent>().Publish(myAction);
    129.}
    130., null);
    131.}
    132.this.MakeMeActive(this.regionManager, "MainRegion", "JobPostingsView");
    133.}
    134.#endregion
    135.public void GrabHistory(int postID)
    136.{
    137.context.ActionHistories.Clear();
    138._selectionJobActionHistory = new QueryableCollectionView(context.ActionHistories);
    139.context.Load(context.GetHistoryForJobQuery(postID));
    140.}
    141.}

    Taking it from the top, we're injecting an Event Aggregator and Region Manager for use down the road, and we also have the public DelegateCommands (just like in the Menu module). We also grab a reference to our context, which we'll obviously need for data, then set up a few fields with public properties tied to them. We're also setting subscription tokens, which we have not yet seen; I will get into them below. The AddNewRecord (50) and EditExistingRecord (54) methods should speak for themselves functionality-wise; the one thing of note is that we're sending events off to the Event Aggregator, which some module, somewhere, will take care of. Since these don't entirely rely on one another, the Jobs view doesn't care if anyone is listening, but it will publish AddJobEvent (52), NotifyUserEvent (58) and EditJobEvent (63) regardless. Don't mind the GrabHistory() method so much; it just grabs history items (visibly being created in the SubmitChanges callbacks) and adds them to the database. Every action will trigger a history event, so we'll know who modified what and when, just in case. ;)

    So where are we at? Well, if we click to add a job, we publish an event; if we edit a job, we publish an event with the selected record (attained through the magic of binding). Where is this all going, though? To the viewmodel, of course!
    XAML for the AddEditJobView

    This is pretty straightforward except for one thing, noted below:

    001.<Grid x:Name="LayoutRoot"
    002.Background="White">
    003.<Grid x:Name="xEditGrid"
    004.Margin="10"
    005.validationHelper:ValidationScope.Errors="{Binding Errors}">
    006.<Grid.Background>
    007.<LinearGradientBrush EndPoint="0.5,1"
    008.StartPoint="0.5,0">
    009.<GradientStop Color="#FFC7C7C7"
    010.Offset="0" />
    011.<GradientStop Color="#FFF6F3F3"
    012.Offset="1" />
    013.</LinearGradientBrush>
    014.</Grid.Background>
    015.<Grid.RowDefinitions>
    016.<RowDefinition Height="40" />
    017.<RowDefinition Height="40" />
    018.<RowDefinition Height="40" />
    019.<RowDefinition Height="100" />
    020.<RowDefinition Height="100" />
    021.<RowDefinition Height="100" />
    022.<RowDefinition Height="40" />
    023.<RowDefinition Height="40" />
    024.<RowDefinition Height="40" />
    025.</Grid.RowDefinitions>
    026.<Grid.ColumnDefinitions>
    027.<ColumnDefinition Width="150" />
    028.<ColumnDefinition Width="150" />
    029.<ColumnDefinition Width="300" />
    030.<ColumnDefinition Width="100" />
    031.</Grid.ColumnDefinitions>
    032.<!-- Title -->
    033.<TextBlock Margin="8"
    034.Text="{Binding AddEditString}"
    035.TextWrapping="Wrap"
    036.Grid.Column="1"
    037.Grid.ColumnSpan="2"
    038.FontSize="16" />
    039.<!-- Data entry area-->
    040.
    041.<TextBlock Margin="8,0,0,0"
    042.Style="{StaticResource LabelTxb}"
    043.Grid.Row="1"
    044.Text="Job Title"
    045.VerticalAlignment="Center" />
    046.<TextBox x:Name="xJobTitleTB"
    047.Margin="0,8"
    048.Grid.Column="1"
    049.Grid.Row="1"
    050.Text="{Binding activeJob.JobTitle, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"
    051.Grid.ColumnSpan="2" />
    052.<TextBlock Margin="8,0,0,0"
    053.Grid.Row="2"
    054.Text="Location"
    055.d:LayoutOverrides="Height"
    056.VerticalAlignment="Center" />
    057.<TextBox x:Name="xLocationTB"
    058.Margin="0,8"
    059.Grid.Column="1"
    060.Grid.Row="2"
    061.Text="{Binding activeJob.Location, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"
    062.Grid.ColumnSpan="2" />
    063.
    064.<TextBlock Margin="8,11,8,0"
    065.Grid.Row="3"
    066.Text="Description"
    067.TextWrapping="Wrap"
    068.VerticalAlignment="Top" />
    069.
    070.<TextBox x:Name="xDescriptionTB"
    071.Height="84"
    072.TextWrapping="Wrap"
    073.ScrollViewer.VerticalScrollBarVisibility="Auto"
    074.Grid.Column="1"
    075.Grid.Row="3"
    076.Text="{Binding activeJob.Description, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"
    077.Grid.ColumnSpan="2" />
    078.<TextBlock Margin="8,11,8,0"
    079.Grid.Row="4"
    080.Text="Requirements"
    081.TextWrapping="Wrap"
    082.VerticalAlignment="Top" />
    083.
    084.<TextBox x:Name="xRequirementsTB"
    085.Height="84"
    086.TextWrapping="Wrap"
    087.ScrollViewer.VerticalScrollBarVisibility="Auto"
    088.Grid.Column="1"
    089.Grid.Row="4"
    090.Text="{Binding activeJob.Requirements, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"
    091.Grid.ColumnSpan="2" />
    092.<TextBlock Margin="8,11,8,0"
    093.Grid.Row="5"
    094.Text="Qualifications"
    095.TextWrapping="Wrap"
    096.VerticalAlignment="Top" />
    097.
    098.<TextBox x:Name="xQualificationsTB"
    099.Height="84"
    100.TextWrapping="Wrap"
    101.ScrollViewer.VerticalScrollBarVisibility="Auto"
    102.Grid.Column="1"
    103.Grid.Row="5"
    104.Text="{Binding activeJob.Qualifications, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"
    105.Grid.ColumnSpan="2" />
    106.<!-- Requirements Checkboxes-->
    107.
    108.<CheckBox x:Name="xResumeRequiredCB" Margin="8,8,8,15"
    109.Content="Resume Required"
    110.Grid.Row="6"
    111.Grid.ColumnSpan="2"
    112.IsChecked="{Binding activeJob.NeedsResume, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/>
    113.
    114.<CheckBox x:Name="xCoverletterRequiredCB" Margin="8,8,8,15"
    115.Content="Cover Letter Required"
    116.Grid.Column="2"
    117.Grid.Row="6"
    118.IsChecked="{Binding activeJob.NeedsCV, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/>
    119.
    120.<CheckBox x:Name="xOverviewRequiredCB" Margin="8,8,8,15"
    121.Content="Overview Required"
    122.Grid.Row="7"
    123.Grid.ColumnSpan="2"
    124.IsChecked="{Binding activeJob.NeedsOverview, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/>
    125.
    126.<CheckBox x:Name="xJobActiveCB" Margin="8,8,8,15"
    127.Content="Job is Active"
    128.Grid.Column="2"
    129.Grid.Row="7"
    130.IsChecked="{Binding activeJob.IsActive, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/>
    131.
    132.<!-- Buttons -->
    133.
    134.<Button x:Name="xAddEditButton" Margin="8,8,0,10"
    135.Content="{Binding AddEditButtonString}"
    136.cal:Click.Command="{Binding AddEditCommand}"
    137.Grid.Column="2"
    138.Grid.Row="8"
    139.HorizontalAlignment="Left"
    140.Width="125"
    141.telerik:StyleManager.Theme="Windows7" />
    142.
    143.<Button x:Name="xCancelButton" HorizontalAlignment="Right"
    144.Content="Cancel"
    145.cal:Click.Command="{Binding CancelCommand}"
    146.Margin="0,8,8,10"
    147.Width="125"
    148.Grid.Column="2"
    149.Grid.Row="8"
    150.telerik:StyleManager.Theme="Windows7" />
    151.</Grid>
    152.</Grid>

    The 'validationHelper:ValidationScope' line (005) may seem odd. This is a handy little trick for catching current and would-be validation errors when working in this whole setup. It comes from an approach found on the Joy of Code blog, although it looks like the story for this will change slightly with new advances in SL4/WCF RIA Services, so this section can definitely get an overhaul a little down the road. The code is the fun part of all this, so let's see what's happening under the hood.

    Viewmodel for the AddEditJobView

    We are going to see some of the same things happening here, so I'll skip over the repeat info and get right to the good stuff:

    001.public class AddEditJobViewModel : ViewModelBase
    002.{
    003.private readonly IEventAggregator eventAggregator;
    004.private readonly IRegionManager regionManager;
    005.
    006.public RecruitingContext context;
    007.
    008.private JobPosting _activeJob;
    009.public JobPosting activeJob
    010.{
    011.get { return _activeJob; }
    012.set
    013.{
    014.if (_activeJob != value)
    015.{
    016._activeJob = value;
    017.NotifyChanged("activeJob");
    018.}
    019.}
    020.}
    021.
    022.public bool isNewJob;
    023.
    024.private string _addEditString;
    025.public string AddEditString
    026.{
    027.get { return _addEditString; }
    028.set
    029.{
    030.if (_addEditString != value)
    031.{
    032._addEditString = value;
    033.NotifyChanged("AddEditString");
    034.}
    035.}
    036.}
    037.
    038.private string _addEditButtonString;
    039.public string AddEditButtonString
    040.{
    041.get { return _addEditButtonString; }
    042.set
    043.{
    044.if (_addEditButtonString != value)
    045.{
    046._addEditButtonString = value;
    047.NotifyChanged("AddEditButtonString");
    048.}
    049.}
    050.}
    051.
    052.public SubscriptionToken addJobToken = new SubscriptionToken();
    053.public SubscriptionToken editJobToken = new SubscriptionToken();
    054.
    055.public DelegateCommand<object> AddEditCommand { get; set; }
    056.public DelegateCommand<object> CancelCommand { get; set; }
    057.
    058.private ObservableCollection<ValidationError> _errors = new ObservableCollection<ValidationError>();
    059.public ObservableCollection<ValidationError> Errors
    060.{
    061.get { return _errors; }
    062.}
    063.
    064.private ObservableCollection<ValidationResult> _valResults = new ObservableCollection<ValidationResult>();
    065.public ObservableCollection<ValidationResult> ValResults
    066.{
    067.get { return this._valResults; }
    068.}
    069.
    070.public AddEditJobViewModel(IEventAggregator eventAgg, IRegionManager regionmanager)
    071.{
    072.// set Unity items
    073.this.eventAggregator = eventAgg;
    074.this.regionManager = regionmanager;
    075.
    076.context = new RecruitingContext();
    077.
    078.AddEditCommand = new DelegateCommand<object>(this.AddEditJobCommand);
    079.CancelCommand = new DelegateCommand<object>(this.CancelAddEditCommand);
    080.
    081.SetSubscriptions();
    082.}
    083.
    084.#region Subscription Declaration and Events
    085.
    086.public void SetSubscriptions()
    087.{
    088.AddJobEvent addJob = this.eventAggregator.GetEvent<AddJobEvent>();
    089.
    090.if (addJobToken != null)
    091.addJob.Unsubscribe(addJobToken);
    092.
    093.addJobToken = addJob.Subscribe(this.AddJobEventHandler);
    094.
    095.EditJobEvent editJob = this.eventAggregator.GetEvent<EditJobEvent>();
    096.
    097.if (editJobToken != null)
    098.editJob.Unsubscribe(editJobToken);
    099.
    100.editJobToken = editJob.Subscribe(this.EditJobEventHandler);
    101.}
    102.
    103.public void AddJobEventHandler(bool isNew)
    104.{
    105.this.activeJob = null;
    106.this.activeJob = new JobPosting();
    107.this.activeJob.IsActive = true; // We assume that we want a new job to go up immediately
    108.this.isNewJob = true;
    109.this.AddEditString = "Add New Job Posting";
    110.this.AddEditButtonString = "Add Job";
    111.
    112.MakeMeActive(this.regionManager, "MainRegion", "AddEditJobView");
    113.}
    114.
    115.public void EditJobEventHandler(JobPosting editJob)
    116.{
    117.this.activeJob = null;
    118.this.activeJob = editJob;
    119.this.isNewJob = false;
    120.this.AddEditString = "Edit Job Posting";
    121.this.AddEditButtonString = "Edit Job";
    122.
    123.MakeMeActive(this.regionManager, "MainRegion", "AddEditJobView");
    124.}
    125.
    126.#endregion
    127.
    128.#region DelegateCommands from View
    129.
    130.public void AddEditJobCommand(object obj)
    131.{
    132.if (this.Errors.Count > 0)
    133.{
    134.List<string> errorMessages = new List<string>();
    135.
    136.foreach (var valR in this.Errors)
    137.{
    138.errorMessages.Add(valR.Exception.Message);
    139.}
    140.
    141.this.eventAggregator.GetEvent<DisplayValidationErrorsEvent>().Publish(errorMessages);
    142.
    143.}
    144.else if (!Validator.TryValidateObject(this.activeJob, new ValidationContext(this.activeJob, null, null), _valResults, true))
    145.{
    146.List<string> errorMessages = new List<string>();
    147.
    148.foreach (var valR in this._valResults)
    149.{
    150.errorMessages.Add(valR.ErrorMessage);
    151.}
    152.
    153.this._valResults.Clear();
    154.
    155.this.eventAggregator.GetEvent<DisplayValidationErrorsEvent>().Publish(errorMessages);
    156.}
    157.else
    158.{
    159.if (this.isNewJob)
    160.{
    161.this.eventAggregator.GetEvent<AddJobCompleteEvent>().Publish(this.activeJob);
    162.}
    163.else
    164.{
    165.this.eventAggregator.GetEvent<EditJobCompleteEvent>().Publish(true);
    166.}
    167.}
    168.}
    169.
    170.public void CancelAddEditCommand(object obj)
    171.{
    172.if (this.isNewJob)
    173.{
    174.this.eventAggregator.GetEvent<AddJobCompleteEvent>().Publish(null);
    175.}
    176.else
    177.{
    178.this.eventAggregator.GetEvent<EditJobCompleteEvent>().Publish(false);
    179.}
    180.}
    181.
    182.#endregion
    183.}

    We start seeing something new on line 103: the AddJobEventHandler creates a new job and sets it as the activeJob item on the viewmodel. When this is all set, the view calls that familiar MakeMeActive method to activate itself. I made a bit of a management call on making views self-activate like this, but I figured it works for one reason: as I create this application, views may not exist that I have in mind, so after a view receives its 'ping' from being subscribed to an event, it prepares whatever it needs to do and then goes active. This way, if I don't have 'edit' hooked up, I can click as the day is long on the main view and won't get lost in an empty region. Total personal preference here. :)

    Everything else should again be pretty straightforward, although I do a bit of validation checking in AddEditJobCommand, which can either fire off an event back to the main view/viewmodel if everything is a success, or send a list of errors to our notification module, which pops open a RadWindow with the alerts if any exist. As a bonus side note, here's what my WCF RIA Services metadata looks like for handling all of the validation:

    private JobPostingMetadata() { }

    [StringLength(2500, ErrorMessage = "Description should be more than one and less than 2500 characters.", MinimumLength = 1)]
    [Required(ErrorMessage = "Description is required.")]
    public string Description;

    [Required(ErrorMessage = "Active Status is Required")]
    public bool IsActive;

    [StringLength(100, ErrorMessage = "Posting title must be more than 3 but less than 100 characters.", MinimumLength = 3)]
    [Required(ErrorMessage = "Job Title is required.")]
    public string JobTitle;

    [Required]
    public string Location;

    public bool NeedsCV;
    public bool NeedsOverview;
    public bool NeedsResume;
    public int PostingID;

    [Required(ErrorMessage = "Qualifications are required.")]
    [StringLength(2500, ErrorMessage = "Qualifications should be more than one and less than 2500 characters.", MinimumLength = 1)]
    public string Qualifications;

    [StringLength(2500, ErrorMessage = "Requirements should be more than one and less than 2500 characters.", MinimumLength = 1)]
    [Required(ErrorMessage = "Requirements are required.")]
    public string Requirements;

    The RecruitCB Alternative

    See all that XAML I pasted above? Those are now two pieces sitting in the JobsView.xaml file. The only real difference is that the xEditGrid now sits in the same place as xJobsGrid, with visibility swapping between the two for a quick switch. I also took out all the cal: and command: command references, replaced the Button commands with Click events, and replaced the Grid selection command with a SelectedItemChanged event. Also, at the bottom of the xEditGrid, after the last button, I added a ValidationSummary (with Visibility=Collapsed) to catch any errors that pop up.
    Simple as can be, and it leads to this being the single code-behind file:

    001.public partial class JobsView : UserControl
    002.{
    003.public RecruitingContext context;
    004.public JobPosting activeJob;
    005.public bool isNew;
    006.private ObservableCollection<ValidationResult> _valResults = new ObservableCollection<ValidationResult>();
    007.public ObservableCollection<ValidationResult> ValResults
    008.{
    009.get { return this._valResults; }
    010.}
    011.public JobsView()
    012.{
    013.InitializeComponent();
    014.this.Loaded += new RoutedEventHandler(JobsView_Loaded);
    015.}
    016.void JobsView_Loaded(object sender, RoutedEventArgs e)
    017.{
    018.context = new RecruitingContext();
    019.xJobsGrid.ItemsSource = context.JobPostings;
    020.context.Load(context.GetJobPostingsQuery());
    021.}
    022.private void xAddRecordButton_Click(object sender, RoutedEventArgs e)
    023.{
    024.activeJob = new JobPosting();
    025.isNew = true;
    026.xAddEditTitle.Text = "Add a Job Posting";
    027.xAddEditButton.Content = "Add";
    028.xEditGrid.DataContext = activeJob;
    029.HideJobsGrid();
    030.}
    031.private void xEditRecordButton_Click(object sender, RoutedEventArgs e)
    032.{
    033.activeJob = xJobsGrid.SelectedItem as JobPosting;
    034.isNew = false;
    035.xAddEditTitle.Text = "Edit a Job Posting";
    036.xAddEditButton.Content = "Edit";
    037.xEditGrid.DataContext = activeJob;
    038.HideJobsGrid();
    039.}
    040.private void xAddEditButton_Click(object sender, RoutedEventArgs e)
    041.{
    042.if (!Validator.TryValidateObject(this.activeJob, new ValidationContext(this.activeJob, null, null), _valResults, true))
    043.{
    044.List<string> errorMessages = new List<string>();
    045.foreach (var valR in this._valResults)
    046.{
    047.errorMessages.Add(valR.ErrorMessage);
    048.}
    049.this._valResults.Clear();
    050.ShowErrors(errorMessages);
    051.}
    052.else if (xSummary.Errors.Count > 0)
    053.{
    054.List<string> errorMessages = new List<string>();
    055.foreach (var err in xSummary.Errors)
    056.{
    057.errorMessages.Add(err.Message);
    058.}
    059.ShowErrors(errorMessages);
    060.}
    061.else
    062.{
    063.if (this.isNew)
    064.{
    065.context.JobPostings.Add(activeJob);
    066.context.SubmitChanges((s) =>
    067.{
    068.ActionHistory thisAction = new ActionHistory();
    069.thisAction.PostingID = activeJob.PostingID;
    070.thisAction.Description = String.Format("Job '{0}' has been added by {1}", activeJob.JobTitle, "default user");
    071.thisAction.TimeStamp = DateTime.Now;
    072.context.ActionHistories.Add(thisAction);
    073.context.SubmitChanges();
    074.}, null);
    075.}
    076.else
    077.{
    078.context.SubmitChanges((s) =>
    079.{
    080.ActionHistory thisAction = new ActionHistory();
    081.thisAction.PostingID = activeJob.PostingID;
    082.thisAction.Description = String.Format("Job '{0}' has been edited by {1}", activeJob.JobTitle, "default user");
    083.thisAction.TimeStamp = DateTime.Now;
    084.context.ActionHistories.Add(thisAction);
    085.context.SubmitChanges();
    086.}, null);
    087.}
    088.ShowJobsGrid();
    089.}
    090.}
    091.private void xCancelButton_Click(object sender, RoutedEventArgs e)
    092.{
    093.ShowJobsGrid();
    094.}
    095.private void ShowJobsGrid()
    096.{
    097.xAddEditRecordButtonPanel.Visibility = Visibility.Visible;
    098.xEditGrid.Visibility = Visibility.Collapsed;
    099.xJobsGrid.Visibility = Visibility.Visible;
    100.}
    101.private void HideJobsGrid()
    102.{
    103.xAddEditRecordButtonPanel.Visibility = Visibility.Collapsed;
    104.xJobsGrid.Visibility = Visibility.Collapsed;
    105.xEditGrid.Visibility = Visibility.Visible;
    106.}
    107.private void ShowErrors(List<string> errorList)
    108.{
    109.string nm = "Errors received: \n";
    110.foreach (string anerror in errorList)
    111.nm += anerror + "\n";
    112.RadWindow.Alert(nm);
    113.}
    114.}

    The first 39 lines should be pretty familiar; we're not doing anything too unorthodox to get this up and running. Once we hit xAddEditButton_Click on line 40, we're still doing pretty much the same things, except instead of checking the ValidationHelper errors, we both run a check on the current activeJob object and check the ValidationSummary errors list. Once that is done, we again use the callback of context.SubmitChanges (lines 66 and 78) to create an ActionHistory which we will use to track these items down the line.

    That's all? Essentially... yes. If you look back through this post, most of the code and adventures we have taken were just to get things working in the MVVM/Prism setup. Since I have the whole 'module' self-contained in a single JobsView+code-behind setup, I don't have to worry about things like sending events off into space for someone to pick up, communicating through an Infrastructure project, or even re-inventing events to be used with attached behaviors. Everything just kinda works, and again with much less code. Here's a picture of the MVVM and code-behind versions of the Jobs and AddEdit views, but since the functionality is the same in both apps, you still cannot tell them apart:

    Looking ahead, the Applicants module is effectively the same thing as the Jobs module, so most of the code is being cut-and-pasted back and forth with minor tweaks here and there; that one is being taken care of by me behind the scenes. Next time, we get into a new world of fun: the interview scheduling module, which will pull from available jobs and applicants for each interview being scheduled, tying everything together, with RadScheduler to the rescue.


  • Extending Oracle CEP with Predictive Analytics

    - by vikram.shukla(at)oracle.com
    Introduction: OCEP is often used as a business rules engine to execute a set of business logic rules via CQL statements and take decisions based on the outcome of those rules. There are times when configuring rules manually is sufficient, because an application needs to deal with only a small and well-defined set of static rules. However, in many situations customers don't want to pre-define such rules, for two reasons. First, they are dealing with events with lots of columns, and manually crafting such rules for each column, or a set of columns and combinations thereof, is almost impossible. Second, they are content with probabilistic outcomes and do not care about 100% precision. The former is the case when a user is dealing with data with high dimensionality, the latter when an application can live with "false" positives, as they can be discarded after further inspection, say by a Human Task component in Business Process Management software.

    The primary goal of this blog post is to show how this can be achieved by combining OCEP with Oracle Data Mining® and leveraging the latter's rich set of algorithms and functionality to do predictive analytics in real time on streaming events. The secondary goal of this post is to show how OCEP can be extended to invoke any arbitrary external computation in an RDBMS from within CEP. This extensibility facility is known as the JDBC cartridge. The rest of the post describes the steps required to achieve this.

    We use the dataset available at http://blogs.oracle.com/datamining/2010/01/fraud_and_anomaly_detection_made_simple.html to showcase the capabilities. We use it to show how transaction anomalies or fraud can be detected.

    Building the model: Follow the self-explanatory steps described at the above URL to build the model. It is very simple: it uses built-in Oracle Data Mining PL/SQL packages to cleanse, normalize and build the model out of the dataset. You can also use the graphical Oracle Data Miner® to build the models. To summarize, it involves (a sketch of the model-creation call appears below):
    - Specifying which algorithms to use. In this case we use Support Vector Machines, as we're trying to find anomalies in a highly dimensional dataset.
    - Building the model on the data in the table for the algorithms specified. For this example, the table was populated in the scott/tiger schema with appropriate privileges.
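    For readers who don't want to click through, the gist of that model build is a single call to the DBMS_DATA_MINING PL/SQL package. This is only a sketch: the model name CLAIMSMODEL and case id POLICYNUMBER follow the linked post's dataset, while the table name here is assumed; a NULL target turns the build into one-class SVM anomaly detection:

    BEGIN
      DBMS_DATA_MINING.CREATE_MODEL(
        model_name          => 'CLAIMSMODEL',      -- referenced later by PREDICTION_PROBABILITY
        mining_function     => DBMS_DATA_MINING.CLASSIFICATION,
        data_table_name     => 'CLAIMS',           -- hypothetical name for the fraud dataset table
        case_id_column_name => 'POLICYNUMBER',
        target_column_name  => NULL);              -- NULL target = anomaly detection (one-class SVM)
    END;
    /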
    Configuring the Data Source: This is the first step in building a CEP application using such an integration. Our datasource looks as follows in the server config file. It is advisable to use the Visualizer to add it to the running server dynamically, rather than manually editing the file.

    <data-source>
        <name>DataMining</name>
        <data-source-params>
            <jndi-names>
                <element>DataMining</element>
            </jndi-names>
            <global-transactions-protocol>OnePhaseCommit</global-transactions-protocol>
        </data-source-params>
        <connection-pool-params>
            <credential-mapping-enabled></credential-mapping-enabled>
            <test-table-name>SQL SELECT 1 from DUAL</test-table-name>
            <initial-capacity>1</initial-capacity>
            <max-capacity>15</max-capacity>
            <capacity-increment>1</capacity-increment>
        </connection-pool-params>
        <driver-params>
            <use-xa-data-source-interface>true</use-xa-data-source-interface>
            <driver-name>oracle.jdbc.OracleDriver</driver-name>
            <url>jdbc:oracle:thin:@localhost:1522:orcl</url>
            <properties>
                <element>
                    <value>scott</value>
                    <name>user</name>
                </element>
                <element>
                    <value>{Salted-3DES}AzFE5dDbO2g=</value>
                    <name>password</name>
                </element>
                <element>
                    <name>com.bea.core.datasource.serviceName</name>
                    <value>oracle11.2g</value>
                </element>
                <element>
                    <name>com.bea.core.datasource.serviceVersion</name>
                    <value>11.2.0</value>
                </element>
                <element>
                    <name>com.bea.core.datasource.serviceObjectClass</name>
                    <value>java.sql.Driver</value>
                </element>
            </properties>
        </driver-params>
    </data-source>

    Designing the EPN: The EPN is very simple in this example. We briefly describe each of the components. The adapter ("DataMiningAdapter") reads data from a .csv file and sends it to the CQL processor downstream. The event payload here is the same as that of the table in the database (refer to the attached project, or do a "desc table-name" from a SQL*Plus prompt). While this is for convenience in this example, it need not be the case. One can still omit fields in the streaming events, and need not match all columns in the table on which the model was built. Better yet, it does not even need to have the same name as columns in the table, as long as you alias them in the USING clause of the mining function. (Caveat: they still need to draw values from a similar universe or domain, otherwise it constitutes incorrect usage of the model.) There are two things in the CQL processor ("DataMiningProc") that make scoring possible on streaming events.

    1. A user-defined cartridge function. Please refer to the OCEP CQL reference manual for more details about how to define such functions. We include the function below in its entirety for illustration.
    <?xml version="1.0" encoding="UTF-8"?>
    <jdbcctxconfig:config
        xmlns:jdbcctxconfig="http://www.bea.com/ns/wlevs/config/application"
        xmlns:jc="http://www.oracle.com/ns/ocep/config/jdbc">
        <jc:jdbc-ctx>
            <name>Oracle11gR2</name>
            <data-source>DataMining</data-source>
            <function name="prediction2">
                <param name="CQLMONTH" type="char"/>
                <param name="WEEKOFMONTH" type="int"/>
                <param name="DAYOFWEEK" type="char" />
                <param name="MAKE" type="char" />
                <param name="ACCIDENTAREA" type="char" />
                <param name="DAYOFWEEKCLAIMED" type="char" />
                <param name="MONTHCLAIMED" type="char" />
                <param name="WEEKOFMONTHCLAIMED" type="int" />
                <param name="SEX" type="char" />
                <param name="MARITALSTATUS" type="char" />
                <param name="AGE" type="int" />
                <param name="FAULT" type="char" />
                <param name="POLICYTYPE" type="char" />
                <param name="VEHICLECATEGORY" type="char" />
                <param name="VEHICLEPRICE" type="char" />
                <param name="FRAUDFOUND" type="int" />
                <param name="POLICYNUMBER" type="int" />
                <param name="REPNUMBER" type="int" />
                <param name="DEDUCTIBLE" type="int" />
                <param name="DRIVERRATING" type="int" />
                <param name="DAYSPOLICYACCIDENT" type="char" />
                <param name="DAYSPOLICYCLAIM" type="char" />
                <param name="PASTNUMOFCLAIMS" type="char" />
                <param name="AGEOFVEHICLES" type="char" />
                <param name="AGEOFPOLICYHOLDER" type="char" />
                <param name="POLICEREPORTFILED" type="char" />
                <param name="WITNESSPRESNT" type="char" />
                <param name="AGENTTYPE" type="char" />
                <param name="NUMOFSUPP" type="char" />
                <param name="ADDRCHGCLAIM" type="char" />
                <param name="NUMOFCARS" type="char" />
                <param name="CQLYEAR" type="int" />
                <param name="BASEPOLICY" type="char" />
                <return-component-type>char</return-component-type>
                <sql><![CDATA[
                SELECT to_char(PREDICTION_PROBABILITY(CLAIMSMODEL, '0' USING *))
                       AS probability
                FROM (SELECT :CQLMONTH AS MONTH,
                             :WEEKOFMONTH AS WEEKOFMONTH,
                             :DAYOFWEEK AS DAYOFWEEK,
                             :MAKE AS MAKE,
                             :ACCIDENTAREA AS ACCIDENTAREA,
                             :DAYOFWEEKCLAIMED AS DAYOFWEEKCLAIMED,
                             :MONTHCLAIMED AS MONTHCLAIMED,
                             :WEEKOFMONTHCLAIMED AS WEEKOFMONTHCLAIMED,
                             :SEX AS SEX,
                             :MARITALSTATUS AS MARITALSTATUS,
                             :AGE AS AGE,
                             :FAULT AS FAULT,
                             :POLICYTYPE AS POLICYTYPE,
                             :VEHICLECATEGORY AS VEHICLECATEGORY,
                             :VEHICLEPRICE AS VEHICLEPRICE,
                             :FRAUDFOUND AS FRAUDFOUND,
                             :POLICYNUMBER AS POLICYNUMBER,
                             :REPNUMBER AS REPNUMBER,
                             :DEDUCTIBLE AS DEDUCTIBLE,
                             :DRIVERRATING AS DRIVERRATING,
                             :DAYSPOLICYACCIDENT AS DAYSPOLICYACCIDENT,
                             :DAYSPOLICYCLAIM AS DAYSPOLICYCLAIM,
                             :PASTNUMOFCLAIMS AS PASTNUMOFCLAIMS,
                             :AGEOFVEHICLES AS AGEOFVEHICLES,
                             :AGEOFPOLICYHOLDER AS AGEOFPOLICYHOLDER,
                             :POLICEREPORTFILED AS POLICEREPORTFILED,
                             :WITNESSPRESNT AS WITNESSPRESENT,
                             :AGENTTYPE AS AGENTTYPE,
                             :NUMOFSUPP AS NUMOFSUPP,
                             :ADDRCHGCLAIM AS ADDRCHGCLAIM,
                             :NUMOFCARS AS NUMOFCARS,
                             :CQLYEAR AS YEAR,
                             :BASEPOLICY AS BASEPOLICY
                      FROM dual)
                ]]>
            </sql>
        </function>
        </jc:jdbc-ctx>
    </jdbcctxconfig:config>

    2. Invoking the function for each event. Once this function is defined, you can invoke it from CQL as follows:

    <?xml version="1.0" encoding="UTF-8"?>
    <wlevs:config xmlns:wlevs="http://www.bea.com/ns/wlevs/config/application">
        <processor>
            <name>DataMiningProc</name>
            <rules>
                <query id="q1"><![CDATA[
                    ISTREAM(SELECT S.CQLMONTH,
                                   S.WEEKOFMONTH,
                                   S.DAYOFWEEK, S.MAKE,
                                   ...
                                   S.BASEPOLICY,
                                   C.F AS probability
                            FROM StreamDataChannel [NOW] AS S,
                                 TABLE(prediction2@Oracle11gR2(S.CQLMONTH,
                                       S.WEEKOFMONTH,
                                       S.DAYOFWEEK,
                                       S.MAKE, ...,
                                       S.BASEPOLICY) AS F of char) AS C)
                ]]></query>
            </rules>
        </processor>
    </wlevs:config>

    Finally, the last stage in the EPN prints out the probability of the event being an anomaly. One can also define a threshold in CQL to filter out events that are normal, i.e., below a certain mark, as defined by the analyst or designer.
    Sample Runs: Now let's see how this behaves when events are streamed through CEP. We use only two events for brevity, one normal and one not. This is one of the "normal"-looking events, and the probability of it being anomalous is less than 60%:

    Event is: eventType=DataMiningOutEvent object=q1 time=2904821976256 S.CQLMONTH=Dec, S.WEEKOFMONTH=5, S.DAYOFWEEK=Wednesday, S.MAKE=Honda, S.ACCIDENTAREA=Urban, S.DAYOFWEEKCLAIMED=Tuesday, S.MONTHCLAIMED=Jan, S.WEEKOFMONTHCLAIMED=1, S.SEX=Female, S.MARITALSTATUS=Single, S.AGE=21, S.FAULT=Policy Holder, S.POLICYTYPE=Sport - Liability, S.VEHICLECATEGORY=Sport, S.VEHICLEPRICE=more than 69000, S.FRAUDFOUND=0, S.POLICYNUMBER=1, S.REPNUMBER=12, S.DEDUCTIBLE=300, S.DRIVERRATING=1, S.DAYSPOLICYACCIDENT=more than 30, S.DAYSPOLICYCLAIM=more than 30, S.PASTNUMOFCLAIMS=none, S.AGEOFVEHICLES=3 years, S.AGEOFPOLICYHOLDER=26 to 30, S.POLICEREPORTFILED=No, S.WITNESSPRESENT=No, S.AGENTTYPE=External, S.NUMOFSUPP=none, S.ADDRCHGCLAIM=1 year, S.NUMOFCARS=3 to 4, S.CQLYEAR=1994, S.BASEPOLICY=Liability, probability=.58931702982118561 isTotalOrderGuarantee=true
    Anamoly probability: .58931702982118561

    However, the following event is scored as an anomaly with a very high probability, 89%, so there is likely to be something wrong with it. A close look reveals that the value of the "deductible" field (10000) is not "normal". What exactly constitutes normal here? If you run a query on the database to find ALL distinct values of the "deductible" field, it returns the following set: {300, 400, 500, 700}.

    Event is: eventType=DataMiningOutEvent object=q1 time=2598483773496 S.CQLMONTH=Dec, S.WEEKOFMONTH=5, S.DAYOFWEEK=Wednesday, S.MAKE=Honda, S.ACCIDENTAREA=Urban, S.DAYOFWEEKCLAIMED=Tuesday, S.MONTHCLAIMED=Jan, S.WEEKOFMONTHCLAIMED=1, S.SEX=Female, S.MARITALSTATUS=Single, S.AGE=21, S.FAULT=Policy Holder, S.POLICYTYPE=Sport - Liability, S.VEHICLECATEGORY=Sport, S.VEHICLEPRICE=more than 69000, S.FRAUDFOUND=0, S.POLICYNUMBER=1, S.REPNUMBER=12, S.DEDUCTIBLE=10000, S.DRIVERRATING=1, S.DAYSPOLICYACCIDENT=more than 30, S.DAYSPOLICYCLAIM=more than 30, S.PASTNUMOFCLAIMS=none, S.AGEOFVEHICLES=3 years, S.AGEOFPOLICYHOLDER=26 to 30, S.POLICEREPORTFILED=No, S.WITNESSPRESENT=No, S.AGENTTYPE=External, S.NUMOFSUPP=none, S.ADDRCHGCLAIM=1 year, S.NUMOFCARS=3 to 4, S.CQLYEAR=1994, S.BASEPOLICY=Liability, probability=.89171554529576691 isTotalOrderGuarantee=true
    Anamoly probability: .89171554529576691

    Conclusion: By way of this example, we show:
    - real-time scoring of events as they flow through CEP, leveraging Oracle Data Mining;
    - how CEP applications can invoke complex arbitrary external computations (function shipping) in an RDBMS.


  • How do I fix a custom Event Viewer Log that merges automatically with the Application log?

    - by NightOwl888
    I am trying to create a custom event log for a Windows Service on Windows Server 2003. I would like to name the custom log "(ML) Startup Commands". However, when I add a registry key with that name to HKLM\SYSTEM\CurrentControlSet\Services\Eventlog\, it adds a log but shows the exact same events that are in the Application log when looking in the Event Viewer. If I add a registry key with the name "(ML) Startup Commands 2" to the event log, it shows a blank event log as expected. In fact, any other name will work correctly except for the one I want. I have searched through the registry for other keys with the string "(ML)" and removed all other references to this key name, yet I continue to get merged results in the viewer when I create a key with this name. My question is: how can I fix the server so I can create a custom event log with this name that shows only the events from my application, not the events from the default Application event log that is installed with Windows?

    Update: I rebooted the server and, wouldn't you know it, the log started acting normally. I got a strange error message in the Application log:

    The EventSystem sub system is suppressing duplicate event log entries for a duration of 86400 seconds. The suppression timeout can be controlled by a REG_DWORD value named SuppressDuplicateDuration under the following registry key: HKLM\Software\Microsoft\EventSystem\EventLog. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

    I can only hope this error doesn't mean the problem will come back after 86400 seconds. I guess I will have to wait and see.
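    As an aside, the same registry entries can be created through the .NET API rather than by hand; a minimal sketch, where the source name is invented and only the log name comes from the question:

    using System.Diagnostics;

    class CreateCustomLog
    {
        static void Main()
        {
            // Creates HKLM\SYSTEM\CurrentControlSet\Services\Eventlog\(ML) Startup Commands
            // plus a source registered under it, in one call.
            if (!EventLog.SourceExists("MLStartupService"))
            {
                EventLog.CreateEventSource(
                    new EventSourceCreationData("MLStartupService", "(ML) Startup Commands"));
            }

            EventLog.WriteEntry("MLStartupService", "Custom log test.", EventLogEntryType.Information);
        }
    }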


  • IIS Digest repeatedly asking for authentication

    - by David Budiac
    I have a development copy of an ASP.NET intranet site checked out and running on my local machine. We're using Digest authentication to allow users to log in with their Active Directory accounts. On my development copy only, Digest will sometimes repeatedly prompt for login information, usually ~9 times per page request. After repeatedly logging in (it also works to cancel 8 of the 9 prompts), I can use the site as normal. I cannot pinpoint what is triggering the issue. Sometimes the problem triggers on the next page request, sometimes after I edit/save/refresh a page, and sometimes it doesn't happen at all. Each prompt triggers several logon security events (Event ID 4624 & 4672) in the Event Viewer. Shortly after each burst of logon events, I'll see a burst of logoff events. A co-worker who has a nearly identical setup (Windows 7, IIS 7) is not experiencing the issue. Our production copy (running on a different server) also does not experience the issue. We've compared our settings in IIS and haven't found any differences. I'm using Chrome, but I've experienced the issue in other browsers.


  • Drupal 6 node_view empty

    - by kristian nissen
    I'm trying to produce a page with a list of specific nodes, but node_view returns an empty string. This is my query:

    function events_upcoming() {
      $output = '';
      $has_events = false;
      $res = pager_query(db_rewrite_sql("SELECT n.nid, n.created FROM {node} n WHERE n.type = 'events' AND n.status = 1 ORDER BY n.sticky DESC, n.created DESC"), variable_get('default_nodes_main', 10));
      while ($n = db_fetch_object($res)) {
        $output .= node_view(node_load($n->nid), 1);
        $has_events = true;
      }
      if ($has_events) {
        $output .= theme('pager', NULL, variable_get('default_nodes_main', 10));
      }
      return $output;
    }

    hook_menu (part of):

    'events/upcoming' => array(
      'title' => t('Upcoming Events'),
      'page callback' => 'events_upcoming',
      'access arguments' => array('access content'),
      'type' => MENU_SUGGESTED_ITEM
    ),

    The implementation of hook_view:

    function events_view($node, $teaser = false, $page = false) {
      $node = node_prepare($node, $teaser);
      if ($page) {
        // TODO: Handle breadcrumb
      }
      return $node;
    }

    Now, if I add a var_dump($node) inside events_view, the node is present and I can see the values I want; and if I add a var_dump inside the while loop in events_upcoming, I also get a node id from the query. The strange thing is, when I load localhost/events/upcoming I see the pager and nothing else. I have used blog.module as a reference, but what am I missing here?

    Read the article

  • Developer Dashboard in SharePoint 2010

    - by jcortez
    Introducing the Developer Dashboard

    As a SharePoint developer (or IT professional), how many times have you had the pleasure of figuring out why a particular page on your site is taking too long to render? I'm sure one of the techniques you have employed in troubleshooting is the process of elimination: removing individual web parts from the page, hoping to identify which web part is misbehaving. One of the new features of SharePoint 2010 is the Developer Dashboard. This dashboard provides tracing and performance information that can be useful when you are trying to troubleshoot pages that load too slowly. The Developer Dashboard is turned off by default, and I'll go over three different ways to display it. Here is a screenshot of what the Developer Dashboard looks like when displayed at the bottom of the page.

    On the left side you can see the different events that fired during the page processing pipeline and how long each took. This is where you will see individual web parts being processed and how long each took to complete (the kind of processing obviously depends on what the web part does). On the right side you see the different database calls issued through the SharePoint Object Model to process the page. Each of these database queries is actually a hyperlink; clicking one displays a pop-up window that shows the actual SQL query text, the call stack that triggered the database call, and the IO statistics of that query.

    Enabling the Developer Dashboard

    Option 1: Managed Code (sketched below)

    The Developer Dashboard is a farm-wide setting, and the code that changes it won't work if it is used within a web part hosted on any non-Central Admin site. The SPDeveloperDashboardLevel enum has three possible values: On, Off, and OnDemand. Setting it to On will always display the Developer Dashboard at the bottom of the page. Setting it to Off will hide the Developer Dashboard. Setting it to OnDemand will add an icon at the top right corner of the page where a Site Collection Admin can toggle the display of the Developer Dashboard for a particular site collection. In my opinion, OnDemand is the best setting when troubleshooting a page or during development, since a Site Collection Admin can turn it on or off for a particular site only. The first cool thing about this is that the Site Collection Admin who turned it on will be the only one to see the Developer Dashboard output. Everyday users won't see the Developer Dashboard output even if it was turned on by a Site Collection Admin. If you need more flexibility over who gets to see the Developer Dashboard output, you can set SPDeveloperDashboardSettings.RequiredPermissions to control which group of users will have permission to see the output.

    Option 2: Using stsadm

    Using stsadm, you can run the following command to configure the Developer Dashboard:

    STSADM -o setproperty -pn developer-dashboard -pv OnDemand

    To successfully execute this command, be sure that you are running as a Farm Admin.

    Option 3: Using PowerShell

    For all scripts in SharePoint 2010, I prefer writing them as PowerShell scripts. Though the stsadm command is less verbose, the PowerShell equivalent (also sketched below) is pretty straightforward and uses the SharePoint Object Model. You can of course parameterize the value that gets assigned to the DisplayLevel property so you can turn it On, Off or OnDemand depending on the parameter.
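    The following snippets for Options 1 and 3 are reconstructions based on the documented SharePoint 2010 API (SPWebService.ContentService and SPDeveloperDashboardSettings), not necessarily the author's exact code. Option 1, the managed-code version:

    using Microsoft.SharePoint.Administration;

    // Farm-wide setting; run from a context with farm-admin rights
    // (e.g. a console app or Central Admin), not an ordinary web part.
    SPDeveloperDashboardSettings settings =
        SPWebService.ContentService.DeveloperDashboardSettings;
    settings.DisplayLevel = SPDeveloperDashboardLevel.OnDemand;
    settings.Update();

    And a sketch of the Option 3 PowerShell equivalent, using the same objects:

    $svc = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
    $svc.DeveloperDashboardSettings.DisplayLevel = [Microsoft.SharePoint.Administration.SPDeveloperDashboardLevel]::OnDemand
    $svc.DeveloperDashboardSettings.Update()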
    Events and the Developer Dashboard

    Now, don't assume that all the code inside your web part or page will show up in the Developer Dashboard complete with all the great troubleshooting information. Only a finite set of events is monitored by default (for a web part, that means the events in the base web part class). Let's say you have a click event that could take some time, for example a web service call, and you want to include troubleshooting information for this event in the Developer Dashboard. Enter SPMonitoredScope, which is also a new feature in SharePoint 2010. In SharePoint 2010, everything is executed within a "monitored scope", and each scope has a set of "monitors" that measure and count calls and timings, which appear in the Developer Dashboard. You can get your custom code included in the Developer Dashboard by wrapping it inside a new monitored scope, as sketched below. Wrapping your code this way includes the new scope "My long web service call" in the Developer Dashboard and logs the time it took to complete processing. In my opinion, wrapping your custom code in an SPMonitoredScope is a SharePoint development best practice, since it gives you visibility and a better understanding of the performance of your components.
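    A minimal sketch of the SPMonitoredScope wrapper described above (the web service call inside the scope is a placeholder):

    using Microsoft.SharePoint.Utilities;

    using (new SPMonitoredScope("My long web service call"))
    {
        // The slow operation being timed; this call is hypothetical.
        var result = myServiceProxy.GetData();
    }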

    Read the article

  • Non-DOM Element Event Binding with jQuery

    - by Rick Strahl
    Yesterday I had a short discussion with Dave Reed on Twitter regarding setting up fake 'events' on objects that are hookable. jQuery makes it real easy to bind events on DOM elements, and with a little bit of extra work (that I didn't know about) you can also set up bindings to non-DOM element 'events'. Assume for a second that you have a simple JavaScript object like this:

    var item = {
        sku: "wwhelp",
        foo: function() { alert('original foo function'); }
    };

    and you want to be notified when the foo function is called. You can use jQuery to bind the handler like this:

    $(item).bind("foo", function () { alert('foo hook called'); });

    Binding alone won't actually cause the handler to be triggered, so when you call:

    item.foo();

    you only get the 'original' message. In order to fire both the original handler and the bound event hook you have to use the .trigger() function:

    $(item).trigger("foo");

    Now if you do the following complete sequence:

    var item = {
        sku: "wwhelp",
        foo: function() { alert('original foo function'); }
    };

    $(item).bind("foo", function () { alert('foo hook called'); });
    $(item).trigger("foo");

    you'll see the 'hook' message first, followed by the 'original' message, fired in succession. In other words, using this mechanism you can hook standard object functions and chain events to them in a way similar to what you can do with DOM elements. The main difference is that the 'event' has to be explicitly triggered for this to happen, rather than just calling the method directly. .trigger() relies on some internal logic that checks for event bindings on the object (attached via an expando property), which .trigger() searches for in its bound event list. Once the 'event' is found, it's called prior to execution of the original function. This is pretty useful, as it allows you to create standard JavaScript objects that can act as event handlers and are effectively hookable, without having to explicitly override event definitions with JavaScript function handlers. You get all the benefits of jQuery's event methods, including the ability to hook up multiple events to the same handler function and the ability to uniquely identify each specific event instance with postfixed string names (i.e. .bind("MyEvent.MyName") and .unbind("MyEvent.MyName") to bind MyEvent).

    Watch out for an .unbind() Bug

    Note that there appears to be a bug with .unbind() in jQuery that doesn't reliably unbind an event and results in an "elem.removeEventListener is not a function" error. The following code demonstrates:

    var item = {
        sku: "wwhelp",
        foo: function () { alert('original foo function'); }
    };

    $(item).bind("foo.first", function () { alert('foo hook called'); });
    $(item).bind("foo.second", function () { alert('foo hook2 called'); });

    $(item).trigger("foo");

    setTimeout(function () {
        $(item).unbind("foo");
        // $(item).unbind("foo.first");
        // $(item).unbind("foo.second");
        $(item).trigger("foo");
    }, 3000);

    The setTimeout call delays the unbinding and is supposed to remove the event binding on the foo function. It fails both with the foo-only value (whether assigned as "foo" or as "foo.first"/"foo.second") and when removing both of the postfixed event handlers explicitly. Oddly, the following, which removes only one of the two handlers, works:

    setTimeout(function () {
        // $(item).unbind("foo");
        $(item).unbind("foo.first");
        // $(item).unbind("foo.second");
        $(item).trigger("foo");
    }, 3000);

    This actually works, which is weird, as the code in unbind tries to unbind using a DOM method that doesn't exist.
    <shrug> A partial workaround for unbinding all 'foo' events is the following:

    setTimeout(function () {
        $.event.special.foo = {
            teardown: function () {
                alert('teardown');
                return true;
            }
        };
        $(item).unbind("foo");
        $(item).trigger("foo");
    }, 3000);

    which is a bit cryptic, to say the least, but it seems to work more reliably. I can't take credit for any of this; thanks to Dave Reed and Damien Edwards, who pointed out some of these behaviors. I didn't find any good descriptions of the process, so I thought it'd be good to write it down here. Hope some of you find this helpful.

    © Rick Strahl, West Wind Technologies, 2005-2010
    Posted in jQuery

    Read the article

  • Chester Devs Presentation and source code – ‘Event Store - an introduction to a DSD for event sourcing and notifications’

    - by Liam Westley
    Originally posted on: http://geekswithblogs.net/twickers/archive/2013/11/11/chester-devs-presentation-and-source-code-ndash-lsquoevent-store.aspx

    Thank you everyone at Chester Devs

    Thanks to Fran Hoey and all the people from Chester Devs. It was a hard drive up and back, but the enthusiasm of the audience, with some great questions, does make it worthwhile.

    Presentation and source code

    My presentation, source code, Event Store runners and text files containing the various command line parameters used for curl are now available on GitHub: https://github.com/westleyl/ChesterDevs-EventStore. Don't worry if you don't have a GitHub account; you don't need one, you can just click on the Download Zip button on the right-hand menu to download all the files as a single ZIP file. If all you want is the PowerPoint presentation, go to https://github.com/westleyl/ChesterDevs-EventStore/blob/master/Powerpoint/Huddle-EventStore.pptx and click on the View Raw button.

    Downloading and installing Event Store and tools

    Download Event Store from http://download.geteventstore.com (I unzipped these files into C:\EventStore\v2.0.1). Download curl from http://curl.haxx.se/download.html (I downloaded Win64 Generic (with SSL), version 7.31.0, and unzipped these files into C:\curl).

    Running the tools I used in my presentation

    Demonstration 1 (running Event Store)

    You can use one of my Event Store runner command files to run the single-node version of Event Store, using the default ports of 2213 for HTTP and 1113 for TCP, and with a wildcard HTTP pattern. Both take a single command line parameter to specify the location of the data and log files. The runners assume the single-node executable is located in C:\EventStore\v2.0.1, and will place data files and logs beneath C:\EventStore\Data, i.e.

    RunEventStore.cmd TestData1

    This will create data files in C:\EventStore\Data\TestData1\Data and log files in C:\EventStore\Data\TestData1\logs. When running Event Store, you may see the following message:

    [03288,15,06:23:00.622] Failed to start http server Access is denied

    You will either need to run Event Store in an administrator console window, or you can use the netsh command to create a firewall permission to allow HTTP listening (this needs to be run once, in an administrator console window):

    netsh http add urlacl url=http://*:2213/ user=liam

    You can always delete this later by running:

    netsh http delete urlacl url=http://*:2213/

    If you want to confirm that everything is running OK, open the management console in a browser by navigating to http://127.0.0.1:2213. If at any point you are asked for a user name and password, use the default of 'admin'/'changeit'.

    Demonstration 2 (reading and adding data, curl)

    In my second demonstration I used curl directly from the console to read streams, write events and then read back those events. On GitHub I have included a set of curl commands, CurlCommandLine.txt, and a sample data file, SampleData.json, to load an event into a DDDNorth3 stream. As there is not much data in the Event Store at this point, I used the $stats-127.0.0.1:2113 stream, which contains performance statistics for Event Store and is updated every 30 seconds (by default).

    Demonstration 3 (projections)

    On GitHub I have included a sample projection, Projection-ByRoom.txt, which will create streams based on the room in which a session was held on the DDDNorth3 agenda. Browse to the management console, http://127.0.0.1:2213.
    Click on Projections, New Projection, give it a name, Sessions-ByRoom, and copy in the JavaScript from the Projection-ByRoom.txt file. Select Continuous, tick Emit Enabled and then click on Post. It should run immediately. You may be challenged for the administration login for the management console; if so, use the default user name and password, 'admin'/'changeit'.

    Demonstration 4 (C# client)

    The final demonstration was the Visual Studio 2012 project using the Event Store client, referenced directly as C:\EventStore\v2.0.1\EventStore.ClientAPI.dll, although you can switch this to the latest Event Store client NuGet package. The source code provides a console app for viewing projections with the projection manager (HTTP connection), as well as containing a full set of data for the entire DDDNorth3 agenda. It also deals with the strategy for reading newest events backwards to older events and ignoring older events that have been superseded.

    Resources

    Event Store home page: http://www.geteventstore.com/
    Event Store source code on GitHub: https://github.com/eventstore/eventstore
    Event Store documentation on GitHub: https://github.com/eventstore/eventstore/wiki (includes an index to @RobAshton's blog series on Event Store at https://github.com/eventstore/eventstore/wiki#rob-ashton---projections-series)
    Event Store forum in Google Groups: https://groups.google.com/forum/?fromgroups#!forum/event-store
    TopShelf Windows service wrapper is available on GitHub: https://gist.github.com/trbngr/5083266
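    For a flavour of what Demonstration 2 looks like, here is a sketch of the kind of curl calls involved; the stream name, event type and GUID below are made up, and the exact commands the author used are in CurlCommandLine.txt:

    curl -i -H "Accept: application/json" "http://127.0.0.1:2213/streams/%24stats-127.0.0.1:2113"

    curl -i -d @SampleData.json "http://127.0.0.1:2213/streams/dddnorth3" -H "Content-Type: application/json" -H "ES-EventType: SessionCreated" -H "ES-EventId: a47bb2ba-6ed2-4a0e-92a2-8ca33b76d379"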

    Read the article

  • DDD North 3 Presentation and source code – ‘Event Store - an introduction to a DSD for event sourcing and notifications’

    - by Liam Westley
    Originally posted on: http://geekswithblogs.net/twickers/archive/2013/10/15/ddd-north-3-presentation-and-source-code-ndash-lsquoevent-store.aspx

    Thank you everyone at DDD North

    Thanks to all the people who helped organise the cracking conference that is DDD North 3, returning to Sunderland, the great facilities at the University of Sunderland, and the fine drinks reception at Sunderland Software City. The whole event wouldn't be possible without the sponsors, who ensured over 400 people were kept fed and watered so they could enjoy the impressive range of sessions. And lastly, a thank you to all those delegates who gave up their free time on a Saturday to spend a day dashing between lecture rooms, including a late change to my room which saw 40 people having to brave a journey between buildings in the fine drizzle. The enthusiasm from the delegates always helps recharge my geek batteries.

    Presentation and source code

    My presentation, source code, Event Store runners and text files containing the various command line parameters used for curl are now available on GitHub: https://github.com/westleyl/DDDNorth3-EventStore. Don't worry if you don't have a GitHub account; you don't need one, you can just click on the Download Zip button on the right-hand menu to download all the files as a single ZIP file. If all you want is the PowerPoint presentation, go to https://github.com/westleyl/DDDNorth3-EventStore/blob/master/Powerpoint/DDDNorth-EventStore.pptx and click on the View Raw button.

    Downloading and installing Event Store and tools

    Download Event Store from http://download.geteventstore.com (I unzipped these files into C:\EventStore\v2.0.1). Download curl from http://curl.haxx.se/download.html (I downloaded Win64 Generic (with SSL), version 7.31.0, and unzipped these files into C:\curl).

    Running the tools I used in my presentation

    Demonstration 1 (running Event Store)

    You can use one of my Event Store runner command files to run the single-node version of Event Store, using the default ports of 2213 for HTTP and 1113 for TCP, and with a wildcard HTTP pattern. Both take a single command line parameter to specify the location of the data and log files. The runners assume the single-node executable is located in C:\EventStore\v2.0.1, and will place data files and logs beneath C:\EventStore\Data, i.e.

    RunEventStore.cmd TestData1

    This will create data files in C:\EventStore\Data\TestData1\Data and log files in C:\EventStore\Data\TestData1\logs. When running Event Store, you may see the following message:

    [03288,15,06:23:00.622] Failed to start http server Access is denied

    You will either need to run Event Store in an administrator console window, or you can use the netsh command to create a firewall permission to allow HTTP listening (this needs to be run once, in an administrator console window):

    netsh http add urlacl url=http://*:2213/ user=liam

    You can always delete this later by running:

    netsh http delete urlacl url=http://*:2213/

    If you want to confirm that everything is running OK, open the management console in a browser by navigating to http://127.0.0.1:2213. If at any point you are asked for a user name and password, use the default of 'admin'/'changeit'.

    Demonstration 2 (reading and adding data, curl)

    In my second demonstration I used curl directly from the console to read streams, write events and then read back those events.
    On GitHub I have included a set of curl commands, CurlCommandLine.txt, and a sample data file, SampleData.json, to load an event into a DDDNorth3 stream. As there is not much data in the Event Store at this point, I used the $stats-127.0.0.1:2113 stream, which contains performance statistics for Event Store and is updated every 30 seconds (by default).

    Demonstration 3 (projections)

    On GitHub I have included a sample projection, Projection-ByRoom.txt, which will create streams based on the room in which a session was held on the DDDNorth3 agenda. Browse to the management console, http://127.0.0.1:2213. Click on Projections, New Projection, give it a name, Sessions-ByRoom, and copy in the JavaScript from the Projection-ByRoom.txt file. Select Continuous, tick Emit Enabled and then click on Post. It should run immediately. You may be challenged for the administration login for the management console; if so, use the default user name and password, 'admin'/'changeit'.

    Demonstration 4 (C# client)

    The final demonstration was the Visual Studio 2012 project using the Event Store client, referenced directly as C:\EventStore\v2.0.1\EventStore.ClientAPI.dll, although you can switch this to the latest Event Store client NuGet package. The source code provides a console app for viewing projections with the projection manager (HTTP connection), as well as containing a full set of data for the entire DDDNorth3 agenda. It also deals with the strategy for reading newest events backwards to older events and ignoring older events that have been superseded.

    Resources

    Event Store home page: http://www.geteventstore.com/
    Event Store source code on GitHub: https://github.com/eventstore/eventstore
    Event Store documentation on GitHub: https://github.com/eventstore/eventstore/wiki (includes an index to @RobAshton's blog series on Event Store at https://github.com/eventstore/eventstore/wiki#rob-ashton---projections-series)
    Event Store forum in Google Groups: https://groups.google.com/forum/?fromgroups#!forum/event-store
    TopShelf Windows service wrapper is available on GitHub: https://gist.github.com/trbngr/5083266

    Read the article

  • Three Steps to Becoming an Expert Oracle Linux System Administrator

    - by Antoinette O'Sullivan
    Oracle provides a complete system administration curriculum to take you from your initial experience of Unix to being an expert Oracle Linux system administrator. You can take these live instructor-led courses from your own desk through live-virtual events, or by traveling to an education center for in-class events.

    Step 1: Unix and Linux Essentials

    This 3-day course is designed for users and administrators who are new to Oracle Linux. It will help you develop the basic UNIX skills needed to interact comfortably and confidently with the operating system. Below is a sample of the in-class events already on the schedule (location | date | delivery language):

    Vilvoorde, Belgium | 28 October 2013 | English
    Berlin, Germany | 15 July 2013 | German
    Utrecht, Netherlands | 19 August 2013 | Dutch
    Bucharest, Romania | 12 August 2013 | Romanian
    Ankara, Turkey | 6 January 2013 | Turkish
    Nairobi, Kenya | 5 August 2013 | English
    Kaduna, Nigeria | 15 July 2013 | English
    Woodmead, South Africa | 15 July 2013 | English
    Jakarta, Indonesia | 23 September 2013 | English
    Petaling Jaya, Malaysia | 22 July 2013 | English
    Makati City, Philippines | 3 July 2013 | English
    Bangkok, Thailand | 20 November 2013 | English
    Auckland, New Zealand | 5 August 2013 | English
    Melbourne, Australia | 12 August 2013 | English
    Ottawa, Montreal, Toronto, Canada | 3 September 2013 | English
    San Francisco and San Jose, CA, United States | 15 July 2013 | English
    Reston, VA, United States | 7 August 2013 | English
    Edison, NJ, and King of Prussia, PA, United States | 3 September 2013 | English
    Denver, CO, United States | 25 September 2013 | English
    Cambridge, MA, and Roseville, MN, United States | 6 November 2013 | English
    Phoenix, AZ, and Sacramento, CA, United States | 25 November 2013 | English

    Step 2: Oracle Linux System Administration

    Through this 5-day course, become a knowledgeable Oracle Linux system administrator, learning how to install Oracle Linux and the benefits of Oracle's Unbreakable Enterprise Kernel and Ksplice. Below is a sample of the in-class events already on the schedule (location | date | delivery language):
    Vienna, Austria | 1 July 2013 | German
    Vilvoorde, Belgium | 18 November 2013 | English
    Zagreb, Croatia | 16 September 2013 | Croatian
    London, England | 3 September 2013 | English
    Manchester, England | 9 September 2013 | English
    Paris, France | 29 July 2013 | French
    Budapest, Hungary | 8 July 2013 | Hungarian
    Utrecht, Netherlands | 2 September 2013 | Dutch
    Warsaw, Poland | 15 July 2013 | Polish
    Bucharest, Romania | 2 December 2013 | Romanian
    Ankara, Turkey | 7 October 2013 | Turkish
    Istanbul, Turkey | 9 September 2013 | Turkish
    Nairobi, Kenya | 12 August 2013 | English
    Petaling Jaya, Malaysia | 29 July 2013 | English
    Kuala Lumpur, Malaysia | 21 October 2013 | English
    Makati City, Philippines | 8 July 2013 | English
    Singapore | 24 July 2013 | English
    Bangkok, Thailand | 26 July 2013 | English
    Canberra, Australia | 19 August 2013 | English
    Melbourne, Australia | 16 September 2013 | English
    Sydney, Australia | 19 August 2013 | English
    Mississauga, Canada | 26 August 2013 | English
    Ottawa, Canada | 4 November 2013 | English
    Phoenix, AZ, United States | 7 October 2013 | English
    Belmont, CA, United States | 23 September 2013 | English
    Irvine, CA, United States | 18 November 2013 | English
    Sacramento, CA, United States | 19 August 2013 | English
    San Francisco, CA, United States | 15 July 2013 | English
    Denver, CO, United States | 19 August 2013 | English
    Schaumburg, IL, United States | 26 August 2013 | English
    Indianapolis, IN, United States | 14 October 2013 | English
    Columbia, MD, United States | 30 September 2013 | English
    Roseville, MN, United States | 19 August 2013 | English
    St. Louis, MO, United States | 7 October 2013 | English
    Edison, NJ, United States | 28 October 2013 | English
    Beaverton, OR, United States | 12 August 2013 | English
    Pittsburgh, PA, United States | 9 December 2013 | English
    Reston, VA, United States | 12 August 2013 | English
    Brookfield, WI, United States | 30 September 2013 | English
    Sao Paulo, Brazil | 15 July 2013 | Brazilian Portuguese

    Step 3: Oracle Linux Advanced System Administration

    This new 3-day course is ideal for administrators who want to learn about managing resources and file systems while developing troubleshooting and advanced storage administration skills. You will learn about Linux Containers, Cgroups, btrfs, DTrace and more. Below is a sample of the in-class events already on the schedule (location | date | delivery language):

    Melbourne, Australia | 9 October 2013 | English
    Roseville, MN, United States | 3 September 2013 | English

    To register for or learn more about these courses, go to http://oracle.com/education/linux. Watch this video to learn more about Oracle's operating system training.

    Read the article

  • CQRS - Benefits

    - by Dylan Smith
    Thanks to all the comments and feedback from the last post, I think I have a better understanding now of the benefits of CQRS (separate from the benefits of Event Sourcing). I'm going to try and sum it up here, and point out some areas where I could still use some advice.

    CQRS Benefits

    It sounds like the primary benefit of CQRS as an architecture is that it allows you to create a simpler domain model by sucking out everything related to queries. I can definitely see the benefit of this: in general, the domain logic related to commands is the high-value behavior in the software, but the logic required to service the queries would add a lot of low-value "noise" to the domain model that would dilute the high-value (command) behavior: sorting, paging, filtering, pre-fetch paths, etc. Also, the most appropriate domain structure for implementing commands might not be the most optimal for implementing queries. To paraphrase Greg, this usually results in a domain model that is mediocre at both, piss-poor at one, or more likely piss-poor at both commands and queries.

    Not only will you be able to simplify your domain model by pulling out all the query logic, but at least a handful of commands in most systems will probably be "pass-through" type commands with little to no logic that just generate events. If these can be implemented directly in the command handler and never touch the domain model, this allows you to slim down the domain model even more. Also, if you were to do event sourcing without CQRS, you would no longer have a database containing the current state (only the domain model would), which makes it difficult (or impossible) to support the ad-hoc querying and/or reporting that is common in most business software.

    Of course, CQRS provides some great scalability benefits; and not only scalability, but I have to assume extremely low latency for most operations, especially if you have an asynchronous event bus. I know Greg says that you get a 3x scaling (Commands, Queries, Client) of your ability to perform parallel development, but IMHO it seems like it only provides 1.5x scaling, since even without CQRS you're going to have your client loosely coupled to your domain, which is still a great benefit to be able to realize.

    Questions / Concerns

    If all the queries against an aggregate get pulled out to the query layer, what if the only commands for that aggregate can be handled in a "pass-through" manner, with the command handler directly generating events? Is it possible to have an aggregate that isn't modeled in the domain model? Are there any issues or downsides to this?

    I know from the feedback on my previous posts it was suggested that having one domain model handling both commands and queries requires implementing a lot of traversals between objects that wouldn't be necessary if it was only servicing commands. My question is: do you include traversals in your domain model based on the needs of the code, or based on the conceptual domain model? If none of my commands require a Customer.Orders traversal, but the conceptual domain includes the concept of a set of orders belonging to a customer, should I model that in my domain model or not?

    I like the idea of using the query side of the architecture as a place to put junior devs, where the risk of them screwing something up has minimal impact. But I'm not sold on the idea that you can actually outsource it.
    Like I said in one of my comments on my previous post, the code to handle a query and generate DTOs is going to be dead simple, but the code to process events and apply them to the tables on the query side is going to require a significant amount of domain knowledge: knowing which events to listen for to update each of the de-normalized tables, and what changes need to be made when each event is processed. I don't know about everybody else, but having Indian/Russian/whatever outsourced developers do anything that requires significant domain knowledge has never been successful in my experience. And if you need to spec out, for each new query, which events to listen to and what to do with each one, well, that's probably going to be just as much work to document as it would be to just implement it.

    Greg made the point in a comment that doing an aggregate query like "Total Sales By Customer" is going to be inefficient if you use event sourcing but not CQRS. I don't understand why that would be the case. I imagine in that case you'd simply have a method/property on the Customer object that calculated total sales for that customer by enumerating over the Orders collection (a small sketch of what I mean follows at the end of this post). Then the application services layer would generate DTOs off of the Customers collection that included, say, the CustomerID, CustomerName, and TotalSales, or whatever the case may be. As long as you use a snapshotting implementation, I don't see why that would be any more inefficient in a DDD + Event Sourcing implementation than in a typical DDD implementation.

    Like I mentioned in my last post, I still have some questions about query logic that haven't been answered yet, but before I start asking those I want to make sure I have a strong grasp on what benefits CQRS provides. My main concern with the query logic was that I know I could just toss it all into the query side, but I was concerned that I would be losing the benefits of using CQRS in the first place if I did that. I want to elaborate more on this with some example situations in an upcoming post.
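    For illustration, a minimal sketch of the kind of in-model calculation described above (the class and property names are mine, not from the original discussion):

    using System.Collections.Generic;
    using System.Linq;

    public class Order
    {
        public decimal Total { get; set; }
    }

    public class Customer
    {
        public string Name { get; set; }
        public IList<Order> Orders { get; private set; }

        public Customer()
        {
            Orders = new List<Order>();
        }

        // Derived value computed by walking the aggregate's own orders;
        // with snapshotting, the Orders collection is already in memory.
        public decimal TotalSales
        {
            get { return Orders.Sum(o => o.Total); }
        }
    }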

    Read the article

  • Introducing sp_ssiscatalog (v1.0.0.0)

    - by jamiet
    Regular readers of my blog may know that over the last year I have made available a suite of SQL Server Reporting Services (SSRS) reports that provide visualisations of the data in the SQL Server Integration Services (SSIS) 2012 Catalog. Those reports are available at http://ssisreportingpack.codeplex.com. As I have built these reports and used them myself on a real-life project, a couple of things have dawned on me:

    As soon as your SSIS Catalog gets a significant amount of data in it, the performance of the reports degrades rapidly. This is hampered by the fact that there are limitations to the SQL statements that I can embed within an SSRS report.
    SSIS professionals are data guys at heart, and those types of people feel more comfortable in a query environment rather than having to go through the rigmarole of standing up a reporting server (well, I know I do anyway).

    Hence, I have decided to take a different tack with the reporting pack. Taking my lead from Adam Machanic's sp_whoisactive and Brent Ozar's sp_blitz, I have produced sp_ssiscatalog, a stored procedure that makes it easy to get at the crucial data in the SSIS Catalog. I will spend the rest of this blog post explaining exactly what sp_ssiscatalog does and how to use it, but if you would rather just download the bits yourself and start to play, you can download v1.0.0.0 from DB v1.0.0.0.

    Usage Scenarios

    Most Recent Execution

    I find that the most frequent information one needs to get from the SSIS Catalog is information pertaining to the most recent execution. Hence, if you execute sp_ssiscatalog with no parameters, that is exactly what you will get:

    EXEC [dbo].[sp_ssiscatalog]

    This will return up to 5 resultsets:

    EXECUTION - summary information about the execution, including status, start time & end time
    EVENTS - all events that occurred during the execution
    OnError,OnTaskFailed - all events where event_name is either OnError or OnTaskFailed
    OnWarning - all events where event_name is OnWarning
    EXECUTABLE_STATS - duration and execution result of every executable in the execution

    Each of the 5 resultsets will be displayed only if there is data satisfying it. In other words, if there are no (for example) OnWarning events, then the OnWarning resultset will not be displayed. The display of these 5 resultsets can be toggled respectively by these 5 optional parameters (all of type BIT):

    @exec_execution
    @exec_events
    @exec_errors
    @exec_warnings
    @exec_executable_stats

    Any Execution

    As just explained, the default behaviour is to supply data for the most recent execution. If you wish to specify which execution the data should be returned for, simply supply the execution_id as a parameter:

    EXEC [dbo].[sp_ssiscatalog] 6

    All Executions

    sp_ssiscatalog can also return information about all executions:

    EXEC [dbo].[sp_ssiscatalog] @operation_type='execs'

    The most recent execution will appear at the top.
    sp_ssiscatalog provides a number of parameters that enable you to filter the resultset:

    @execs_folder_name
    @execs_project_name
    @execs_package_name
    @execs_executed_as_name
    @execs_status_desc

    Some typical usages might be:

    -- Return all failed executions
    EXEC [dbo].[sp_ssiscatalog] @operation_type='execs',@execs_status_desc='failed'

    -- Return all executions for a specified folder
    EXEC [dbo].[sp_ssiscatalog] @operation_type='execs',@execs_folder_name='My folder'

    -- Return all executions of a specified package in a specified project
    EXEC [dbo].[sp_ssiscatalog] @operation_type='execs',@execs_project_name='My project', @execs_package_name='Pkg.dtsx'

    Installing sp_ssiscatalog

    Under the covers, sp_ssiscatalog actually calls many other stored procedures and functions, hence creating it on your server is not simply a case of running a CREATE PROCEDURE script. I maintain the code in a SQL Server Data Tools (SSDT) database project, which means that you have two ways of obtaining it.

    Download the source code. You can download the latest (at the time of writing) source code from http://ssisreportingpack.codeplex.com/SourceControl/changeset/view/70192. Hit the download button to download all the source code in a zip file. The contents of that zip file will include an SSDT database project, which you can open up in SSDT and publish just like any other SSDT database project. You can publish to a new database or any existing database, even [SSISDB] if you prefer.

    Download a dacpac. Maintaining the code in an SSDT database project means that it can all get packaged up into a dacpac that you can then publish to your SQL Server. That dacpac is available from DB v1.0.0.0. Ordinarily, a dacpac can be deployed to a SQL Server from SSMS using the Deploy Dacpac wizard; however, in this case there is a limitation. Because sp_ssiscatalog refers to objects in the SSIS Catalog (which it has to do, of course), the dacpac contains a SqlCmd variable to store the name of the database that underpins the SSIS Catalog; unfortunately, the Deploy Dacpac wizard in SSMS has a rather gaping limitation in that it cannot deploy dacpacs containing SqlCmd variables. Hence, we can use the command-line tool, sqlpackage.exe, instead. Don't worry if reverting to the command line sounds a little daunting; I assure you it is not. Simply open a Visual Studio command prompt, cd to the folder containing the downloaded dacpac, and type:

    "%PROGRAMFILES(x86)%\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe" /action:Publish /TargetDatabaseName:SsisReportingPack /SourceFile:SSISReportingPack.dacpac /Variables:SSISDB=SSISDB /TargetServerName:(local)

    or the shortened form:

    "%PROGRAMFILES(x86)%\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe" /a:Publish /tdn:SsisReportingPack /sf:SSISReportingPack.dacpac /v:SSISDB=SSISDB /tsn:(local)

    remembering to set your server name appropriately (here mine is set to "(local)"). If everything works, you're done: you'll have a new database called [SsisReportingPack] which contains sp_ssiscatalog.

    Good luck with sp_ssiscatalog. I have been using it extensively on my own projects recently and it has proved to be very useful indeed. Rest assured, however, I will be adding many new capabilities in the future. Feedback is welcome. @Jamiet
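    As a usage footnote, the display toggles and filters described in the post can be combined; a hedged sketch using the parameters listed above (parameter defaults assumed):

    -- Show only the error and warning resultsets for the most recent execution
    EXEC [dbo].[sp_ssiscatalog]
        @exec_execution = 0,
        @exec_events = 0,
        @exec_errors = 1,
        @exec_warnings = 1,
        @exec_executable_stats = 0;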

    Read the article

  • Possible to manipulate UI elements via dispatchEvent()?

    - by rinogo
    Hi all! I'm trying to manually dispatch events on a TextField so I can manipulate it indirectly via code (e.g. place the cursor at a given set of x/y coordinates). However, my events seem to have no effect. I've written a test to experiment with this phenomenon:

    package sandbox {

        import flash.display.Sprite;
        import flash.events.MouseEvent;
        import flash.text.TextField;
        import flash.text.TextFieldType;
        import flash.text.TextFieldAutoSize;
        import flash.utils.setTimeout;

        public class Test extends Sprite {

            private var tf:TextField;
            private var tf2:TextField;

            public function Test() {
                super();

                tf = new TextField();
                tf.text = 'Interact here';
                tf.type = TextFieldType.INPUT;
                addChild(tf);

                tf2 = new TextField();
                tf2.text = 'Same events replayed with five second delay here';
                tf2.autoSize = TextFieldAutoSize.LEFT;
                tf2.type = TextFieldType.INPUT;
                tf2.y = 30;
                addChild(tf2);

                tf.addEventListener(MouseEvent.CLICK, mouseListener);
                tf.addEventListener(MouseEvent.DOUBLE_CLICK, mouseListener);
                tf.addEventListener(MouseEvent.MOUSE_DOWN, mouseListener);
                tf.addEventListener(MouseEvent.MOUSE_MOVE, mouseListener);
                tf.addEventListener(MouseEvent.MOUSE_OUT, mouseListener);
                tf.addEventListener(MouseEvent.MOUSE_OVER, mouseListener);
                tf.addEventListener(MouseEvent.MOUSE_UP, mouseListener);
                tf.addEventListener(MouseEvent.MOUSE_WHEEL, mouseListener);
                tf.addEventListener(MouseEvent.ROLL_OUT, mouseListener);
                tf.addEventListener(MouseEvent.ROLL_OVER, mouseListener);
            }

            private function mouseListener(event:MouseEvent):void {
                //trace(event);
                setTimeout(function():void { trace(event); tf2.dispatchEvent(event); }, 5000);
            }
        }
    }

    Essentially, all this test does is use setTimeout to effectively 'record' events on TextField tf and replay them five seconds later on TextField tf2. When an event is dispatched on tf2, it is traced to the console output.
    The console output upon running this program and clicking on tf is:

    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=0 localY=1 stageX=0 stageY=1 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="rollOver" bubbles=false cancelable=false eventPhase=2 localX=0 localY=1 stageX=0 stageY=1 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseOver" bubbles=true cancelable=false eventPhase=3 localX=0 localY=1 stageX=0 stageY=1 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=2 localY=1 stageX=2 stageY=1 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=2 localY=2 stageX=2 stageY=2 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=2 localY=3 stageX=2 stageY=3 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=3 localY=3 stageX=3 stageY=3 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=5 localY=3 stageX=5 stageY=3 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=6 localY=5 stageX=6 stageY=5 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=7 localY=5 stageX=7 stageY=5 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=9 localY=5 stageX=9 stageY=5 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=10 localY=5 stageX=10 stageY=5 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=11 localY=5 stageX=11 stageY=5 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=12 localY=5 stageX=12 stageY=5 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseDown" bubbles=true cancelable=false eventPhase=3 localX=12 localY=5 stageX=12 stageY=5 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseUp" bubbles=true cancelable=false eventPhase=3 localX=12 localY=5 stageX=12 stageY=5 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="click" bubbles=true cancelable=false eventPhase=3 localX=12 localY=5 stageX=12 stageY=5 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=10 localY=4 stageX=10 stageY=4 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=9 localY=2 stageX=9 stageY=2 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseMove" bubbles=true cancelable=false eventPhase=3 localX=9 localY=1 stageX=9 stageY=1 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="mouseOut" bubbles=true cancelable=false eventPhase=3 localX=-1 localY=-1 stageX=-1 stageY=-1 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]
    [MouseEvent type="rollOut" bubbles=false cancelable=false eventPhase=2 localX=-1 localY=-1 stageX=-1 stageY=-1 relatedObject=null ctrlKey=false altKey=false shiftKey=false delta=0]

    As we can see, the events are being captured and replayed successfully. However, no change occurs in tf2: the mouse cursor does not appear in tf2 as we would expect. In fact, the cursor remains in tf even after the tf2 events are dispatched. Please help! Thanks, -Rich
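    A note on the symptom: programmatically dispatched MouseEvents do not trigger the player's built-in default behaviors such as focusing a field, so if the underlying goal is just to place the cursor at x/y, the TextField selection API is a more direct route. A sketch, where x and y are assumed to be coordinates local to tf2:

    // getCharIndexAtPoint returns -1 if no character is under the point.
    var idx:int = tf2.getCharIndexAtPoint(x, y);
    if (idx >= 0) {
        stage.focus = tf2;           // give the field keyboard focus
        tf2.setSelection(idx, idx);  // collapse the selection to a caret
    }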

    Read the article

  • can't install psycopg2 in my env on mac os x lion

    - by Alexander Ovchinnikov
    I tried to install psycopg2 via pip in my virtual env, but got this error:

    ld: library not found for -lpq

    (Full log here: http://pastebin.com/XdmGyJ4u .) I tried installing PostgreSQL 9.1 both from the .dmg and via port.

    (gksks)iMac-Alexander:~ lorddaedra$ locate libpq
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/libpq
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/libpq/libpq-fs.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/libpq-events.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/libpq-fe.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/internal/libpq
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/internal/libpq/pqcomm.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/internal/libpq-int.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/auth.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/be-fsstubs.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/crypt.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/hba.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/ip.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/libpq-be.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/libpq-fs.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/libpq.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/md5.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/pqcomm.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/pqformat.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/include/postgresql/server/libpq/pqsignal.h
    /Developer/SDKs/MacOSX10.7.sdk/usr/lib/libpq.5.3.dylib
    /Developer/SDKs/MacOSX10.7.sdk/usr/lib/libpq.5.dylib
    /Developer/SDKs/MacOSX10.7.sdk/usr/lib/libpq.a
    /Developer/SDKs/MacOSX10.7.sdk/usr/lib/libpq.dylib
    /Library/PostgreSQL/9.1/doc/postgresql/html/install-windows-libpq.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-async.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-build.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-cancel.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-connect.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-control.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-copy.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-envars.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-events.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-example.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-exec.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-fastpath.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-ldap.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-misc.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-notice-processing.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-notify.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-pgpass.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-pgservice.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-ssl.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-status.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq-threading.html
    /Library/PostgreSQL/9.1/doc/postgresql/html/libpq.html
    /Library/PostgreSQL/9.1/include/libpq
    /Library/PostgreSQL/9.1/include/libpq/libpq-fs.h
    /Library/PostgreSQL/9.1/include/libpq-events.h
    /Library/PostgreSQL/9.1/include/libpq-fe.h
    /Library/PostgreSQL/9.1/include/postgresql/internal/libpq
    /Library/PostgreSQL/9.1/include/postgresql/internal/libpq/pqcomm.h
    /Library/PostgreSQL/9.1/include/postgresql/internal/libpq-int.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/auth.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/be-fsstubs.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/crypt.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/hba.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/ip.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/libpq-be.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/libpq-fs.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/libpq.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/md5.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/pqcomm.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/pqformat.h
    /Library/PostgreSQL/9.1/include/postgresql/server/libpq/pqsignal.h
    /Library/PostgreSQL/9.1/lib/libpq.5.4.dylib
    /Library/PostgreSQL/9.1/lib/libpq.5.dylib
    /Library/PostgreSQL/9.1/lib/libpq.a
    /Library/PostgreSQL/9.1/lib/libpq.dylib
    /Library/PostgreSQL/9.1/lib/postgresql/libpqwalreceiver.so
    /Library/PostgreSQL/9.1/pgAdmin3.app/Contents/Frameworks/libpq.5.dylib
    /Library/PostgreSQL/psqlODBC/lib/libpq.5.4.dylib
    /Library/PostgreSQL/psqlODBC/lib/libpq.5.dylib
    /Library/PostgreSQL/psqlODBC/lib/libpq.dylib
    /Library/WebServer/Documents/postgresql/html/install-windows-libpq.html
    /Library/WebServer/Documents/postgresql/html/libpq-async.html
    /Library/WebServer/Documents/postgresql/html/libpq-build.html
    /Library/WebServer/Documents/postgresql/html/libpq-cancel.html
    /Library/WebServer/Documents/postgresql/html/libpq-connect.html
    /Library/WebServer/Documents/postgresql/html/libpq-control.html
    /Library/WebServer/Documents/postgresql/html/libpq-copy.html
    /Library/WebServer/Documents/postgresql/html/libpq-envars.html
    /Library/WebServer/Documents/postgresql/html/libpq-events.html
    /Library/WebServer/Documents/postgresql/html/libpq-example.html
    /Library/WebServer/Documents/postgresql/html/libpq-exec.html
    /Library/WebServer/Documents/postgresql/html/libpq-fastpath.html
    /Library/WebServer/Documents/postgresql/html/libpq-ldap.html
    /Library/WebServer/Documents/postgresql/html/libpq-misc.html
    /Library/WebServer/Documents/postgresql/html/libpq-notice-processing.html
    /Library/WebServer/Documents/postgresql/html/libpq-notify.html
    /Library/WebServer/Documents/postgresql/html/libpq-pgpass.html
    /Library/WebServer/Documents/postgresql/html/libpq-pgservice.html
    /Library/WebServer/Documents/postgresql/html/libpq-ssl.html
    /Library/WebServer/Documents/postgresql/html/libpq-status.html
    /Library/WebServer/Documents/postgresql/html/libpq-threading.html
    /Library/WebServer/Documents/postgresql/html/libpq.html
    /opt/local/include/postgresql90/internal/libpq
    /opt/local/include/postgresql90/internal/libpq/pqcomm.h
    /opt/local/include/postgresql90/internal/libpq-int.h
    /opt/local/include/postgresql90/libpq
    /opt/local/include/postgresql90/libpq/libpq-fs.h
    /opt/local/include/postgresql90/libpq-events.h
    /opt/local/include/postgresql90/libpq-fe.h
    /opt/local/include/postgresql90/server/libpq
    /opt/local/include/postgresql90/server/libpq/auth.h
    /opt/local/include/postgresql90/server/libpq/be-fsstubs.h
    /opt/local/include/postgresql90/server/libpq/crypt.h
    /opt/local/include/postgresql90/server/libpq/hba.h
    /opt/local/include/postgresql90/server/libpq/ip.h
    /opt/local/include/postgresql90/server/libpq/libpq-be.h
    /opt/local/include/postgresql90/server/libpq/libpq-fs.h
    /opt/local/include/postgresql90/server/libpq/libpq.h
    /opt/local/include/postgresql90/server/libpq/md5.h
    /opt/local/include/postgresql90/server/libpq/pqcomm.h
    /opt/local/include/postgresql90/server/libpq/pqformat.h
    /opt/local/include/postgresql90/server/libpq/pqsignal.h
    /opt/local/lib/postgresql90/libpq.5.3.dylib
    /opt/local/lib/postgresql90/libpq.5.dylib
    /opt/local/lib/postgresql90/libpq.a
    /opt/local/lib/postgresql90/libpq.dylib
    /opt/local/lib/postgresql90/libpqwalreceiver.so
    /opt/local/var/macports/sources/rsync.macports.org/release/tarballs/ports/databases/libpqxx
    /opt/local/var/macports/sources/rsync.macports.org/release/tarballs/ports/databases/libpqxx/Portfile
    /opt/local/var/macports/sources/rsync.macports.org/release/tarballs/ports/databases/libpqxx26
    /opt/local/var/macports/sources/rsync.macports.org/release/tarballs/ports/databases/libpqxx26/Portfile
    /usr/include/libpq
    /usr/include/libpq/libpq-fs.h
    /usr/include/libpq-events.h
    /usr/include/libpq-fe.h
    /usr/include/postgresql/internal/libpq
    /usr/include/postgresql/internal/libpq/pqcomm.h
    /usr/include/postgresql/internal/libpq-int.h
    /usr/include/postgresql/server/libpq
    /usr/include/postgresql/server/libpq/auth.h
    /usr/include/postgresql/server/libpq/be-fsstubs.h
    /usr/include/postgresql/server/libpq/crypt.h
    /usr/include/postgresql/server/libpq/hba.h
    /usr/include/postgresql/server/libpq/ip.h
    /usr/include/postgresql/server/libpq/libpq-be.h
    /usr/include/postgresql/server/libpq/libpq-fs.h
    /usr/include/postgresql/server/libpq/libpq.h
    /usr/include/postgresql/server/libpq/md5.h
    /usr/include/postgresql/server/libpq/pqcomm.h
    /usr/include/postgresql/server/libpq/pqformat.h
    /usr/include/postgresql/server/libpq/pqsignal.h
    /usr/lib/libpq.5.3.dylib
    /usr/lib/libpq.5.dylib
    /usr/lib/libpq.a
    /usr/lib/libpq.dylib

    How do I tell pip to use the library in /Library/PostgreSQL/9.1/lib/ (or maybe the one in /usr/lib)? Or maybe install this lib again inside my env? (I try to keep my env as isolated from the Mac as possible.)
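    A common fix for this error, as a sketch: psycopg2's build uses pg_config to locate libpq, so putting the PostgreSQL 9.1 binaries first on PATH before installing inside the activated virtualenv usually makes the linker find the right library:

    export PATH=/Library/PostgreSQL/9.1/bin:$PATH
    pip install psycopg2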

    Read the article

  • lsyncd + csync2 : cluster of 3 or more nodes

    - by sbrattla
    I've got 3 (and potentially more) web servers hosting the same content (fronted by a load balancer). Thus, I need to make sure that the files on these web servers stay identical. It appears that csync2 in combination with lsyncd is able to synchronize a cluster of nodes, but according to this article there's a problem with cyclic events in such a setup. In other words, the author writes that a file change on one machine would trigger a replication event to the other machines, which would in turn trigger a replication event back to the original machine. It appears that this is a consequence of the setup, which uses lsyncd (and inotify) to catch file modification events and from there trigger csync2 to replicate the file tree. Does anyone have experience with lsyncd in combination with csync2? Have you had trouble with cyclic events?
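    For reference, the mechanism under discussion boils down to a watch-and-push loop. A minimal sketch of the same idea using inotifywait instead of lsyncd (it assumes csync2 is already configured with a host group covering all the nodes, and it exhibits exactly the cycle described above, because the synced write re-fires inotify on the peers):

    #!/bin/sh
    # Watch the web root and push changes to the other nodes.
    while inotifywait -r -e modify,create,delete,move /var/www; do
        csync2 -x
    done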

    Read the article

  • How to resolve "HTTP/1.1 403 Forbidden" errors from iCal/CalDAV server after upgrade to Snow Leopard Server?

    - by morgant
    We recently upgraded our Open Directory Master & Replica to Mac OS X 10.6.4 Snow Leopard Server. We had a mismatched server FQDN & LDAP Search Base/Kerberos Realm, so we exported all users & groups, created the new Open Directory Master w/matching FQDN & Search Base/Realm, reimported users & groups, and re-bound all servers & workstations to the new OD Master. At the same time as all of this, we upgraded our iCal/CalDAV server to Mac OS X 10.6.4 Snow Leopard Server. Ever since doing so, we've seen the following issues with our iCal/CalDAV server and iCal clients on both Mac OS X 10.5 Leopard & Mac OS X 10.6:

    If a user attempts to move or delete an event (single or repeating) that was created prior to the upgrade to 10.6 Server, they get the following error:

    Access to "blah" in "blah" in account "blah" is not permitted. The server responded: "HTTP/1.1 403 Forbidden" to operation CalDAVWriteEntityQueueableOperation.

    New users added to the directory get the following error when attempting to add their account in iCal's preferences:

    The user "blah" has no configured pricipals. Confirm with your network administrator that your account has at least one CalDAV principal configured.

    Interestingly, we've since discovered that users seem to be able to delete individual events from an old repeating event without error, but that's a massive amount of work to get rid of a repeating event. I will note that we have not yet added an SRV record in DNS as instructed on page 19 of iCal_Server_Admin_v10.6.pdf.

    Further investigation: in this particular case, a user is attempting to decline repeating events created prior to the upgrade to Snow Leopard Server. Granting the user full write access with

    sudo calendarserver_manage_principals --add-write-proxy users:user1 users:user2

    (which did work) doesn't allow deletion of the events. We still get the usual error:

    Access to "blah blah" in "blah blah" in account "blah blah" is not permitted. The server responded: "HTTP/1.1 403 Forbidden" to operation CalDAVWriteEntityQueueableOperation.
The error that shows up in /var/log/caldavd/error.log on the iCal Server when attempting to delete one of the events is: 2011-03-17 15:14:30-0400 [-] [caldav-8009] [PooledMemCacheProtocol,client] [twistedcaldav.extensions#info] PUT /calendars/__uids__/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/calendar/XXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.ics HTTP/1.1 2011-03-17 15:14:30-0400 [-] [caldav-8009] [PooledMemCacheProtocol,client] [twistedcaldav.scheduling.implicit#error] Cannot change ORGANIZER: UID:XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX And the error in /var/log/system.log on the client is: Mar 17 15:14:30 192-168-21-169-dhcp iCal[33509]: CalDAV CalDAVWriteEntityQueueableOperation failed: status 'HTTP/1.1 403 Forbidden' request:\n\nBEGIN:VCALENDAR^M\nVERSION:2.0^M\nPRODID:-//Apple Inc.//iCal 3.0//EN^M\nCALSCALE:GREGORIAN^M\nBEGIN:VTIMEZONE^M\nTZID:US/Eastern^M\nBEGIN:DAYLIGHT^M\nTZOFFSETFROM:-0500^M\nTZOFFSETTO:-0400^M\nDTSTART:20070311T020000^M\nRRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU^M\nTZNAME:EDT^M\nEND:DAYLIGHT^M\nBEGIN:STANDARD^M\nTZOFFSETFROM:-0400^M\nTZOFFSETTO:-0500^M\nDTSTART:20071104T020000^M\nRRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU^M\nTZNAME:EST^M\nEND:STANDARD^M\nEND:VTIMEZONE^M\nBEGIN:VEVENT^M\nSEQUENCE:5^M\nDTSTART;TZID=US/Eastern:20090117T094500^M\nDTSTAMP:20081227T143043Z^M\nSUMMARY:blah blah^M\nATTENDEE;CN="First Last";CUTYPE=INDIVIDUAL;ROLE=REQ-PARTICIPANT:urn:uuid^M\n :XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX^M\nATTENDEE;CN="First Last";CUTYPE=INDIVIDUAL;PARTSTAT=ACCEPTED:mailto:user@d^M\n omain.tld^M\nEXDATE;TZID=US/Eastern:20110319T094500^M\nDTEND;TZID=US/Eastern:20090117T183000^M\nRRULE:FREQ=WEEKLY;INTERVAL=1^M\nTRANSP:OPAQUE^M\nUID:XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX^M\nORGANIZER;CN="First Last":mailto:[email protected]^M\nX-WR-ITIPSTATUSML:UNCLEAN^M\nCREATED:20110317T191348Z^M\nEND:VEVENT^M\nEND:VCALENDAR^M\n\n\n... response:\nHTTP/1.1 403 Forbidden^M\nDate: Thu, 17 Mar 2011 19:14:30 GMT^M\nDav: 1, access-control, calendar-access, calendar-schedule, calendar-auto-schedule, calendar-availability, inbox-availability, calendar-proxy, calendarserver-private-events, calendarserver-private-comments, calendarserver-principal-property-search^M\nContent-Type: text/xml^M\nContent-Length: 134^M\nServer: Twisted/8.2.0 TwistedWeb/8.2.0 TwistedCalDAV/2.5 (iCal Server v12.56.21)^M\n^M\n<?xml version='1.0' encoding='UTF-8'?><error xmlns='DAV:'>^M\n <valid-attendee-change xmlns='urn:ietf:params:xml:ns:caldav'/>^M\n</error> One thing I have noticed, and I'm not sure if this has any real effect is that in many of these pre-Snow Leopard Server migration events, the ORGANIZER is specified like the following: ORGANIZER;CN=First Last:mailto:[email protected] But newer ones are more like one of the two following: ORGANIZER;CN=First Last;[email protected];SCHEDULE-STATUS=1 ORGANIZER;CN=First Last;[email protected]:urn:uuid:XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX iCal_Server_Admin_v10.6.pdf notes that the ".db.sqlite" files are completely disposable, they're merely a performance cache and are re-built on the fly, so are safe to delete. I did delete the one for the organizer's calendars and it took longer to process the attempted event delete while it rebuilt the database, but still errored out in the end. FWIW the error is thrown by this code: https://trac.calendarserver.org/browser/CalendarServer/trunk/twistedcaldav/scheduling/implicit.py Any further suggestions? I see lots of questions about this in my Google searches, but not solutions and this is a widespread problem on our iCal Server. 
Now, we mostly try to get users to ignore them (hence the amount of time this question has been open), but every now and then I dig in deeper, trying to find the culprit and/or a solution.

    Read the article

  • Software to automate event notifications

    - by user6781
    I maintain a small web site, and people can send an email to request information about new events. I would like to automate the process of updating people about new events via email. Is there some easy existing software package to do this, where I can configure events with a date/description/etc. and have mails sent automatically? The alternative is to build a MySQL database and have a Python script in a cron job that does it. I'd like to know if this is a solved problem, though. I'm running on a shared host (DreamHost, if it matters).
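
For reference, the database-plus-cron approach described above is only a few dozen lines in most languages. Below is a minimal sketch in Java (the question suggests Python; Java is used here purely as an illustration). Everything in it is an assumption: a hypothetical events table with id, title, starts_at, description, and notified columns, a placeholder recipient, and the MySQL Connector/J and JavaMail jars on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.Properties;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

public class EventNotifier {
    public static void main(String[] args) throws Exception {
        // Hypothetical credentials and schema; adjust to the real database.
        Connection db = DriverManager.getConnection(
                "jdbc:mysql://localhost/events_db", "user", "password");

        // Events that start in the future and have not been announced yet.
        PreparedStatement query = db.prepareStatement(
                "SELECT id, title, starts_at, description FROM events " +
                "WHERE notified = 0 AND starts_at > NOW()");
        ResultSet rs = query.executeQuery();

        // A local SMTP daemon on the shared host is assumed.
        Properties props = new Properties();
        props.put("mail.smtp.host", "localhost");
        Session session = Session.getInstance(props);

        while (rs.next()) {
            MimeMessage msg = new MimeMessage(session);
            msg.setFrom(new InternetAddress("[email protected]"));
            // In practice the recipients would come from a subscribers table.
            msg.addRecipient(Message.RecipientType.TO,
                    new InternetAddress("[email protected]"));
            msg.setSubject("Upcoming event: " + rs.getString("title"));
            msg.setText(rs.getTimestamp("starts_at") + "\n\n"
                    + rs.getString("description"));
            Transport.send(msg);

            // Mark the event as announced so the next cron run skips it.
            PreparedStatement update = db.prepareStatement(
                    "UPDATE events SET notified = 1 WHERE id = ?");
            update.setInt(1, rs.getInt("id"));
            update.executeUpdate();
        }
        db.close();
    }
}

A crontab entry such as 0 8 * * * java -cp .:mysql-connector.jar:mail.jar EventNotifier would then send the day's notifications, assuming the host allows Java processes; the same loop translates directly to Python or PHP if it doesn't.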

    Read the article

  • NetworkOnMainThreadException occurring

    - by Akshat
    I got some code from Android Hive to parse JSON data from a URL, and I am trying to use the same code with the Rotten Tomatoes Upcoming Movies API. I have implemented it with almost all the XML files modified to my needs. But when I run the code, it throws a NetworkOnMainThreadException. This is my code:

public class Upcoming extends ListActivity {

    String url = "http://api.rottentomatoes.com/api/public/v1.0/lists/movies/upcoming.json?apikey=yvvgsv722gy2zkey3ebkda5t";

    final String TAG_MOVIES = "movies";
    final String TAG_ID = "id";
    final String TAG_TITLE = "title";
    final String TAG_YEAR = "year";
    final String TAG_MPAA_RATING = "mpaa_rating";
    final String TAG_RUNTIME = "runtime";
    final String TAG_RELEASE_DATES = "release_dates";
    final String TAG_RATINGS = "ratings";
    final String TAG_CRITIC_RATING = "critics_ratings";
    final String TAG_AUDIENCE_RATING = "audience_ratings";
    final String TAG_SYNOPSIS = "synopsis";
    final String TAG_POSTERS = "posters";

    JSONArray upcomings = null;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_upcoming_list);

        ArrayList<HashMap<String, String>> UpcomingList = new ArrayList<HashMap<String, String>>();

        // Creating JSON Parser instance
        JSONParser jParser = new JSONParser();

        // getting JSON string from URL
        JSONObject json = jParser.getJSONFromUrl(url);

        try {
            // Getting Array of Contacts
            upcomings = json.getJSONArray(TAG_MOVIES);

            // looping through All Contacts
            for (int i = 0; i < upcomings.length(); i++) {
                JSONObject c = upcomings.getJSONObject(i);

                // Storing each json item in variable
                String id = c.getString(TAG_ID);
                String title = c.getString(TAG_TITLE);
                String year = c.getString(TAG_YEAR);
                String mpaa_rating = c.getString(TAG_MPAA_RATING);
                String runtime = c.getString(TAG_RUNTIME);
                JSONObject release_dates = c.getJSONObject(TAG_RELEASE_DATES);
                JSONObject ratings = c.getJSONObject(TAG_RATINGS);
                String critic_rating = c.getString(TAG_CRITIC_RATING);
                String audience_rating = c.getString(TAG_AUDIENCE_RATING);
                String synopsis = c.getString(TAG_SYNOPSIS);
                JSONObject posters = c.getJSONObject(TAG_POSTERS);

                HashMap<String, String> map = new HashMap<String, String>();
                map.put(TAG_TITLE, title);
                map.put(TAG_YEAR, year);
                map.put(TAG_CRITIC_RATING, critic_rating);
                map.put(TAG_AUDIENCE_RATING, audience_rating);
                UpcomingList.add(map);
            }
        } catch (JSONException e) {
            e.printStackTrace();
        }

        ListAdapter adapter = new SimpleAdapter(this, UpcomingList,
                R.layout.activity_upcoming,
                new String[] { TAG_TITLE, TAG_YEAR, TAG_CRITIC_RATING, TAG_AUDIENCE_RATING },
                new int[] { R.id.title, R.id.year, R.id.critic_rating, R.id.audience_rating });
        setListAdapter(adapter);

        // selecting single ListView item
        ListView lv = getListView();

        // Launching new screen on Selecting Single ListItem
        lv.setOnItemClickListener(new OnItemClickListener() {
            @Override
            public void onItemClick(AdapterView<?> parent, View view, int position, long id) {
                // getting values from selected ListItem
                String name = ((TextView) view.findViewById(R.id.title)).getText().toString();
                String cost = ((TextView) view.findViewById(R.id.year)).getText().toString();
                String critic_rating = ((TextView) view.findViewById(R.id.critic_rating)).getText().toString();
                String audience_rating = ((TextView) view.findViewById(R.id.audience_rating)).getText().toString();

                // Starting new intent
                Intent in = new Intent(getApplicationContext(), Upcoming.class);
                in.putExtra(TAG_TITLE, name);
                in.putExtra(TAG_YEAR, cost);
                in.putExtra(TAG_CRITIC_RATING, critic_rating);
                in.putExtra(TAG_AUDIENCE_RATING, audience_rating);
                startActivity(in);
            }
        });
    }
}

Can anyone please help me with whatever I am missing? I am totally lost on it now. Thanks in advance.
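
NetworkOnMainThreadException is thrown on Android 3.0 (API 11) and later when network I/O runs on the UI thread, which is exactly what the jParser.getJSONFromUrl(url) call inside onCreate() does. A common fix for code of this vintage is to move the download into an AsyncTask. Here is a minimal sketch, assuming the JSONParser class from the tutorial; the task class name is made up:

// requires: import android.os.AsyncTask; (inner class of the activity)
private class LoadUpcomingTask extends AsyncTask<String, Void, JSONObject> {

    @Override
    protected JSONObject doInBackground(String... urls) {
        // Network I/O now happens on a background worker thread.
        JSONParser jParser = new JSONParser();
        return jParser.getJSONFromUrl(urls[0]);
    }

    @Override
    protected void onPostExecute(JSONObject json) {
        // Back on the UI thread: the parsing and adapter code that
        // currently lives in onCreate() would move here.
        if (json == null) return;
        // ... build UpcomingList from json and call setListAdapter(...) ...
    }
}

In onCreate(), the direct JSONObject json = jParser.getJSONFromUrl(url); call would then become new LoadUpcomingTask().execute(url);, with everything from the try block through setListAdapter() moved into onPostExecute(). (Relaxing StrictMode's thread policy also silences the exception, but that just hides the UI freeze rather than fixing it.)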

    Read the article

  • Multi-Schema Privileges for a Table Trigger in an Oracle Database

    - by sisslack
    I'm trying to write a table trigger which queries another table that is outside the schema where the trigger will reside. Is this possible? I have no problem querying tables in my own schema, but when trying to query tables outside it I get:

Error: ORA-00942: table or view does not exist

EDIT

My apologies for not providing as much information as possible the first time around; I was under the impression this question was simpler than it is. I'm trying to create a trigger on a table that changes some fields on a newly inserted row, based on the existence of some data that may or may not be in a table in another schema. The user account that I'm using to create the trigger does have the permissions to run the queries independently. In fact, I've had my trigger print the query I'm trying to run and was able to run it on its own successfully. I should also note that I'm building the query dynamically using the EXECUTE IMMEDIATE statement. Here's an example:

CREATE OR REPLACE TRIGGER MAIN_SCHEMA.EVENTS
BEFORE INSERT ON MAIN_SCHEMA.EVENTS
REFERENCING OLD AS OLD NEW AS NEW
FOR EACH ROW
DECLARE
  rtn_count  NUMBER       := 0;
  table_name VARCHAR2(17) := :NEW.SOME_FIELD;
  key_field  VARCHAR2(20) := :NEW.ANOTHER_FIELD;
BEGIN
  CASE
    WHEN (key_field = 'condition_a') THEN
      EXECUTE IMMEDIATE 'select count(*) from OTHER_SCHEMA_A.'||table_name||' where KEY_FIELD='''||key_field||'''' INTO rtn_count;
    WHEN (key_field = 'condition_b') THEN
      EXECUTE IMMEDIATE 'select count(*) from OTHER_SCHEMA_B.'||table_name||' where KEY_FIELD='''||key_field||'''' INTO rtn_count;
    WHEN (key_field = 'condition_c') THEN
      EXECUTE IMMEDIATE 'select count(*) from OTHER_SCHEMA_C.'||table_name||' where KEY_FIELD='''||key_field||'''' INTO rtn_count;
  END CASE;
  IF (rtn_count > 0) THEN
    NULL; -- change some fields that are to be inserted
  END IF;
END;

The trigger seems to fail on the EXECUTE IMMEDIATE with the previously mentioned error.

EDIT

I have done some more research and can offer more clarification. The user account I'm using to create this trigger is not MAIN_SCHEMA or any one of the OTHER_SCHEMA_Xs. The account I'm using (ME) is given privileges to the involved tables by the schema users themselves. For example (USER_TAB_PRIVS):

GRANTOR         GRANTEE  TABLE_SCHEMA    TABLE_NAME  PRIVILEGE  GRANTABLE  HIERARCHY
MAIN_SCHEMA     ME       MAIN_SCHEMA     EVENTS      DELETE     NO         NO
MAIN_SCHEMA     ME       MAIN_SCHEMA     EVENTS      INSERT     NO         NO
MAIN_SCHEMA     ME       MAIN_SCHEMA     EVENTS      SELECT     NO         NO
MAIN_SCHEMA     ME       MAIN_SCHEMA     EVENTS      UPDATE     NO         NO
OTHER_SCHEMA_X  ME       OTHER_SCHEMA_X  TARGET_TBL  SELECT     NO         NO

And I have the following system privileges (USER_SYS_PRIVS):

USERNAME  PRIVILEGE             ADMIN_OPTION
ME        ALTER ANY TRIGGER     NO
ME        CREATE ANY TRIGGER    NO
ME        UNLIMITED TABLESPACE  NO

And this is what I found in the Oracle documentation: "To create a trigger in another user's schema, or to reference a table in another schema from a trigger in your schema, you must have the CREATE ANY TRIGGER system privilege. With this privilege, the trigger can be created in any schema and can be associated with any user's table. In addition, the user creating the trigger must also have EXECUTE privilege on the referenced procedures, functions, or packages." Here: Oracle Doc

So it looks to me like this should work, but I'm not sure about the "EXECUTE privilege" the doc refers to.

    Read the article

  • Free .NET Training at DevCares in Dallas...

    - by [email protected]
    Come take an early look at the debugging experience in VS 2010 this Friday (3/25/2010) at TekFocus in Dallas, at the InfoMart, at 9 AM. In this session, we'll:

  • Dive deep into the new IntelliTrace (formerly, historical debugging) feature, which enables you to step back in time within your debugging session and inspect or re-execute code, without having to restart your application
  • See how to manage large numbers of breakpoints with labeling, searching and filtering
  • Extend "data tips" by adding comments, notes and strategically "pinning" these resources to maintain their visibility throughout your session
  • Demonstrate "collaborative debugging," by debugging a portion of an application and then exporting breakpoints and labeled data tips, so that others can leverage your effort without having to start over
  • Leverage these new debugging features in applications built in earlier versions of the .NET Framework through the MultiTargeting features available in VS 2010

You'll walk away with a clear understanding of how you can use this upcoming technology to vastly increase your productivity and build better software.

Register to attend ==> http://www.dallasdevcares.com/upcoming-sessions/

DevCares is a monthly series of FREE half-day events sponsored by TekFocus and Microsoft. Targeted specifically at developers, the content is presented by experts on a variety of .NET topics. These briefings include expert testimonials, working demos and sample code designed to help you get the most out of application development with .NET. Events are held on the last Friday of each month at the TekFocus offices in the InfoMart near downtown Dallas.

TekFocus is a full-service technology training provider with a core business delivering Microsoft-certified technical training and product skills enhancements to customers worldwide.

    Read the article
