Search Results



  • Working on a jQuery plugin : viewport simulator

    - by pixelboy
    Hi community, I'm working on a jQuery plugin that aims to simulate one-axis camera movements. The first step to achieving this is simple: use conventional, clean markup and get the prototype to work. Here's how to initiate the plugin:

    $('#wrapper').fluidGrids({exclude: '.exclude'});

    Here's a working demo of the work in progress: http://sandbox.test.everestconseil.com/protoCitroen/home.html

    Where am I having issues: Is there a fast way to detect whether the parent of each target (clickable animated element) is an anchor link, and if so to save its URL? My clone uses the original background image, which it then animates and fades to black and white. But when you click on an element, the image URL is found but doesn't seem to be injected. Do you see anything wrong? Finally, about the animation of elements: as you can see in the source code, I'm using the container $('#wrapper') to position all animated children. What would be the right properties to apply to make this cross-browser proof?

    Here's the source code for the plugin, fully commented.

    (function($){
        $.fn.extend({
            //plugin name - fluidGrids
            fluidGrids: function(options) {
                //List and default values for available options
                var defaults = {
                    //The target that we're going to use to handle click event
                    hitTarget: '.animateMe',
                    exclude: '.exclude',
                    animateSpeed: 1000
                };
                var options = $.extend(defaults, options);
                return this.each(function() {
                    var o = options;
                    //Assign current element to variable
                    var obj = $(this);
                    //We assign default width height and positions to each object in order to get them back when necessary
                    var objPosition = obj.position();
                    //Get all ready to animate targets in innerViewport
                    var items = $(o.hitTarget, obj);
                    //Final coords of innerViewport
                    var innerViewport = new Object();
                    innerViewport.top = parseInt(objPosition.top);
                    innerViewport.left = parseInt(objPosition.left);
                    innerViewport.bottom = obj.height();
                    innerViewport.right = obj.width();
                    items.each(function(e){
                        //Assign a pointer cursor to each clickable element
                        $(this).css("cursor","pointer");
                        //To load distant url at last, we look for it in Title Attribute
                        if ($(this).attr('title').length){
                            var targetUrl = $(this).attr('title');
                        }
                        //We assign default width height and positions to each object in order to get them back when necessary
                        var itemPosition = $(this).position();
                        var itemTop = itemPosition.top;
                        var itemLeft = itemPosition.left;
                        var itemWidth = $(this).width();
                        var itemHeight = $(this).height();
                        //Both the original and its animated clone
                        var original = $(this);
                        //To give the
                        if (original.css('background-image')){
                            var urlImageOriginal = original.css('background-image').replace(/^url|[("")]/g, '');
                            var imageToInsert = "<img src="+urlImageOriginal+"/>"
                        }
                        var clone = $(this).clone();
                        original.click(function() {
                            $(clone).append(imageToInsert);
                            $(clone).attr("id","clone");
                            $(clone).attr('top',itemTop);
                            $(clone).attr('left',itemLeft);
                            $(clone).css("position","absolute");
                            $(clone).insertAfter(this);
                            $(this).hide();
                            $(clone).animate({
                                top: innerViewport.top,
                                left: innerViewport.left,
                                width: innerViewport.bottom,
                                height: innerViewport.right
                            }, obj.animateSpeed);
                            $("*",obj).not("#clone, #clone * , a , "+ o.exclude).fadeOut('fast');
                            //If the clicked object is a link
                            return false;
                        });
                    });
                });
            }
        });
    })(jQuery);
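
    Not part of the original plugin, but a minimal sketch of one way to approach the first two questions (finding a parent link's href and pulling a usable URL out of background-image), reusing the selector and variable names from the plugin above:

    // Sketch only: assumes the '.animateMe' targets described above; not from the original plugin.
    $('.animateMe').each(function(){
        // 1) If the target sits inside an <a>, grab that link's URL.
        var parentLink = $(this).closest('a');
        var targetUrl = parentLink.length ? parentLink.attr('href') : null;

        // 2) Strip the url("...") wrapper so the value can be dropped into an <img> tag.
        var bg = $(this).css('background-image') || '';
        var urlImageOriginal = bg.replace(/^url\(["']?/, '').replace(/["']?\)$/, '');
        var imageToInsert = $('<img/>').attr('src', urlImageOriginal);
    });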

    Read the article

  • Refactoring a leaf class to a base class, and keeping it also a interface implementation

    - by elcuco
    I am trying to refactor working code. The code basically derives an interface class into a working implementation, and I want to use this implementation outside the original project as a standalone class. However, I do not want to create a fork, and I want the original project to be able to take out their implementation and use mine. The problem is that the hierarchy structure is very different and I am not sure if this would work. I also cannot use the original base class in my project, since in reality it's quite entangled in the project (too many classes, includes) and I need to take care of only a subdomain of the problems the original project addresses. I wrote this code to test an idea of how to implement this, and while it works, I am not sure I like it:

    #include <iostream>

    // Original code is:
    // IBase -> Derived1
    // I need to refactor Derived2 to be both an independent class
    // and programmers should also be able to use the interface class
    // Derived2 -> MyClass + IBase
    // MyClass

    class IBase {
    public:
        virtual void printMsg() = 0;
    };

    ///////////////////////////////////////////////////
    class Derived1 : public IBase {
    public:
        virtual void printMsg(){ std::cout << "Hello from Derived 1" << std::endl; }
    };

    //////////////////////////////////////////////////
    class MyClass {
    public:
        virtual void printMsg(){ std::cout << "Hello from MyClass" << std::endl; }
    };

    class Derived2: public IBase, public MyClass{
        virtual void printMsg(){ MyClass::printMsg(); }
    };

    class Derived3: public MyClass, public IBase{
        virtual void printMsg(){ MyClass::printMsg(); }
    };

    int main() {
        IBase *o1 = new Derived1();
        IBase *o2 = new Derived2();
        IBase *o3 = new Derived3();
        MyClass *o4 = new MyClass();

        o1->printMsg();
        o2->printMsg();
        o3->printMsg();
        o4->printMsg();

        return 0;
    }

    The output works as expected (tested using gcc and clang, two different C++ implementations, so I think I am safe here):

    [elcuco@pinky ~/src/googlecode/qtedit4/tools/qtsourceview/qate/tests] ./test1
    Hello from Derived 1
    Hello from MyClass
    Hello from MyClass
    Hello from MyClass
    [elcuco@pinky ~/src/googlecode/qtedit4/tools/qtsourceview/qate/tests] ./test1.clang
    Hello from Derived 1
    Hello from MyClass
    Hello from MyClass
    Hello from MyClass

    The question is: my original code was

    class Derived3: public MyClass, public IBase{
        virtual void IBase::printMsg(){ MyClass::printMsg(); }
    };

    which is what I want to express, but it does not compile. I must admit I do not fully understand why this code works, as I expected that the new method Derived3::printMsg() would be an implementation of MyClass::printMsg() and not IBase::printMsg() (even though that is what I do want). How does the compiler choose which method to re-implement when two "sister classes" have the same virtual function name? If anyone has a better way of implementing this, I would like to know as well :)
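
    A note on the "how does the compiler choose" part: overriding is matched purely by signature, so the single Derived3::printMsg() overrides the virtual printMsg() declared in both bases at once; they cannot be overridden separately without intermediate classes. If the goal is just to let the original project consume MyClass through IBase while keeping MyClass standalone, a thin adapter is one possible alternative. A sketch against the classes above, not code from the original project:

    // Sketch: keeps MyClass free of any IBase dependency; only the adapter knows both types.
    class MyClassAdapter : public IBase {
    public:
        explicit MyClassAdapter(MyClass &impl) : impl_(impl) {}
        virtual void printMsg() { impl_.printMsg(); }   // explicit delegation, no ambiguity
    private:
        MyClass &impl_;
    };

    // Usage:
    //   MyClass standalone;
    //   MyClassAdapter adapted(standalone);
    //   IBase *viaInterface = &adapted;
    //   viaInterface->printMsg();   // prints "Hello from MyClass"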

    Read the article

  • My ASP.NET news sources

    - by Jon Galloway
    I just posted about the ASP.NET Daily Community Spotlight. I was going to list a bunch of my news sources at the end, but figured this deserves a separate post. I've been following a lot of development blogs for a long time - for a while I subscribed to over 1500 feeds and read them all. That doesn't scale very well, though, and it's really time consuming. Since the community spotlight requires an interesting ASP.NET post every day of the year, I've come up with a few sources of ASP.NET news.

    Top Link Blogs

    Chris Alcock's The Morning Brew is a must-read blog which highlights each day's best blog posts across the .NET community. He covers the entire Microsoft development space, but generally any of the top ASP.NET posts I see either have already been listed on The Morning Brew or will be there soon. Elijah Manor posts a lot of great content, which is available in his Twitter feed at @elijahmanor, on his Delicious feed, and on a dedicated website - Web Dev Tweets. While not 100% ASP.NET focused, I've been appreciating Joe Stagner's Weekly Links series, partly since he includes a lot of links that don't show up on my other lists.

    Twitter

    Over the past few years, I've been getting more and more of my information from my Twitter network (as opposed to RSS or other means). Twitter is only as good as your network, so if getting good information off Twitter sounds crazy, you're probably not following the right people. I already mentioned Elijah Manor (@elijahmanor). I follow over a thousand people on Twitter, so I'm not going to try to pick and choose a list, but one good way to get started building out a Twitter network is to follow active Twitter users on the ASP.NET team at Microsoft: @scottgu (well, not on the ASP.NET team, but their great-grand boss, and always a great source of ASP.NET info), @shanselman, @haacked, @bradwilson, @davidfowl, @InfinitiesLoop, @davidebbo, @marcind, @DamianEdwards, @stevensanderson, @bleroy, @humancompiler, @osbornm, and @anurse. I'm sure I'm missing a few, and I'll update the list. Building a Twitter network that follows topics you're interested in allows you to use other tools like Cadmus to automatically summarize top content by leveraging the collective input of many users.

    Twitter Search with Topsy

    You can search Twitter for hashtags (like #aspnet, #aspnetmvc, and #webmatrix) to get a raw view of what people are talking about on Twitter. Twitter's search is pretty poor; I prefer Topsy. Here's an example search for the #aspnetmvc hashtag: http://topsy.com/s?q=%23aspnetmvc You can also do combined queries for several tags: http://topsy.com/s?q=%23aspnetmvc+OR+%23aspnet+OR+%23webmatrix

    Paper.li

    Paper.li is a handy service that builds a custom daily newspaper based on your social network. They've turned a lot of people off by automatically tweeting "The SuperDevFoo Daily is out!!!" messages (which can be turned off), but if you're ignoring them because of those messages, you're missing out on a handy, free service. My paper.li page includes content across a lot of interests, including ASP.NET: http://paper.li/jongalloway When I want to drill into a specific tag, though, I'll just look at the Paper.li page for that hashtag. For example, here's the #aspnetmvc paper.li page: http://paper.li/tag/aspnetmvc

    Delicious

    I mentioned previously that I use Delicious for managing site links. I also use their network and search features. The tag-based search is pretty good. Even better, though, is that I can see who's bookmarked these links, and add them to my Delicious network. After having built out a network, I can optimize by doing less searching and more leveraging of collective intelligence.

    Community Sites

    I scan DotNetKicks, the weblogs.asp.net combined feed, the ASP.NET Community page, CodeBetter, Los Techies, CodeProject, and DotNetSlackers from time to time. They're hit and miss, but they do offer more of an opportunity for finding original content which others may have missed.

    Terms of Enrampagement

    When someone's on a tear, I just manually check their sites more often. I could use RSS for that, but it changes pretty often. I just keep a mental note of people who are cranking out a lot of good content and check their sites more often. What works for you?

    Read the article

  • virtualbox, MAAS: help needed

    - by Roberto Attias
    Ok, I made some progress with respect to the original question (still below). I found that /etc/maas/dhcpd.conf contained option domain-name-servers 10.0.3.15, and changed it to 192.168.0.11. After restarting the daemon, I now see "node" getting the right DNS; unfortunately, this doesn't fix the main problem, which I believe is the reference to 169.254.169.254. It does introduce a new question: while the remaining information from /etc/maas/dhcpd.conf is present in the MAAS GUI, there is no field to enter the DNS address. Why? Anyway, my original problem still stands... Any idea? The original question follows.

    In VirtualBox, I have:

    master VM: Ubuntu 12.04.3 server
      eth0: Internal Network, IP = 192.168.0.11
      eth1: NAT, IP = 10.0.3.15
      eth2: Host-only, IP = 192.168.56.102
      running the MAAS region and cluster controller, with DHCP and DNS enabled
    node VM:
      eth0: Internal Network

    The node VM boots via PXE. DHCP succeeds, and the boot process starts, but during boot I see some issues. One of them is "disk drive not ready yet or not present" for / and /tmp. I've googled this issue, and some people say it happens when the physical disk is an SSD, which is my case. Anyway, the system seems to recover from this eventually. Immediately after, it starts printing a lot of messages of the form:

    2013-10-01 16:52:37,142 - url_helper.py[WARNING]: Calling 'http://169.254.168.254/2009-04-04/meta-data/instance-id failed [x/y]: url error [[Errno 113] No route to host]

    That IP address is clearly bogus; I'm not sure where it came from. Before that point, I had seen the following network configuration:

    address: 192.168.0.100
    broadcast: 192.168.0.255
    netmask: 255.255.255.0
    gateway: 192.168.0.1
    dns0: 10.0.3.15
    dns1: 0.0.0.0

    Not sure if related, but the DNS doesn't seem right, as the node doesn't have an interface to reach 10.0.3.15. If that's the problem, what should I change to have the DNS point to 192.168.0.11? Thanks, Roberto
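
    For reference, the relevant part of the MAAS-managed /etc/maas/dhcpd.conf would look roughly like the sketch below (standard ISC dhcpd syntax; the range line is illustrative, and only the domain-name-servers option is the point here):

    subnet 192.168.0.0 netmask 255.255.255.0 {
        range 192.168.0.100 192.168.0.200;            # illustrative
        option routers 192.168.0.1;
        option domain-name-servers 192.168.0.11;      # region controller's Internal Network address
    }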

    Read the article

  • Silverlight Recruiting Application Part 5 - Jobs Module / View

    Now we starting getting into a more code-heavy portion of this series, thankfully though this means the groundwork is all set for the most part and after adding the modules we will have a complete application that can be provided with full source. The Jobs module will have two concerns- adding and maintaining jobs that can then be broadcast out to the website. How they are displayed on the site will be handled by our admin system (which will just poll from this common database), so we aren't too concerned with that, but rather with getting the information into the system and allowing the backend administration/HR users to keep things up to date. Since there is a fair bit of information that we want to display, we're going to move editing to a separate view so we can get all that information in an easy-to-use spot. With all the files created for this module, the project looks something like this: And now... on to the code. XAML for the Job Posting View All we really need for the Job Posting View is a RadGridView and a few buttons. This will let us both show off records and perform operations on the records without much hassle. That XAML is going to look something like this: 01.<Grid x:Name="LayoutRoot" 02.Background="White"> 03.<Grid.RowDefinitions> 04.<RowDefinition Height="30" /> 05.<RowDefinition /> 06.</Grid.RowDefinitions> 07.<StackPanel Orientation="Horizontal"> 08.<Button x:Name="xAddRecordButton" 09.Content="Add Job" 10.Width="120" 11.cal:Click.Command="{Binding AddRecord}" 12.telerik:StyleManager.Theme="Windows7" /> 13.<Button x:Name="xEditRecordButton" 14.Content="Edit Job" 15.Width="120" 16.cal:Click.Command="{Binding EditRecord}" 17.telerik:StyleManager.Theme="Windows7" /> 18.</StackPanel> 19.<telerikGrid:RadGridView x:Name="xJobsGrid" 20.Grid.Row="1" 21.IsReadOnly="True" 22.AutoGenerateColumns="False" 23.ColumnWidth="*" 24.RowDetailsVisibilityMode="VisibleWhenSelected" 25.ItemsSource="{Binding MyJobs}" 26.SelectedItem="{Binding SelectedJob, Mode=TwoWay}" 27.command:SelectedItemChangedEventClass.Command="{Binding SelectedItemChanged}"> 28.<telerikGrid:RadGridView.Columns> 29.<telerikGrid:GridViewDataColumn Header="Job Title" 30.DataMemberBinding="{Binding JobTitle}" 31.UniqueName="JobTitle" /> 32.<telerikGrid:GridViewDataColumn Header="Location" 33.DataMemberBinding="{Binding Location}" 34.UniqueName="Location" /> 35.<telerikGrid:GridViewDataColumn Header="Resume Required" 36.DataMemberBinding="{Binding NeedsResume}" 37.UniqueName="NeedsResume" /> 38.<telerikGrid:GridViewDataColumn Header="CV Required" 39.DataMemberBinding="{Binding NeedsCV}" 40.UniqueName="NeedsCV" /> 41.<telerikGrid:GridViewDataColumn Header="Overview Required" 42.DataMemberBinding="{Binding NeedsOverview}" 43.UniqueName="NeedsOverview" /> 44.<telerikGrid:GridViewDataColumn Header="Active" 45.DataMemberBinding="{Binding IsActive}" 46.UniqueName="IsActive" /> 47.</telerikGrid:RadGridView.Columns> 48.</telerikGrid:RadGridView> 49.</Grid> I'll explain what's happening here by line numbers: Lines 11 and 16: Using the same type of click commands as we saw in the Menu module, we tie the button clicks to delegate commands in the viewmodel. Line 25: The source for the jobs will be a collection in the viewmodel. Line 26: We also bind the selected item to a public property from the viewmodel for use in code. Line 27: We've turned the event into a command so we can handle it via code in the viewmodel. 
So those first three probably make sense to you as far as Silverlight/WPF binding magic is concerned, but for line 27... This actually comes from something I read onDamien Schenkelman's blog back in the day for creating an attached behavior from any event. So, any time you see me using command:Whatever.Command, the backing for it is actually something like this: SelectedItemChangedEventBehavior.cs: 01.public class SelectedItemChangedEventBehavior : CommandBehaviorBase<Telerik.Windows.Controls.DataControl> 02.{ 03.public SelectedItemChangedEventBehavior(DataControl element) 04.: base(element) 05.{ 06.element.SelectionChanged += new EventHandler<SelectionChangeEventArgs>(element_SelectionChanged); 07.} 08.void element_SelectionChanged(object sender, SelectionChangeEventArgs e) 09.{ 10.// We'll only ever allow single selection, so will only need item index 0 11.base.CommandParameter = e.AddedItems[0]; 12.base.ExecuteCommand(); 13.} 14.} SelectedItemChangedEventClass.cs: 01.public class SelectedItemChangedEventClass 02.{ 03.#region The Command Stuff 04.public static ICommand GetCommand(DependencyObject obj) 05.{ 06.return (ICommand)obj.GetValue(CommandProperty); 07.} 08.public static void SetCommand(DependencyObject obj, ICommand value) 09.{ 10.obj.SetValue(CommandProperty, value); 11.} 12.public static readonly DependencyProperty CommandProperty = 13.DependencyProperty.RegisterAttached("Command", typeof(ICommand), 14.typeof(SelectedItemChangedEventClass), new PropertyMetadata(OnSetCommandCallback)); 15.public static void OnSetCommandCallback(DependencyObject dependencyObject, DependencyPropertyChangedEventArgs e) 16.{ 17.DataControl element = dependencyObject as DataControl; 18.if (element != null) 19.{ 20.SelectedItemChangedEventBehavior behavior = GetOrCreateBehavior(element); 21.behavior.Command = e.NewValue as ICommand; 22.} 23.} 24.#endregion 25.public static SelectedItemChangedEventBehavior GetOrCreateBehavior(DataControl element) 26.{ 27.SelectedItemChangedEventBehavior behavior = element.GetValue(SelectedItemChangedEventBehaviorProperty) as SelectedItemChangedEventBehavior; 28.if (behavior == null) 29.{ 30.behavior = new SelectedItemChangedEventBehavior(element); 31.element.SetValue(SelectedItemChangedEventBehaviorProperty, behavior); 32.} 33.return behavior; 34.} 35.public static SelectedItemChangedEventBehavior GetSelectedItemChangedEventBehavior(DependencyObject obj) 36.{ 37.return (SelectedItemChangedEventBehavior)obj.GetValue(SelectedItemChangedEventBehaviorProperty); 38.} 39.public static void SetSelectedItemChangedEventBehavior(DependencyObject obj, SelectedItemChangedEventBehavior value) 40.{ 41.obj.SetValue(SelectedItemChangedEventBehaviorProperty, value); 42.} 43.public static readonly DependencyProperty SelectedItemChangedEventBehaviorProperty = 44.DependencyProperty.RegisterAttached("SelectedItemChangedEventBehavior", 45.typeof(SelectedItemChangedEventBehavior), typeof(SelectedItemChangedEventClass), null); 46.} These end up looking very similar from command to command, but in a nutshell you create a command based on any event, determine what the parameter for it will be, then execute. It attaches via XAML and ties to a DelegateCommand in the viewmodel, so you get the full event experience (since some controls get a bit event-rich for added functionality). Simple enough, right? 
Viewmodel for the Job Posting View The Viewmodel is going to need to handle all events going back and forth, maintaining interactions with the data we are using, and both publishing and subscribing to events. Rather than breaking this into tons of little pieces, I'll give you a nice view of the entire viewmodel and then hit up the important points line-by-line: 001.public class JobPostingViewModel : ViewModelBase 002.{ 003.private readonly IEventAggregator eventAggregator; 004.private readonly IRegionManager regionManager; 005.public DelegateCommand<object> AddRecord { get; set; } 006.public DelegateCommand<object> EditRecord { get; set; } 007.public DelegateCommand<object> SelectedItemChanged { get; set; } 008.public RecruitingContext context; 009.private QueryableCollectionView _myJobs; 010.public QueryableCollectionView MyJobs 011.{ 012.get { return _myJobs; } 013.} 014.private QueryableCollectionView _selectionJobActionHistory; 015.public QueryableCollectionView SelectedJobActionHistory 016.{ 017.get { return _selectionJobActionHistory; } 018.} 019.private JobPosting _selectedJob; 020.public JobPosting SelectedJob 021.{ 022.get { return _selectedJob; } 023.set 024.{ 025.if (value != _selectedJob) 026.{ 027._selectedJob = value; 028.NotifyChanged("SelectedJob"); 029.} 030.} 031.} 032.public SubscriptionToken editToken = new SubscriptionToken(); 033.public SubscriptionToken addToken = new SubscriptionToken(); 034.public JobPostingViewModel(IEventAggregator eventAgg, IRegionManager regionmanager) 035.{ 036.// set Unity items 037.this.eventAggregator = eventAgg; 038.this.regionManager = regionmanager; 039.// load our context 040.context = new RecruitingContext(); 041.this._myJobs = new QueryableCollectionView(context.JobPostings); 042.context.Load(context.GetJobPostingsQuery()); 043.// set command events 044.this.AddRecord = new DelegateCommand<object>(this.AddNewRecord); 045.this.EditRecord = new DelegateCommand<object>(this.EditExistingRecord); 046.this.SelectedItemChanged = new DelegateCommand<object>(this.SelectedRecordChanged); 047.SetSubscriptions(); 048.} 049.#region DelegateCommands from View 050.public void AddNewRecord(object obj) 051.{ 052.this.eventAggregator.GetEvent<AddJobEvent>().Publish(true); 053.} 054.public void EditExistingRecord(object obj) 055.{ 056.if (_selectedJob == null) 057.{ 058.this.eventAggregator.GetEvent<NotifyUserEvent>().Publish("No job selected."); 059.} 060.else 061.{ 062.this._myJobs.EditItem(this._selectedJob); 063.this.eventAggregator.GetEvent<EditJobEvent>().Publish(this._selectedJob); 064.} 065.} 066.public void SelectedRecordChanged(object obj) 067.{ 068.if (obj.GetType() == typeof(ActionHistory)) 069.{ 070.// event bubbles up so we don't catch items from the ActionHistory grid 071.} 072.else 073.{ 074.JobPosting job = obj as JobPosting; 075.GrabHistory(job.PostingID); 076.} 077.} 078.#endregion 079.#region Subscription Declaration and Events 080.public void SetSubscriptions() 081.{ 082.EditJobCompleteEvent editComplete = eventAggregator.GetEvent<EditJobCompleteEvent>(); 083.if (editToken != null) 084.editComplete.Unsubscribe(editToken); 085.editToken = editComplete.Subscribe(this.EditCompleteEventHandler); 086.AddJobCompleteEvent addComplete = eventAggregator.GetEvent<AddJobCompleteEvent>(); 087.if (addToken != null) 088.addComplete.Unsubscribe(addToken); 089.addToken = addComplete.Subscribe(this.AddCompleteEventHandler); 090.} 091.public void EditCompleteEventHandler(bool complete) 092.{ 093.if (complete) 094.{ 095.JobPosting thisJob = 
_myJobs.CurrentEditItem as JobPosting; 096.this._myJobs.CommitEdit(); 097.this.context.SubmitChanges((s) => 098.{ 099.ActionHistory myAction = new ActionHistory(); 100.myAction.PostingID = thisJob.PostingID; 101.myAction.Description = String.Format("Job '{0}' has been edited by {1}", thisJob.JobTitle, "default user"); 102.myAction.TimeStamp = DateTime.Now; 103.eventAggregator.GetEvent<AddActionEvent>().Publish(myAction); 104.} 105., null); 106.} 107.else 108.{ 109.this._myJobs.CancelEdit(); 110.} 111.this.MakeMeActive(this.regionManager, "MainRegion", "JobPostingsView"); 112.} 113.public void AddCompleteEventHandler(JobPosting job) 114.{ 115.if (job == null) 116.{ 117.// do nothing, new job add cancelled 118.} 119.else 120.{ 121.this.context.JobPostings.Add(job); 122.this.context.SubmitChanges((s) => 123.{ 124.ActionHistory myAction = new ActionHistory(); 125.myAction.PostingID = job.PostingID; 126.myAction.Description = String.Format("Job '{0}' has been added by {1}", job.JobTitle, "default user"); 127.myAction.TimeStamp = DateTime.Now; 128.eventAggregator.GetEvent<AddActionEvent>().Publish(myAction); 129.} 130., null); 131.} 132.this.MakeMeActive(this.regionManager, "MainRegion", "JobPostingsView"); 133.} 134.#endregion 135.public void GrabHistory(int postID) 136.{ 137.context.ActionHistories.Clear(); 138._selectionJobActionHistory = new QueryableCollectionView(context.ActionHistories); 139.context.Load(context.GetHistoryForJobQuery(postID)); 140.} Taking it from the top, we're injecting an Event Aggregator and Region Manager for use down the road and also have the public DelegateCommands (just like in the Menu module). We also grab a reference to our context, which we'll obviously need for data, then set up a few fields with public properties tied to them. We're also setting subscription tokens, which we have not yet seen but I will get into below. The AddNewRecord (50) and EditExistingRecord (54) methods should speak for themselves for functionality, the one thing of note is we're sending events off to the Event Aggregator which some module, somewhere will take care of. Since these aren't entirely relying on one another, the Jobs View doesn't care if anyone is listening, but it will publish AddJobEvent (52), NotifyUserEvent (58) and EditJobEvent (63)regardless. Don't mind the GrabHistory() method so much, that is just grabbing history items (visibly being created in the SubmitChanges callbacks), and adding them to the database. Every action will trigger a history event, so we'll know who modified what and when, just in case. ;) So where are we at? Well, if we click to Add a job, we publish an event, if we edit a job, we publish an event with the selected record (attained through the magic of binding). Where is this all going though? To the Viewmodel, of course! 
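
As an aside, the post never shows the event classes themselves (AddJobEvent, EditJobEvent, and so on). In Prism they are usually one-line CompositePresentationEvent subclasses; here is a sketch with the payload types inferred from the Publish/Subscribe calls above, so treat the exact namespace and types as assumptions:

    using Microsoft.Practices.Composite.Presentation.Events; // Prism v2 for Silverlight; later versions use Microsoft.Practices.Prism.Events

    // Payload types inferred from the Publish(...) calls and handler signatures above.
    public class AddJobEvent : CompositePresentationEvent<bool> { }
    public class EditJobEvent : CompositePresentationEvent<JobPosting> { }
    public class AddJobCompleteEvent : CompositePresentationEvent<JobPosting> { }
    public class EditJobCompleteEvent : CompositePresentationEvent<bool> { }
    public class NotifyUserEvent : CompositePresentationEvent<string> { }
    public class AddActionEvent : CompositePresentationEvent<ActionHistory> { }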
XAML for the AddEditJobView This is pretty straightforward except for one thing, noted below: 001.<Grid x:Name="LayoutRoot" 002.Background="White"> 003.<Grid x:Name="xEditGrid" 004.Margin="10" 005.validationHelper:ValidationScope.Errors="{Binding Errors}"> 006.<Grid.Background> 007.<LinearGradientBrush EndPoint="0.5,1" 008.StartPoint="0.5,0"> 009.<GradientStop Color="#FFC7C7C7" 010.Offset="0" /> 011.<GradientStop Color="#FFF6F3F3" 012.Offset="1" /> 013.</LinearGradientBrush> 014.</Grid.Background> 015.<Grid.RowDefinitions> 016.<RowDefinition Height="40" /> 017.<RowDefinition Height="40" /> 018.<RowDefinition Height="40" /> 019.<RowDefinition Height="100" /> 020.<RowDefinition Height="100" /> 021.<RowDefinition Height="100" /> 022.<RowDefinition Height="40" /> 023.<RowDefinition Height="40" /> 024.<RowDefinition Height="40" /> 025.</Grid.RowDefinitions> 026.<Grid.ColumnDefinitions> 027.<ColumnDefinition Width="150" /> 028.<ColumnDefinition Width="150" /> 029.<ColumnDefinition Width="300" /> 030.<ColumnDefinition Width="100" /> 031.</Grid.ColumnDefinitions> 032.<!-- Title --> 033.<TextBlock Margin="8" 034.Text="{Binding AddEditString}" 035.TextWrapping="Wrap" 036.Grid.Column="1" 037.Grid.ColumnSpan="2" 038.FontSize="16" /> 039.<!-- Data entry area--> 040. 041.<TextBlock Margin="8,0,0,0" 042.Style="{StaticResource LabelTxb}" 043.Grid.Row="1" 044.Text="Job Title" 045.VerticalAlignment="Center" /> 046.<TextBox x:Name="xJobTitleTB" 047.Margin="0,8" 048.Grid.Column="1" 049.Grid.Row="1" 050.Text="{Binding activeJob.JobTitle, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 051.Grid.ColumnSpan="2" /> 052.<TextBlock Margin="8,0,0,0" 053.Grid.Row="2" 054.Text="Location" 055.d:LayoutOverrides="Height" 056.VerticalAlignment="Center" /> 057.<TextBox x:Name="xLocationTB" 058.Margin="0,8" 059.Grid.Column="1" 060.Grid.Row="2" 061.Text="{Binding activeJob.Location, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 062.Grid.ColumnSpan="2" /> 063. 064.<TextBlock Margin="8,11,8,0" 065.Grid.Row="3" 066.Text="Description" 067.TextWrapping="Wrap" 068.VerticalAlignment="Top" /> 069. 070.<TextBox x:Name="xDescriptionTB" 071.Height="84" 072.TextWrapping="Wrap" 073.ScrollViewer.VerticalScrollBarVisibility="Auto" 074.Grid.Column="1" 075.Grid.Row="3" 076.Text="{Binding activeJob.Description, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 077.Grid.ColumnSpan="2" /> 078.<TextBlock Margin="8,11,8,0" 079.Grid.Row="4" 080.Text="Requirements" 081.TextWrapping="Wrap" 082.VerticalAlignment="Top" /> 083. 084.<TextBox x:Name="xRequirementsTB" 085.Height="84" 086.TextWrapping="Wrap" 087.ScrollViewer.VerticalScrollBarVisibility="Auto" 088.Grid.Column="1" 089.Grid.Row="4" 090.Text="{Binding activeJob.Requirements, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 091.Grid.ColumnSpan="2" /> 092.<TextBlock Margin="8,11,8,0" 093.Grid.Row="5" 094.Text="Qualifications" 095.TextWrapping="Wrap" 096.VerticalAlignment="Top" /> 097. 098.<TextBox x:Name="xQualificationsTB" 099.Height="84" 100.TextWrapping="Wrap" 101.ScrollViewer.VerticalScrollBarVisibility="Auto" 102.Grid.Column="1" 103.Grid.Row="5" 104.Text="{Binding activeJob.Qualifications, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 105.Grid.ColumnSpan="2" /> 106.<!-- Requirements Checkboxes--> 107. 
108.<CheckBox x:Name="xResumeRequiredCB" Margin="8,8,8,15" 109.Content="Resume Required" 110.Grid.Row="6" 111.Grid.ColumnSpan="2" 112.IsChecked="{Binding activeJob.NeedsResume, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/> 113. 114.<CheckBox x:Name="xCoverletterRequiredCB" Margin="8,8,8,15" 115.Content="Cover Letter Required" 116.Grid.Column="2" 117.Grid.Row="6" 118.IsChecked="{Binding activeJob.NeedsCV, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/> 119. 120.<CheckBox x:Name="xOverviewRequiredCB" Margin="8,8,8,15" 121.Content="Overview Required" 122.Grid.Row="7" 123.Grid.ColumnSpan="2" 124.IsChecked="{Binding activeJob.NeedsOverview, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/> 125. 126.<CheckBox x:Name="xJobActiveCB" Margin="8,8,8,15" 127.Content="Job is Active" 128.Grid.Column="2" 129.Grid.Row="7" 130.IsChecked="{Binding activeJob.IsActive, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/> 131. 132.<!-- Buttons --> 133. 134.<Button x:Name="xAddEditButton" Margin="8,8,0,10" 135.Content="{Binding AddEditButtonString}" 136.cal:Click.Command="{Binding AddEditCommand}" 137.Grid.Column="2" 138.Grid.Row="8" 139.HorizontalAlignment="Left" 140.Width="125" 141.telerik:StyleManager.Theme="Windows7" /> 142. 143.<Button x:Name="xCancelButton" HorizontalAlignment="Right" 144.Content="Cancel" 145.cal:Click.Command="{Binding CancelCommand}" 146.Margin="0,8,8,10" 147.Width="125" 148.Grid.Column="2" 149.Grid.Row="8" 150.telerik:StyleManager.Theme="Windows7" /> 151.</Grid> 152.</Grid> The 'validationHelper:ValidationScope' line may seem odd. This is a handy little trick for catching current and would-be validation errors when working in this whole setup. This all comes from an approach found on theJoy Of Code blog, although it looks like the story for this will be changing slightly with new advances in SL4/WCF RIA Services, so this section can definitely get an overhaul a little down the road. The code is the fun part of all this, so let us see what's happening under the hood. Viewmodel for the AddEditJobView We are going to see some of the same things happening here, so I'll skip over the repeat info and get right to the good stuff: 001.public class AddEditJobViewModel : ViewModelBase 002.{ 003.private readonly IEventAggregator eventAggregator; 004.private readonly IRegionManager regionManager; 005. 006.public RecruitingContext context; 007. 008.private JobPosting _activeJob; 009.public JobPosting activeJob 010.{ 011.get { return _activeJob; } 012.set 013.{ 014.if (_activeJob != value) 015.{ 016._activeJob = value; 017.NotifyChanged("activeJob"); 018.} 019.} 020.} 021. 022.public bool isNewJob; 023. 024.private string _addEditString; 025.public string AddEditString 026.{ 027.get { return _addEditString; } 028.set 029.{ 030.if (_addEditString != value) 031.{ 032._addEditString = value; 033.NotifyChanged("AddEditString"); 034.} 035.} 036.} 037. 038.private string _addEditButtonString; 039.public string AddEditButtonString 040.{ 041.get { return _addEditButtonString; } 042.set 043.{ 044.if (_addEditButtonString != value) 045.{ 046._addEditButtonString = value; 047.NotifyChanged("AddEditButtonString"); 048.} 049.} 050.} 051. 052.public SubscriptionToken addJobToken = new SubscriptionToken(); 053.public SubscriptionToken editJobToken = new SubscriptionToken(); 054. 055.public DelegateCommand<object> AddEditCommand { get; set; } 056.public DelegateCommand<object> CancelCommand { get; set; } 057. 
058.private ObservableCollection<ValidationError> _errors = new ObservableCollection<ValidationError>(); 059.public ObservableCollection<ValidationError> Errors 060.{ 061.get { return _errors; } 062.} 063. 064.private ObservableCollection<ValidationResult> _valResults = new ObservableCollection<ValidationResult>(); 065.public ObservableCollection<ValidationResult> ValResults 066.{ 067.get { return this._valResults; } 068.} 069. 070.public AddEditJobViewModel(IEventAggregator eventAgg, IRegionManager regionmanager) 071.{ 072.// set Unity items 073.this.eventAggregator = eventAgg; 074.this.regionManager = regionmanager; 075. 076.context = new RecruitingContext(); 077. 078.AddEditCommand = new DelegateCommand<object>(this.AddEditJobCommand); 079.CancelCommand = new DelegateCommand<object>(this.CancelAddEditCommand); 080. 081.SetSubscriptions(); 082.} 083. 084.#region Subscription Declaration and Events 085. 086.public void SetSubscriptions() 087.{ 088.AddJobEvent addJob = this.eventAggregator.GetEvent<AddJobEvent>(); 089. 090.if (addJobToken != null) 091.addJob.Unsubscribe(addJobToken); 092. 093.addJobToken = addJob.Subscribe(this.AddJobEventHandler); 094. 095.EditJobEvent editJob = this.eventAggregator.GetEvent<EditJobEvent>(); 096. 097.if (editJobToken != null) 098.editJob.Unsubscribe(editJobToken); 099. 100.editJobToken = editJob.Subscribe(this.EditJobEventHandler); 101.} 102. 103.public void AddJobEventHandler(bool isNew) 104.{ 105.this.activeJob = null; 106.this.activeJob = new JobPosting(); 107.this.activeJob.IsActive = true; // We assume that we want a new job to go up immediately 108.this.isNewJob = true; 109.this.AddEditString = "Add New Job Posting"; 110.this.AddEditButtonString = "Add Job"; 111. 112.MakeMeActive(this.regionManager, "MainRegion", "AddEditJobView"); 113.} 114. 115.public void EditJobEventHandler(JobPosting editJob) 116.{ 117.this.activeJob = null; 118.this.activeJob = editJob; 119.this.isNewJob = false; 120.this.AddEditString = "Edit Job Posting"; 121.this.AddEditButtonString = "Edit Job"; 122. 123.MakeMeActive(this.regionManager, "MainRegion", "AddEditJobView"); 124.} 125. 126.#endregion 127. 128.#region DelegateCommands from View 129. 130.public void AddEditJobCommand(object obj) 131.{ 132.if (this.Errors.Count > 0) 133.{ 134.List<string> errorMessages = new List<string>(); 135. 136.foreach (var valR in this.Errors) 137.{ 138.errorMessages.Add(valR.Exception.Message); 139.} 140. 141.this.eventAggregator.GetEvent<DisplayValidationErrorsEvent>().Publish(errorMessages); 142. 143.} 144.else if (!Validator.TryValidateObject(this.activeJob, new ValidationContext(this.activeJob, null, null), _valResults, true)) 145.{ 146.List<string> errorMessages = new List<string>(); 147. 148.foreach (var valR in this._valResults) 149.{ 150.errorMessages.Add(valR.ErrorMessage); 151.} 152. 153.this._valResults.Clear(); 154. 155.this.eventAggregator.GetEvent<DisplayValidationErrorsEvent>().Publish(errorMessages); 156.} 157.else 158.{ 159.if (this.isNewJob) 160.{ 161.this.eventAggregator.GetEvent<AddJobCompleteEvent>().Publish(this.activeJob); 162.} 163.else 164.{ 165.this.eventAggregator.GetEvent<EditJobCompleteEvent>().Publish(true); 166.} 167.} 168.} 169. 170.public void CancelAddEditCommand(object obj) 171.{ 172.if (this.isNewJob) 173.{ 174.this.eventAggregator.GetEvent<AddJobCompleteEvent>().Publish(null); 175.} 176.else 177.{ 178.this.eventAggregator.GetEvent<EditJobCompleteEvent>().Publish(false); 179.} 180.} 181. 
182.#endregion 183.} 184.} We start seeing something new on line 103- the AddJobEventHandler will create a new job and set that to the activeJob item on the ViewModel. When this is all set, the view calls that familiar MakeMeActive method to activate itself. I made a bit of a management call on making views self-activate like this, but I figured it works for one reason. As I create this application, views may not exist that I have in mind, so after a view receives its 'ping' from being subscribed to an event, it prepares whatever it needs to do and then goes active. This way if I don't have 'edit' hooked up, I can click as the day is long on the main view and won't get lost in an empty region. Total personal preference here. :) Everything else should again be pretty straightforward, although I do a bit of validation checking in the AddEditJobCommand, which can either fire off an event back to the main view/viewmodel if everything is a success or sent a list of errors to our notification module, which pops open a RadWindow with the alerts if any exist. As a bonus side note, here's what my WCF RIA Services metadata looks like for handling all of the validation: private JobPostingMetadata() { } [StringLength(2500, ErrorMessage = "Description should be more than one and less than 2500 characters.", MinimumLength = 1)] [Required(ErrorMessage = "Description is required.")] public string Description; [Required(ErrorMessage="Active Status is Required")] public bool IsActive; [StringLength(100, ErrorMessage = "Posting title must be more than 3 but less than 100 characters.", MinimumLength = 3)] [Required(ErrorMessage = "Job Title is required.")] public bool JobTitle; [Required] public string Location; public bool NeedsCV; public bool NeedsOverview; public bool NeedsResume; public int PostingID; [Required(ErrorMessage="Qualifications are required.")] [StringLength(2500, ErrorMessage="Qualifications should be more than one and less than 2500 characters.", MinimumLength=1)] public string Qualifications; [StringLength(2500, ErrorMessage = "Requirements should be more than one and less than 2500 characters.", MinimumLength = 1)] [Required(ErrorMessage="Requirements are required.")] public string Requirements;   The RecruitCB Alternative See all that Xaml I pasted above? Those are now two pieces sitting in the JobsView.xaml file now. The only real difference is that the xEditGrid now sits in the same place as xJobsGrid, with visibility swapping out between the two for a quick switch. I also took out all the cal: and command: command references and replaced Button events with clicks and the Grid selection command replaced with a SelectedItemChanged event. Also, at the bottom of the xEditGrid after the last button, I add a ValidationSummary (with Visibility=Collapsed) to catch any errors that are popping up. 
Simple as can be, and leads to this being the single code-behind file: 001.public partial class JobsView : UserControl 002.{ 003.public RecruitingContext context; 004.public JobPosting activeJob; 005.public bool isNew; 006.private ObservableCollection<ValidationResult> _valResults = new ObservableCollection<ValidationResult>(); 007.public ObservableCollection<ValidationResult> ValResults 008.{ 009.get { return this._valResults; } 010.} 011.public JobsView() 012.{ 013.InitializeComponent(); 014.this.Loaded += new RoutedEventHandler(JobsView_Loaded); 015.} 016.void JobsView_Loaded(object sender, RoutedEventArgs e) 017.{ 018.context = new RecruitingContext(); 019.xJobsGrid.ItemsSource = context.JobPostings; 020.context.Load(context.GetJobPostingsQuery()); 021.} 022.private void xAddRecordButton_Click(object sender, RoutedEventArgs e) 023.{ 024.activeJob = new JobPosting(); 025.isNew = true; 026.xAddEditTitle.Text = "Add a Job Posting"; 027.xAddEditButton.Content = "Add"; 028.xEditGrid.DataContext = activeJob; 029.HideJobsGrid(); 030.} 031.private void xEditRecordButton_Click(object sender, RoutedEventArgs e) 032.{ 033.activeJob = xJobsGrid.SelectedItem as JobPosting; 034.isNew = false; 035.xAddEditTitle.Text = "Edit a Job Posting"; 036.xAddEditButton.Content = "Edit"; 037.xEditGrid.DataContext = activeJob; 038.HideJobsGrid(); 039.} 040.private void xAddEditButton_Click(object sender, RoutedEventArgs e) 041.{ 042.if (!Validator.TryValidateObject(this.activeJob, new ValidationContext(this.activeJob, null, null), _valResults, true)) 043.{ 044.List<string> errorMessages = new List<string>(); 045.foreach (var valR in this._valResults) 046.{ 047.errorMessages.Add(valR.ErrorMessage); 048.} 049.this._valResults.Clear(); 050.ShowErrors(errorMessages); 051.} 052.else if (xSummary.Errors.Count > 0) 053.{ 054.List<string> errorMessages = new List<string>(); 055.foreach (var err in xSummary.Errors) 056.{ 057.errorMessages.Add(err.Message); 058.} 059.ShowErrors(errorMessages); 060.} 061.else 062.{ 063.if (this.isNew) 064.{ 065.context.JobPostings.Add(activeJob); 066.context.SubmitChanges((s) => 067.{ 068.ActionHistory thisAction = new ActionHistory(); 069.thisAction.PostingID = activeJob.PostingID; 070.thisAction.Description = String.Format("Job '{0}' has been edited by {1}", activeJob.JobTitle, "default user"); 071.thisAction.TimeStamp = DateTime.Now; 072.context.ActionHistories.Add(thisAction); 073.context.SubmitChanges(); 074.}, null); 075.} 076.else 077.{ 078.context.SubmitChanges((s) => 079.{ 080.ActionHistory thisAction = new ActionHistory(); 081.thisAction.PostingID = activeJob.PostingID; 082.thisAction.Description = String.Format("Job '{0}' has been added by {1}", activeJob.JobTitle, "default user"); 083.thisAction.TimeStamp = DateTime.Now; 084.context.ActionHistories.Add(thisAction); 085.context.SubmitChanges(); 086.}, null); 087.} 088.ShowJobsGrid(); 089.} 090.} 091.private void xCancelButton_Click(object sender, RoutedEventArgs e) 092.{ 093.ShowJobsGrid(); 094.} 095.private void ShowJobsGrid() 096.{ 097.xAddEditRecordButtonPanel.Visibility = Visibility.Visible; 098.xEditGrid.Visibility = Visibility.Collapsed; 099.xJobsGrid.Visibility = Visibility.Visible; 100.} 101.private void HideJobsGrid() 102.{ 103.xAddEditRecordButtonPanel.Visibility = Visibility.Collapsed; 104.xJobsGrid.Visibility = Visibility.Collapsed; 105.xEditGrid.Visibility = Visibility.Visible; 106.} 107.private void ShowErrors(List<string> errorList) 108.{ 109.string nm = "Errors received: \n"; 110.foreach (string anerror in 
errorList) 111.nm += anerror + "\n"; 112.RadWindow.Alert(nm); 113.} 114.}
The first 39 lines should be pretty familiar, not doing anything too unorthodox to get this up and running. Once we hit the xAddEditButton_Click on line 40, we're still doing pretty much the same things, except that instead of checking the ValidationHelper errors, we both run a check on the current activeJob object and check the ValidationSummary errors list. Once that is set, we again use the callback of context.SubmitChanges (lines 68 and 78) to create an ActionHistory which we will use to track these items down the line. That's all? Essentially... yes. If you look back through this post, most of the code and adventures we have taken were just to get things working in the MVVM/Prism setup. Since I have the whole 'module' self-contained in a single JobView+code-behind setup, I don't have to worry about things like sending events off into space for someone to pick up, communicating through an Infrastructure project, or even re-inventing events to be used with attached behaviors. Everything just kinda works, and again with much less code. Here's a picture of the MVVM and code-behind versions of the Jobs and AddEdit views, but since the functionality is the same in both apps you still cannot tell them apart. Looking ahead, the Applicants module is effectively the same thing as the Jobs module, so most of the code is being cut-and-pasted back and forth with minor tweaks here and there. So that one is being taken care of by me behind the scenes. Next time, we get into a new world of fun: the interview scheduling module, which will pull from available jobs and applicants for each interview being scheduled, tying everything together with RadScheduler to the rescue.

    Read the article

  • Changing the action of a hyperlink in a Silverlight RichTextArea

    - by Marc Schluper
    The title of this post could also have been "Move over Hyperlink, here comes Actionlink" or "Creating interactive text in Silverlight." But alas, there can be only one. Hyperlinks are very useful. However, they are also limited because their action is fixed: browse to a URL. This may have been adequate at the start of the Internet, but nowadays, in web applications, the one thing we do not want to happen is a complete change of context. In applications we typically like a hyperlink selection to initiate an action that updates a part of the screen. For instance, if my application has a map displayed with some text next to it, the map would react to a selection of a hyperlink in the text, e.g. by zooming in on a location and displaying additional locational information in a popup. In this way, the text becomes interactive text. It is quite common that one company creates and maintains websites for many client companies. To keep maintenance cost low, it is important that the content of these websites can be updated by the client companies themselves, without the need to involve a software engineer. To accommodate this scenario, we want the author of the interactive text to configure all hyperlinks (without writing any code). In a Silverlight RichTextArea, the default action of a Hyperlink is the same as a traditional hyperlink, but it can be changed: if the Command property has a value then upon a click event this command is called with the value of the CommandParameter as parameter. How can we let the author of the text specify a command for each hyperlink in the text, and how can we let an application react properly to a hyperlink selection event? We are talking about any command here. Obviously, the application would recognize only a specific set of commands, with well defined parameters, but the approach we take here is generic in the sense that it pertains to the RichTextArea and any command. So what do we require? We wish that: As a text author, I can configure the action of a hyperlink in a (rich) text without writing code; As a text author, I can persist the action of a hyperlink with the text; As a reader of persisted text, I can click a hyperlink and the configured action will happen; As an application developer, I can configure a control to use my application specific commands. In an excellent introduction to the RichTextArea, John Papa shows (among other things) how to persist a text created using this control. To meet our requirements, we can create a subclass of RichTextArea that uses John's code and allows plugging in two command specific components: one to prompt for a command definition, and one to execute the command. Since both of these plugins are application specific, our RichTextArea subclass should not assume anything about them except their interface. public interface IDefineCommand { void Prompt(string content, // the link content Action<string, object> callback); // the method called to convey the link definition } public interface IPerformCommand : ICommand {} The IDefineCommand plugin receives the content of the link (the text visible to the reader) and displays some kind of control that allows the author to define the link. When that's done, this (possibly changed) content string is conveyed back to the RichTextArea, together with an object that defines the command to execute when the link is clicked by the reader of the published text. The IPerformCommand plugin simply implements System.Windows.Input.ICommand. Let's use MEF to load the proper plugins. 
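
To make that concrete, here is a minimal sketch of what the MEF wiring inside the RichTextArea subclass could look like; the class and property names are placeholders, not the article's actual code:

    using System.ComponentModel.Composition;

    public class CommandRichTextArea : RichTextArea   // hypothetical subclass name
    {
        [Import(AllowDefault = true)]
        public IDefineCommand DefineCommandPlugin { get; set; }

        [Import(AllowDefault = true)]
        public IPerformCommand PerformCommandPlugin { get; set; }

        public CommandRichTextArea()
        {
            // Pulls in whatever parts export IDefineCommand / IPerformCommand
            // from the application's composition container (Silverlight MEF).
            CompositionInitializer.SatisfyImports(this);
        }
    }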
In the example solution there is a project that contains rudimentary implementations of these. The IDefineCommand plugin simply prompts for a command string (cf. a command line or query string), and the IPerformCommand plugin displays a MessageBox showing this command string. An actual application using this extended RichTextArea would have its own set of commands, each having their own parameters, and hence would provide more user friendly application specific plugins. Nonetheless, in any case a command can be persisted as a string and hence the two interfaces defined above suffice. For a Visual Studio 2010 solution, see my article on The Code Project.

    Read the article

  • OpenGL ES Shader help (Blending)

    - by Chris
    Earlier I required assistance getting to grips with how to retain the alpha channel of a transparent texture in my colourised texture shader program. Whilst playing with that first version of my program (before obtaining the solution to my first requirement), I managed to enable transparency for the whole texture (effectively blending via GLSL), and I quite liked it. I would now like to know if and how it is possible to retain this blending effect on top of the existing output, without affecting the original alpha channel, as I don't know how to apply this transparency via the parameter that already carries the texture's alpha channel. A basic example of the blending program I am referring to (minus any other functionality) is as follows...

    varying vec2 texCoord;
    uniform sampler2D texSampler;

    void main() {
        gl_FragColor = vec4(texture2D(texSampler,texCoord).xyz,0.5);
    }

    Where 0.5 is the transparency (blending effect) of the whole texture. This is the current version of my program, which provides the ability to colour a texture according to the colour parameter passed to the program, and retains the alpha channel of the original texture.

    varying vec2 texCoord;
    uniform sampler2D texSampler;
    uniform vec3 colour;

    void main() {
        gl_FragColor = vec4(colour,1) * vec4(texture2D(texSampler,texCoord).xyz,texture2D(texSampler,texCoord).w);
    }

    I need to know if it is possible to apply transparency on top of this program without affecting the original alpha channel, which I have already preserved. I hope this makes enough sense. I am sure it is possible, and if so I should imagine it is rather simple, but this has me stumped. Any help much appreciated. Cheers, Chris
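
    For what it's worth, one straightforward way to get both effects is to pass the whole-texture blend factor as a separate uniform and multiply it into the texture's alpha, so the per-pixel alpha is scaled rather than replaced. A sketch (the uniform name "opacity" is an assumption, not from the original shaders):

    varying vec2 texCoord;
    uniform sampler2D texSampler;
    uniform vec3 colour;
    uniform float opacity;   // whole-texture blend factor, e.g. 0.5 (assumed name)

    void main() {
        vec4 tex = texture2D(texSampler, texCoord);
        // colourise the RGB, keep the texture's own alpha, then scale it by the global opacity
        gl_FragColor = vec4(colour * tex.xyz, tex.w * opacity);
    }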

    Read the article

  • Dynamically creating meta tags in asp.net mvc

    - by Jalpesh P. Vadgama
    As we all know, meta tags play a very important role in search engine optimization, and if we want our site listed with a good ranking on search engines then we have to provide them. Some time ago I blogged about dynamically creating meta tags in ASP.NET 2.0/3.5 sites; in this blog post I am going to explain how we can create meta tags dynamically in ASP.NET MVC very easily. To emit meta tags dynamically we have to build them on the server side, so I have created a method like the following.

    public string HomeMetaTags()
    {
        System.Text.StringBuilder strMetaTag = new System.Text.StringBuilder();
        strMetaTag.AppendFormat(@"<meta content='{0}' name='Keywords'/>", "Home Action Keyword");
        strMetaTag.AppendFormat(@"<meta content='{0}' name='Description'/>", "Home Description Keyword");
        return strMetaTag.ToString();
    }

    Here you can see that I have written a method which returns a string with meta tags. You can put any logic here: fetch the values from a database, or even from XML based on a key passed in. For demo purposes I have hardcoded them, so the method simply builds the meta tag string and returns it. Now I am going to store that meta tag string in the ViewBag, just like we already do with the title. In this post I am using the standard template, so we have our title in the ViewBag; in the same way, I save the meta tags in the ViewBag like the following.

    public ActionResult Index()
    {
        ViewBag.Message = "Welcome to ASP.NET MVC!";
        ViewBag.MetaTag = HomeMetaTags();
        return View();
    }

    In the above code you can see that I have stored the meta tags in ViewBag.MetaTag. As I am using the standard ASP.NET MVC 3 template, the head element lives in the Shared folder's _layout.cshtml file, so to render the meta tags I have modified the head section of _layout.cshtml like the following.

    <head>
        <title>@ViewBag.Title</title>
        <link href="@Url.Content("~/Content/Site.css")" rel="stylesheet" type="text/css" />
        <script src="@Url.Content("~/Scripts/jquery-1.5.1.min.js")" type="text/javascript"></script>
        @Html.Raw(ViewBag.MetaTag)
    </head>

    In the above code you can see that I have used the @Html.Raw method to embed the meta tags in the _layout.cshtml page. Html.Raw embeds the output in the head section without HTML-encoding it; as we have already taken care of the HTML in the string-building method, we don't need the encoding. Now it's time to run the application in the browser. Once you run your application and view the page source, you will find the meta tags for the home page in the head section. That's it. It's very easy to create meta tags dynamically. Hope you liked it. Stay tuned for more.. Till then, happy programming.
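
    As a side note, if you would rather not build the markup by string concatenation, ASP.NET MVC's TagBuilder can produce the same tags and takes care of attribute encoding. A small sketch of that alternative (not from the original post):

    public static string MetaTag(string name, string content)
    {
        var tag = new TagBuilder("meta");          // System.Web.Mvc.TagBuilder
        tag.MergeAttribute("name", name);
        tag.MergeAttribute("content", content);
        return tag.ToString(TagRenderMode.SelfClosing);
    }

    // e.g. strMetaTag.Append(MetaTag("Keywords", "Home Action Keyword"));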

    Read the article

  • Iterative Conversion

    - by stuart ramage
    Question Received: I am toying with the idea of migrating the current information first and the remainder of the history at a later date. I have heard that the conversion tool copes with this, but I haven't found any information on how it does so.

    Answer: The Toolkit will support iterative conversions as long as the original master data key tables (the CK_* tables) are not cleared down from Staging (the already converted Transactional Data would need to be cleared down) and the Production instance being migrated into is actually Production (we have migrated into a pre-prod instance in the past and then unloaded this and loaded it into the real PROD instance, but this will not work for your situation; you need to be migrating directly into your intended environment). In this case the migration tool will still know all about the original keys and the generated keys for the primary objects (Account, SA, etc.), and as such it will be able to link the data converted as part of a second pass onto these entities. It should be noted that this may result in the original opening balances potentially being displayed with an incorrect value (if we are talking about Financial Transactions), and also that care will have to be taken to ensure that all related objects are aligned (e.g. a Bill must have a set of bill segments, meter reads, and financial transactions, and these entities cannot exist independently). It should also be noted that subsequent runs of the conversion tool would need to be 'trimmed' to ensure that they only do work on the objects affected. You would not want to revalidate and migrate all Person, Account, SA, SA/SP, SP, and Premise details, since this information has already been processed, but you would definitely want to run the affected transactional record validation and keygen processes. There is no real "hard-and-fast" rule around this processing since it is specific to each implementation's needs, but the majority of the effort required should be detailed in the Conversion Tool section of the online help (under Administration / The Conversion Tool). The major rule is to ensure that you only run the migration steps and validation/keygen steps that you need, and do not do a complete rerun for your subsequent conversion.

    Read the article

  • Feasibility to take over a JavaMe Project by Coders who have no experience in JavaMe

    - by Stephenmjm
    The original JavaMe team will leave to work on other items, so the JavaMe project will be taken over by people who know nothing about JavaMe.

    Transition period: one month

    About this JavaMe project:
      about 3.5 million lines of code (more than 180 Java files, source code is 8.5KB in total)
      uses J2ME Polish and ProGuard
      documentation: the JavaMe project itself has no documentation and no UML diagrams

    Difficulties I guess:
      Getting familiar with JavaMe itself should be okay.
      In order to do further development we need to read the source code, and it's not easy to read 3.5 million lines of code without enough comments.
      Adaptation work for more than 100 phones.

    These are the questions, thank you!
      Given that our guys have no experience in JavaMe, is one month too hasty?
      In order to take over the job in time, what should we ask the original JavaMe team to do, considering we have no experience in JavaMe?
      What complications will we face taking on the adaptation work without the original JavaMe team?
      Any other suggestions?

    Read the article

  • Do MORE with WebCenter

    - by Michael Snow
    We've been extremely busy here on the Oracle WebCenter team. We hope that you've all been keeping up with the interesting news each week. Last week was jammed full of GartnerPCC and Gartner360 buzz. If you missed any of the highlights – be sure to check out both Kellsey's post from last week: Gartner PCC: A Shovel & Some Ah-Ha's and Christie's overview of Loren Weinberg's PCC presentation: "Here Today, Gone Tomorrow: Engage Your Customers or Lose Them". This week, we'll be focusing on "Doing More with WebCenter" leading up to a great webcast scheduled for Thursday, March 22 (invite and registration link below). This is the 2nd in a series of 3 webcasts dedicated to expanding the understanding of the full capabilities of WebCenter. Yes – that might mean that you are not getting the full benefits of the software you already own or the expansion potential via upgrade to the full WebCenter Suite Plus. Tune in on Thursday 10 a.m. PT / 1 p.m. ET.  ++++++++++++++ Want to be a Speaker at Oracle OpenWorld 2012? Oracle OpenWorld planning has already kicked off. We know that it is only March and next October is far in the distance, but planning has already started for Oracle OpenWorld 2012. So if you want to be a speaker and propose your own session for this year's event in San Francisco on September 30th - October 4th, start thinking now!  The annual OpenWorld Call for Papers is now open until April 9th! All of the details to submit a paper are available here. Of course, the WebCenter team here is interested in sessions including case studies, thought-leadership, and customer stories around any of the Oracle WebCenter solutions, but the Call for Papers is open to all Oracle topics. When submitting your topic, be sure to describe what you plan to discuss and the value of the presentation to other attendees. Sell your session, because there will be a lot of competition to be selected.  Bonus News: Speakers for selected sessions receive a complimentary full conference pass! Get your papers in and we'll see you in San Francisco! ~~~~~~~~~~~~~~~~~~~~~~ Webcast Series: Do More with Oracle WebCenter - Expand Beyond Content Management Enable Employees, Partners, and Customers to Do More with Your Content Did you know that, in addition to content management, Oracle WebCenter now also includes comprehensive portal, composite application, collaboration, and Web experience management capabilities? Join us for this Webcast and learn how you can provide a new level of user engagement. Learn how Oracle WebCenter: Drives task-specific application data and content to a single screen for executing specific business processes Enables mixed internal and external environments where content can be securely shared and filtered with employees, partners, and customers, based upon role-based security Offers Web experience management, driving contextually relevant, social, and interactive online experiences across multiple channels Provides social features that enable sharing, activity feeds, collaboration, expertise location, and best-practices communities Learn how to do more with Oracle WebCenter. Register now for the Webcast. Register Now Join us for the second Webcast in the series "Do More With Oracle WebCenter". March 22, 2012 10 a.m. PT / 1 p.m. ET Presented by: Michelle Huff, Senior Director, WebCenter Product Management, Oracle and Greg Utecht, Project Manager, IT Operations, TIES

    Read the article

  • UPK Hands-on Labs at OHUG

    - by Karen Rihs
    Going to OHUG, June 18-22? Be sure to attend one or more UPK hands-on labs! Choose from Basic, Advanced, What's New, and Prebuilt Content!   Oracle User Productivity Kit 11.1 Workshop – Basic Stephen Armbruster, Oracle Corporation June 19, 2012, 11:00 a.m. – 12:00 p.m. June 20, 2012, 4:30 – 5:30 p.m. The User Productivity Kit (UPK) is a comprehensive, cost-effective, customizable solution that helps your organization quickly create the critical documentation, training, and support materials needed to drive project team and user productivity throughout the lifecycle of your software. The User Productivity Kit provides system process documentation, user acceptance test scripts, comprehensive instructor-led training materials, web-based training materials, role-based performance support, and complete documentation. Also provided is the UPK Developer, which serves as a single-source development and customization tool to enable rapid content creation and customization. The User Productivity Kit delivers: Business process documentation for fit-gap analysis - providing time and cost savings that jump-start your implementation or upgrade User Acceptance test scripts to help test applications prior to go-live State-of-the-art instructional design tools to rapidly build and tailor documentation, instructor-led training materials, and web-based training to fit organizational needs Live-application performance support with transactional and procedural information to maximize user efficiency. By registering for this hands-on UPK workshop, participants will use UPK to build an application job aid and simulation that can be used as performance support for the application. But hurry, space is limited! Oracle User Productivity Kit 11.1 Workshop – Advanced Stephen Armbruster, Oracle Corporation June 20, 2012, 1:30 – 2:30 p.m. This special workshop is for those already familiar with UPK and will cover advanced concepts. In this workshop, you will gain an in-depth knowledge of working with the UPK Developer. Following this workshop, you will be able to: Create publishing categories Add a logo to a publishing project Publish using the newly created category Configure your own library view Manage topic history in a multi-user environment Oracle User Productivity Kit 11.1 Workshop – What’s NEW! Stephen Armbruster, Oracle Corporation June 19, 2012, 1:30 – 2:30 p.m. June 21, 2012, 1:00 – 2:00 p.m. This special workshop is for those already familiar with UPK and will focus on the new features included in the latest version 11.1. In this workshop, you will review most of the new features included in the UPK Developer. Oracle User Productivity Kit 11.1 Workshop – Prebuilt Content Stephen Armbruster, Oracle Corporation June 19, 2012, 4:30 – 5:30 p.m. June 21, 2012, 2:15 – 3:15 p.m. This special workshop is for those already familiar with UPK and will focus on the latest version 11.1. At the end of this workshop, you will be able to demonstrate how to: Import prebuilt content Modify content frames Add a decision frame Translate a topic into Spanish Stephen Armbruster is a principal sales consultant, specializing in HCM and UPK applications for Oracle over the past twelve years. In addition to his current role, he serves as an ambassador for the Fusion User Experience (UX) team and is tasked with evangelizing the UX for end users across all Oracle brands (Fusion, PSFT, JDE, and EBS).  He is also a trusted advisor to Oracle’s Product Management teams related to Learning Management Systems (LMS). 
Prior to joining Oracle, he was an instructor as well as an instructional technologist working in the medical diagnostics, high tech, and information management industries. As an expert in both LMS and UPK, he regularly speaks at Oracle conferences including Oracle OpenWorld and OHUG on topics that span using Oracle solutions to accomplish employee training, certification, and user adoption. His presentations are both entertaining and engaging.

    Read the article

  • Taking web sites offline for demonstration

    While working in software development in general, and in web development for a couple of customers it is quite common that it is necessary to provide a test bed where the client is able to get an image, or better said, a feeling for the visions and ideas you are talking about. Usually here at IOS Indian Ocean Software Ltd. we set up a demo web site on one of our staging servers, and provide credentials to the customer to access and review our progress and work ad hoc. This gives us the highest flexibility on both sides, as the test bed is simply online and available 24/7. We can update the structure, the UI and data at any time, and the client is able to view it as it suits best for her/him. Limited or lack of online connectivity But what is going to happen when your client is not capable to be online - no matter for what reasons; here are some more obvious ones: No internet connection (permanently or temporarily) Expensive connection, ie. mobile data package, stay at a hotel, etc. Presentation devices at an exhibition, ie. using tablets or iPads Being abroad for a certain time, and only occasionally online No network coverage, especially on mobile Bad infrastructure, like ie. in Third World countries Providing a catalogue on CD or USB pen drive Anyway, it doesn't matter really. We should be able to provide a solution for the circumstances of our customers. Presentation during an exhibition Recently, we had the following request from a customer: Is it possible to let us have a desktop version of ResortWork.co.uk that we can use for demo purposes at the forthcoming Ski Shows? It would allow us to let stand visitors browse the sites on an iPad to view jobs and training directory course listings. Yes, sure we can do that. Eventually, you might think why don't they simply use 3G enabled iPads for that purpose? As stated above, there might be several reasons for that - low coverage, expensive data packages, etc. Anyway, it is not a question on how to circumvent the request but to deliver a solution to that. Possible solutions... or not? We already did offline websites earlier, and even established complete mirrors of one or two web sites on our systems. There are actually several possibilities to handle this kind of request, and it mainly depends on the system or device where the offline site should be available on. Here, it is clearly expressed that we have to address this on an Apple iPad, well actually, I think that they'd like to use multiple devices during their exhibitions. Following is an overview of possible solutions depending on the technology or device in use, and how it can be done: Replication of source files and database The above mentioned web site is running on ASP.NET, IIS and SQL Server. In case that a laptop or slate runs a Windows OS, the easiest way would be to take a snapshot of the source files and database, and transfer them as local installation to those Windows machines. This approach would be fully operational on the local machine. Saving pages for offline usage This is actually a quite tedious job but still practicable for small web sites Tool based approach to 'harvest' the web site There quite some tools in the wild that could handle this job, namely wget, httrack, web copier, etc. Screenshots bundled as PDF document Not really... ;-) Creating screencast or video Simply navigate through your website and record your desktop session. 
Actually, we are using this kind of approach to track down difficult problems in order to see and understand exactly what the user was doing to cause an error. Of course, this list isn't complete and I'd love to get more of your ideas in the comments section below the article. Preparations for offline browsing The original website is dynamically and data-driven by ASP.NET, and looks like this: As we have to put the result onto iPads we are going to choose the tool-based approach to 'download' the whole web site for offline usage. Again, depending on the complexity of your web site you might have to check which of the applications produces the best results for you. My usual choice is to use wget but in this case, we run into problems related to the rewriting of hyperlinks. As a consequence of that we opted for using HTTrack. HTTrack comes in different flavours, like console application but also as either GUI (WinHTTrack on Windows) or Web client (WebHTTrack on Linux/Unix/BSD). Here's a brief description taken from the original website about HTTrack: HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility. It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link, as if you were viewing it online. And there is an extensive documentation for all options and switches online. General recommendation is to go through the HTTrack Users Guide By Fred Cohen. It covers all the initial steps you need to get up and running. Be aware that it will take quite some time to get all the necessary resources down to your machine. Actually, for our customer we run the tool directly on their web server to avoid unnecessary traffic and bandwidth. After a couple of runs and some additional fine-tuning - explicit inclusion or exclusion of various external linked web sites - we finally had a more or less complete offline version available. A very handsome feature of HTTrack is the error/warning log after completing the download. It contains some detailed information about errors that appeared on the pages and the links within the pages that have been processed. Error: "Bad Request" (400) at link www.resortwork.co.uk/job-details_Ski_hire:tech_or_mgr_or_driver_37854.aspx (from www.resortwork.co.uk/Jobs_A_to_Z.aspx)Error: "Not Found" (404) at link www.247recruit.net/images/applynow.png (from www.247recruit.net/css/global.css)Error: "Not Found" (404) at link www.247recruit.net/activate.html (from www.247recruit.net/247recruit_tefl_jobs_network.html) In our situation, we took the records of HTTP 400/404 errors and passed them to the web development department. Improvements are to be expected soon. ;-) Quality assurance on the full-featured desktop Unfortunately, the generated output of HTTrack was still incomplete but luckily there were only images missing. Being directly on the web server we simply copied the missing images from the original source folder into our offline version. After that, we created an archive and transferred the file securely to our local workspace for further review and checks. 
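For readers who want to reproduce the download step, a console invocation along the following lines is a reasonable starting point; the output directory and filters are illustrative assumptions rather than the exact command used on the customer's server.

# Mirror the site into ./offline, stay on the main domain,
# and explicitly allow the external asset host seen in the error log above.
httrack "http://www.resortwork.co.uk/" -O ./offline \
        "+*.resortwork.co.uk/*" "+*.247recruit.net/*" -v

The same filters can be tightened or extended run by run, which is essentially the fine-tuning described above.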
From that point on, it wasn't necessary to get any more files from the original web server, and we could focus ourselves completely on the process of browsing and navigating through the offline version to isolate visual differences and functional problems. As said, the original web site runs on ASP.NET Web Forms and uses Postback calls for interaction like search, pagination and partly for navigation. This is the main field of improving the offline experience. Of course, same as for standard web development it is advised to test with various browsers, and strangely we discovered that the offline version looked pretty good on Firefox, Chrome and Safari, but not in Internet Explorer. A quick look at the HTML source shed some light on this, and there are conditional CSS inclusions based on the user agent. HTTrack is not acting as Internet Explorer and so we didn't have the necessary overrides for this browser. Not problematic after all in our case, but you might have to pay attention to this and get the IE-specific files explicitly. And while having a view at the source code, we also found out that HTTrack actually modifies the generated HTML output. In several occasions we discovered that <div> elements were converted into <table> constructs for no obvious reason; even nested structures. Search 'e'nd destroy - sed (or Notepad++) to the rescue During our intensive root cause analysis for a couple of HTML/CSS problems that needed some extra attention it is very helpful to be familiar with any editor that allows search and replace over multiple files like, ie. sed - stream editor for filtering and transforming text on Linux or my personal favourite Notepad++ on Windows. This allowed us to quickly fix a lot of anchors with onclick attributes and Javascript code that was addressed to ASP.NET files instead of their generated HTML counterparts, like so: grep -lr -e '.aspx' * | xargs sed -e 's/.aspx/.html?/g' The additional question mark after the HTML extension helps to separate the query string from the actual target and solved all our missing hyperlinks very fast. The same can be done in Notepad++ on Windows, too. Just use the 'Replace in files' feature and you are settled. Especially, in combination with Regular Expressions (regex). Landscape of browsers Okay, after several runs of HTML/CSS code analysis, searching and replacing some strings in a pool of more than 4.000 files, we finally had a very good match of an offline browsing experience in Firefox and Chrome on Linux. Next, we transferred that modified set of files to a Windows 8 machine for review on Firefox, Chrome and Internet Explorer 7 to 10, and a Mac mini running Mac OS X 10.7 to check the output on Safari and again on Chrome. Besides IE, for reasons already mentioned above, the results were identical. And last but not least it was about to check web site on tablets. Please continue to read on the following articles: Taking web sites offline for demonstration on Galaxy Tablet Taking web sites offline for demonstration on iPad

    Read the article

  • case-specific mod rewrite on Wordpress subdomain multisite

    - by Steve
    I have split a Wordpress blog into multiple category-specific blogs using subdomains, as the topics in the original blog were too broad to be lumped together effectively. Posts were exported from the parent www blog and imported into the subject-specific subdomain blogs. I believe .htaccess provides mod rewrite for all subdomains (including the original www) in a single .htaccess file. I use .htaccess to perform a 301 redirect on post categories to the relevant post on the subdomain's blog, e.g.: RedirectMatch 301 ^/auto/(.*)$ http://auto.example.com/$1 The problem I have is that the category has been retained in the permalink structure in the subdomain blog, so that www.example.com/auto/mercedes is now auto.example.com/auto/mercedes. The 1st URL is redirected to the 2nd, but unfortunately, the 2nd URL is redirected to auto.example.com/mercedes using the same rewrite rule, which is not found, as the permalink on the subdomain's blog retains the parent category of auto. The solution would be to adjust the permalink structure in the subdomain's WP settings, so that the top level category does not duplicate the subdomain. My question would be: how do I then strip a section of the original (www) blog's post URL from the subdomain's URL when redirecting? e.g.: How do I redirect www.example.com/auto/mercedes to auto.example.com/mercedes? I'm assuming this would be a regular expression trick, which I am not great at. Update: I might have to use: RewriteCond %{HTTP_HOST} !auto.example.com$ in the default Wordpress if loop in .htaccess, and separate my custom subdomain redirections into a second if loop section.
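One possible direction for that last redirect, in line with the RewriteCond idea in the update: constrain the rewrite to the www host so it never fires on the subdomain. The host names and the /auto prefix are taken from the question; treat this as a sketch to adapt per category, placed above the standard WordPress rules, rather than a tested rule set.

# Only redirect when the request arrives on the www host,
# so auto.example.com/auto/... is left untouched.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^auto/(.*)$ http://auto.example.com/$1 [R=301,L]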

    Read the article

  • Cutting Subscriber Churn with Media Intelligence

    - by Oracle M&E
    There's lots of talk in media and entertainment companies about using "big data".  But it's often hard to see through the hype and understand how big data brings benefits in the real world.  How about being able to predict with 92% accuracy which subscribers intend to cancel their subscription - and put in place a renewal strategy to dramatically reduce that churn?  That's what Belgian media company De Persgroep has achieved with Oracle's Media Intelligence solution.  "One of the areas in which we're able to achieve beautiful results using big data is the churn prediction," De Persgroep's CIO Luc Verbist explains in a new Oracle video.  "Based on all the data that we collect on websites and all your behavior, payment behavior and so on, we're able to make a prediction model, which, with an accuracy of 92 percent, is able to predict that you probably won't renew your newspaper, anymore. So our approach to renewal is completely different to the people in that segment than towards the other people. And this has brought us a lot of value and a lot of customers who didn't stop their newspaper where else they would have done so." De Persgroep is using Oracle's Big Data Appliance, along with software from Oracle partner NGDATA to build up a detailed "DNA profile" of each individual customer, based on every interaction, in real time.  This means that any change in behavior - a drop in content consumption, a late subscription payment, a negative social media comment - is captured.  Applying advanced data modeling techniques automatically converts those raw interactions into data with real business meaning - like that customer's risk of churning. The very same data profile - comprising hundreds if individual dimensions - can simultaneously drive targeted marketing campaigns - informing audience about new content that's most relevant and encouraging them to subscribe.  It can power content recommendations and personalization right in the content sites and apps. And it can link directly into digital advertising networks via platforms like Oracle's BlueKai data management platform (DMP), to drive increased advertising CPMs. Using Oracle's Media Intelligence solution enables this across De Persgroep's business - comprising eight newspapers and 25 magazines published in Belgium and The Netherlands, and digital properties including websites with 6m daily unique visitors, along with TV and radio stations. "The company strategy is in fact a customer-centric strategy, so we want to get a 360-view about our customers, about our prospects. And the big data project helped us to achieve that goal," says Verbist. Using Oracle's Big Data Appliance to underpin the solution created huge savings.   "The selection of the Big Data Appliance was quite easy.  It was very quick to install, very easy to install, as well. And it was far cheaper than building our own Hadoop cluster. So it was in fact a non-brainer," Verbist explains. 
Applying Media Intelligence approach has yielded incredible results for De Persgroep, including: Improved products - with a new understanding of how readers are consuming print and digital content across the day Improved customer segmentation - driving a 6X improvement in customer prospecting and acquisition when contacting a specific segment Having the project up and running in three months And that has led to competitive benefits for De Persgroep, as Luc Verbist explains: "one of the results we saw since we started using big data is that we're able to increase the gap between we as the market leader, and the second [by] more than 20 percent."

    Read the article

  • Best of OTN - Week of Oct 21st

    - by CassandraClark-OTN
    This week's Best of OTN, for you, the best devs, dba's, sysadmins and architects out there!  In these weekly posts the OTN team will highlight the top content from each community: Architect, Database, Systems and Java.  Since we'll be publishing this on Fridays, we'll also mix in a little fun! Architect Community Top Content - The Road Ahead for WebLogic 12c | Edwin Biemond Oracle ACE Edwin Biemond shares his thoughts on announced new features in Oracle WebLogic 12.1.3 & 12.1.4 and compares those upcoming releases to Oracle WebLogic 12.1.2. A Roadmap for SOA Development and Delivery | Mark Nelson Do you know the way to S-O-A? Mark Nelson does. His latest blog post, part of an ongoing series, will help to keep you from getting lost along the way. Updated ODI Statement of Direction | Robert Schweighardt Heads up Oracle Data Integrator fans! A new statement of product direction document is available, offering an overview of the strategic product plans for Oracle's data integration products for bulk data movement and transformation, specifically Oracle Data Integrator (ODI) and Oracle Warehouse Builder (OWB). Bob Rhubart, Architect Community Manager Friday Funny - "Some people approach every problem with an open mouth." — Adlai E. Stevenson (October 23, 1835 – June 14, 1914), 23rd Vice President of the United States Database Community Top Content - Pre-Built Developer VMs (for Oracle VM VirtualBox) Heard all the chatter about Oracle VirtualBox? Over 1 million downloads per week and look: pre-built virtual appliances designed specifically for developers. Video: Big Data, or BIG DATA? Oracle ACE Director Ben Prusinski explains the differences. Webcast Series - Developing Applications in Oracle's Public Cloud Time to get started on developing and deploying cloud applications by moving to the cloud. Good friend Gene Eun from Oracle's Cloud team posted this two-part Webcast series that has an overview and demonstration of the Oracle Database Cloud Service. Check out the demos on how to migrate your data to the cloud, extend your application with interactive reporting, and create and access RESTful Web services. Registration required, but so worth it! Laura Ramsey, Database Community Manager Friday Funny - Systems Community Top Content - Video: What Kind of Scalability is Better, Horizontal or Vertical? Rick Ramsey asks the question "Is Oracle's approach to large vertically scaled servers at odds with today's trend of combining lots and lots of small, low-cost server systems with networking to build a cloud, or is it a better approach?" Michael Palmeter, Director of Solaris Product Management, and Renato Ribeiro, Director of Product Management for SPARC Servers, discuss. Video: An Engineer Takes a Minute to Explain Cloud Bart Smaalders, long-time Oracle Solaris core engineer, takes a minute to explain cloud from a sysadmin point of view. Hands-On Lab: How to Deploy and Manage a Private IaaS Cloud Soup to nuts. This lab shows you how to set up and manage a private cloud with Oracle Enterprise Manager Cloud Control 12c in an Infrastructure as a Service (IaaS) model. You will first configure the IaaS cloud as the cloud administrator and then deploy guest virtual machines (VMs) as a self-service user. 
Rick Ramsey, Systems Community Manager Friday Funny - Video: Drunk Airline Pilot - Dean Martin - Foster Brooks Java Community Top Content - Video: NightHacking Interview with James Gosling James Gosling, the Father of Java, discusses robotics, Java and how to keep his autonomous WaveGliders in the ocean for weeks at a time. Live from Hawaii.  Video: Raspberry Pi Developer Challenge: Remote Controller A developer who knew nothing about Java Embedded or Raspberry Pi shows how he can now control a robot with his phone. The project was built during the Java Embedded Challenge for Raspberry Pi at JavaOne 2013. Java EE 7 Certification Survey - Participants Needed Help us define how to serve your training and certification needs for Java EE 7. Tori Wieldt, Java Community Manager Friday Funny - Programmers have a strong sensitivity to Yak's pheromone. Causes irresistible desire to shave said Yak. Thanks, @rickasaurus! To follow and take part in the conversation follow/like etc. at one or all of the resources below -  OTN TechBlog The Java Source Blog The OTN Garage Blog The OTN ArchBeat Blog @oracletechnet @java @OTN_Garage @OTNArchBeat @OracleDBDev OTN I Love Java OTN Garage OTN ArchBeat Oracle DB Dev OTN Java

    Read the article

  • Mass Metadata Updates with Folders

    - by Kyle Hatlestad
    With the release of WebCenter Content PS5, a new folder architecture called 'Framework Folders' was introduced.  This is meant to replace the folder architecture of 'Folders_g'.  While the concepts of a folder structure and access to those folders through Desktop Integration Suite remain the same, the underlying architecture of the component has been completely rewritten.  One of the main goals of the new folders is to scale better at large volumes and remove the limitations of 1000 content items or sub-folders within a folder.  Along with the new architecture, it has a new look and a few additional features have been added.  One of those features are Query Folders.  These are folders that are populated simply by a query rather then literally putting items within the folders.  This is something that the Library has provided, but it always took an administrator to define them through the Web Layout Editor.  Now users can quickly define query folders anywhere within the standard folder hierarchy.   Within this new Framework Folders is the very handy ability to do metadata updates.  It's similar to the Propagate feature in Folders_g, but there are some key differences that make this very flexible and much more powerful. It's used within regular folders and Query Folders.  So the content you're updating doesn't all have to be in the same folder...or a folder at all.   The user decides what metadata to propagate.  In Folders_g, the system administrator controls which fields will be propagated using a single administration page.  In Framework Folders, the user decides at that time which fields they want to update. You set the value you want on the propagation screen.  In Folders_g, it used the metadata defined on the parent folder to propagate.  With Framework Folders, you supply the new metadata value when you select the fields you want to update.  It does not have to be defined on the parent folder. Because of these differences, I think the new propagate method is much more useful.  Instead of always having to rely on Archiver or a custom spreadsheet, you can quickly do mass metadata updates right within folders.   Here are the basic steps to perform propagation. First create a folder for the propagation.  You can use a regular folder, but a Query Folder will work as well. Go into the folder to get the results.   In the Edit menu, select 'Propagate'. Select the check-box next to the field to update and enter the new value  Click the Propagate button. Once complete, a dialog will appear showing it is complete What's also nice is that the process happens asynchronously in the background which means you can browse to other pages and do other things while it is still working.  You aren't stuck on the page waiting for it to complete.  In addition, you can add a configuration flag to the server to turn on a status indicator icon.  Set 'FldEnableInProcessIndicator=1' and it will show a working icon as its doing the propagation. There is a caveat when using the propagation on a Query Folder.   While a propagation on a regular folder will update all of the items within that folder, a Query Folder propagation will only update the first 50 items.  So you may need to run it multiple times depending on the size...and have the query exclude the items as they get updated. One extra note...Framework Folders is offered as the default folder architecture in the PS5 release of WebCenter Content.  
But if you are using WebCenter Content integrated with another product that makes use of folders (WebCenter Portal/Spaces, Fusion Applications, Primavera, etc), you'll need to continue using Folders_g until they are updated to use the new folders.
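As a side note on the status indicator flag mentioned earlier: on a typical WebCenter Content install such settings are usually added to the Content Server's config.cfg (or through the Admin Server's General Configuration page) and require a restart. The path below varies by installation and is given as an assumption only.

# <domain_home>/ucm/cs/config/config.cfg -- restart the Content Server afterwards
FldEnableInProcessIndicator=1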

    Read the article

  • Pages in IE render differently when served through the ASP.NET Development server and Production Ser

    - by rajbk
    You see differences in the way IE renders your web application locally on the ASP.NET Development Server compared to your production server. Comparing the responses from both servers, including response headers and CSS, shows no difference. The issue may occur because of a setting in IE. In IE, go to Tools –> Compatibility View Settings. When the checkbox "Display intranet sites in Compatibility View" is turned on, it forces IE8 to display the web application content in a way similar to how Internet Explorer 7 handles standards mode web pages. Since your local web server is considered to be in the intranet zone, IE uses "Compatibility View" to render your pages. While you could uncheck this setting in IE or propagate the change to all developers through group policy settings, a different way is described below. To force IE to mimic the behavior of a certain version of IE when rendering the pages, you use the meta element to include an "X-UA-Compatible" http-equiv header in your web page, or have it sent as part of the response headers by adding it to your web.config file. The values are listed below: <meta http-equiv="X-UA-Compatible" content="IE=4"> <!-- IE5 mode --> <meta http-equiv="X-UA-Compatible" content="IE=7.5"> <!-- IE7 mode --> <meta http-equiv="X-UA-Compatible" content="IE=100"> <!-- IE8 mode --> <meta http-equiv="X-UA-Compatible" content="IE=a"> <!-- IE5 mode --> This value can also be set in web.config like so: <?xml version="1.0" encoding="utf-8"?> <configuration> <system.webServer> <httpProtocol> <customHeaders> <clear /> <add name="X-UA-Compatible" value="IE=EmulateIE7" /> </customHeaders> </httpProtocol> </system.webServer> </configuration> The setting can also be added in the IIS metabase as described here. Similarly, you can do the same in Apache by adding the directive in httpd.conf: <Location /store> Header set X-UA-Compatible "IE=EmulateIE7" </Location> Even though it can be done at a site level, I recommend you do it at a per-application level to avoid confusing the developer. References: Defining Document Compatibility, Implementing the META Switch on IIS, Implementing the META Switch on Apache

    Read the article

  • Refactoring this code that produces a reverse-lookup hash from another hash

    - by Frank Joseph Mattia
    This code is based on the idea of a Form Object http://blog.codeclimate.com/blog/2012/10/17/7-ways-to-decompose-fat-activerecord-models/ (see #3 if unfamiliar with the concept). My actual code in question may be found here: https://gist.github.com/frankjmattia/82a9945f30bde29eba88 The code takes a hash of objects/attributes and creates a reverse lookup hash to keep track of their delegations, which are set up like this: delegate :first_name, :email, to: :user, prefix: true But I am manually creating the delegations from a hash like this: DELEGATIONS = { user: [ :first_name, :email ] } At runtime when I want to look up the translated attribute names for the objects, all I have to go on are the delegated/prefixed attribute names like :user_first_name (I have to use a prefix to avoid naming collisions), which aren't in sync with the Rails i18n way of doing it: en: activerecord: attributes: user: email: 'Email Address' The code I have takes the above delegations hash and turns it into a lookup table, so that when I override human_attribute_name I can get back the original attribute name and its class. Then I send #human_attribute_name to the original class with the original attribute name as its argument. The code I've come up with works but it is ugly to say the least. I've never really used #inject, so this was a crash course for me, and I am quite unsure whether this code is an effective way of solving my problem. Could someone recommend a simpler solution that does not require a reverse lookup table, or does that seem like the right way to go? Thanks, - FJM
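As one possible direction (a sketch, not a drop-in replacement for the gist): each_with_object tends to read more clearly than inject for building the reverse lookup, and the prefixed key can be derived rather than hand-maintained. The constant names mirror the question; the integration with human_attribute_name is only hinted at in the comments.

DELEGATIONS = { user: [ :first_name, :email ] }

# Maps the prefixed attribute name back to [model, original_attribute],
# e.g. :user_first_name => [:user, :first_name]
REVERSE_LOOKUP = DELEGATIONS.each_with_object({}) do |(model, attrs), lookup|
  attrs.each { |attr| lookup[:"#{model}_#{attr}"] = [model, attr] }
end

# Inside the overridden human_attribute_name, something like:
#   model, attr = REVERSE_LOOKUP[attribute.to_sym]
#   model.to_s.classify.constantize.human_attribute_name(attr)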

    Read the article

  • WPF: Running code when Window rendering is completed

    - by Ilya Verbitskiy
    WPF is full of surprises. It makes complicated tasks easier, but at the same time overcomplicates easy  task as well. A good example of such overcomplicated things is how to run code when you’re sure that window rendering is completed. Window Loaded event does not always work, because controls might be still rendered. I had this issue working with Infragistics XamDockManager. It continued rendering widgets even when the Window Loaded event had been raised. Unfortunately there is not any “official” solution for this problem. But there is a trick. You can execute your code asynchronously using Dispatcher class.   Dispatcher.BeginInvoke(new Action(() => Trace.WriteLine("DONE!", "Rendering")), DispatcherPriority.ContextIdle, null);   This code should be added to your Window Loaded event handler. It is executed when all controls inside your window are rendered. I created a small application to prove this idea. The application has one window with a few buttons. Each button logs when it has changed its actual size. It also logs when Window Loaded event is raised, and, finally, when rendering is completed. Window’s layout is straightforward.   1: <Window x:Class="OnRendered.MainWindow" 2: xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" 3: xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" 4: Title="Run the code when window rendering is completed." Height="350" Width="525" 5: Loaded="OnWindowLoaded"> 6: <Window.Resources> 7: <Style TargetType="{x:Type Button}"> 8: <Setter Property="Padding" Value="7" /> 9: <Setter Property="Margin" Value="5" /> 10: <Setter Property="HorizontalAlignment" Value="Center" /> 11: <Setter Property="VerticalAlignment" Value="Center" /> 12: </Style> 13: </Window.Resources> 14: <StackPanel> 15: <Button x:Name="Button1" Content="Button 1" SizeChanged="OnSizeChanged" /> 16: <Button x:Name="Button2" Content="Button 2" SizeChanged="OnSizeChanged" /> 17: <Button x:Name="Button3" Content="Button 3" SizeChanged="OnSizeChanged" /> 18: <Button x:Name="Button4" Content="Button 4" SizeChanged="OnSizeChanged" /> 19: <Button x:Name="Button5" Content="Button 5" SizeChanged="OnSizeChanged" /> 20: </StackPanel> 21: </Window>   SizeChanged event handler simply traces that the event has happened.   1: private void OnSizeChanged(object sender, SizeChangedEventArgs e) 2: { 3: Button button = (Button)sender; 4: Trace.WriteLine("Size has been changed", button.Name); 5: }   Window Loaded event handler is slightly more interesting. First it scheduler the code to be executed using Dispatcher class, and then logs the event.   1: private void OnWindowLoaded(object sender, RoutedEventArgs e) 2: { 3: Dispatcher.BeginInvoke(new Action(() => Trace.WriteLine("DONE!", "Rendering")), DispatcherPriority.ContextIdle, null); 4: Trace.WriteLine("Loaded", "Window"); 5: }   As the result I had seen these trace messages.   1: Button5: Size has been changed 2: Button4: Size has been changed 3: Button3: Size has been changed 4: Button2: Size has been changed 5: Button1: Size has been changed 6: Window: Loaded 7: Rendering: DONE!   You can find the solution in GitHub.
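As a side note that is not part of the original post: WPF also raises Window.ContentRendered after the window's content has been rendered for the first time, which is worth trying before reaching for the Dispatcher. With third-party controls such as the dock manager mentioned above it may still fire too early, which is presumably why the Dispatcher trick was needed. A minimal sketch:

// Requires: using System.Diagnostics; using System.Windows;
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // Raised once, after the window's content has been rendered for the first time.
        ContentRendered += (sender, args) => Trace.WriteLine("ContentRendered raised", "Window");
    }
}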

    Read the article

  • Update Since Microsoft/PSC Office Open XML Case Study

    - by Tim Murphy
    In 2009 Microsoft released a case study about a project that we had done using the OOXML SDK 1.0 for Research Directors Inc.  Since that time Microsoft has released version 2.0 of the SDK and PSC has done significant development with it.  Below are some of the milestones we have reached since the original case study. At the time of the original case study two report types had been automated to output as PowerPoint presentations.  Now that all the main products have been delivered, we have added three reports with Word document outputs and five more reports with PowerPoint outputs. One improvement we made over the original application was to create a PowerPoint Add-In which allows the users to tag a slide.  These tags, along with the strongly typed SDK 2.0, allow the code to use LINQ to easily search for slides in the template files.  This allows for a more flexible architecture based on assembling a presentation from copied slides extracted from the templates. The new library we created also enabled us to create two new Word based reports in two weeks.  The library we created abstracts the generation of the documents from the business logic and the data retrieval.  The key to this is the markup.  Content Controls are a good method for identifying sections of a template to be modified or replaced.  Join this with the concept of all data being generically either scalar or two dimensional and the code becomes more generic. In the end we found the OOXML SDK 2.0 to be a great tool for accelerating document generation development and creating happy clients.  del.icio.us Tags: PSC Group,OOXML,Case Study,Office Open XML,Word,PowerPoint
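To give a flavour of the Content Control approach described above, here is a minimal sketch using the strongly typed SDK 2.0 classes. The file name, tag value and replacement text are assumptions for illustration; the real library abstracts this behind the generic scalar/two-dimensional data model mentioned in the post.

// Requires the DocumentFormat.OpenXml (SDK 2.0) assemblies.
// using System.Linq;
// using DocumentFormat.OpenXml.Packaging;
// using DocumentFormat.OpenXml.Wordprocessing;
using (var doc = WordprocessingDocument.Open("template.docx", true))
{
    // Find the content control (w:sdt) whose tag identifies the section to replace.
    var sdt = doc.MainDocumentPart.Document.Body
                 .Descendants<SdtElement>()
                 .FirstOrDefault(e => e.SdtProperties != null &&
                                      e.SdtProperties.GetFirstChild<Tag>() != null &&
                                      e.SdtProperties.GetFirstChild<Tag>().Val == "ReportTitle");

    if (sdt != null)
    {
        var text = sdt.Descendants<Text>().FirstOrDefault();
        if (text != null) text.Text = "Quarterly Report";
    }

    doc.MainDocumentPart.Document.Save();
}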

    Read the article

  • Where Twitter Stands Heading Into 2013

    - by Mike Stiles
    As Twitter continued throughout 2012 to be a stage on which global politics and culture played itself out, the company itself underwent some adjustments that give us a good indication of what users and brands can expect from the platform in 2013. The power of the network did anything but fade. Celebrities continued to use it to connect one-on-one. Even the Pope signed on this year. It continued to fuel revolutions. It played an exponentially large factor in this US Presidential election. And around the world, the freedom to speak was challenged as users were fired, sued, sometimes even jailed for their tweets. Expect more of the same in 2013, as Twitter has entrenched itself, for individuals, causes and brands, as the fastest, easiest, most efficient way to message the masses so some measure of impact can come from it. It’s changed everything, and it’s not finished. These fun facts reveal the position of strength with which Twitter enters 2013: It now generates a billion tweets every 2.5 days It has 500 million+ users The average Twitter user has tweeted 307 times 32% of everyone using the Internet uses Twitter It’s expected to bring in $540 million in ad revenue by 2014 11 new accounts are created every second High-level Executive Summary: people love it, people use it, and they’re going to keep loving and using it. Whether or not outside developers love it is a different matter. 2012 marked a shift from welcoming the third party support that played at least some role in Twitter being so warmly embraced, to discouraging anything that replicates what Twitter can do itself…or plans to do itself. It’s not the open playground it once was. Now Twitter must spend 2013 proving it can innovate in-house and keep us just as entranced. Likewise, Twitter is distancing itself from Facebook. Images from the #1 platform’s Instagram don’t work on Twitter anymore, and Twitter’s rolling out their own photo filter product. Where the two have lived in a “plenty of room for everybody” symbiosis up to now, 2013 could see the giants ramping up a full-on rivalry. Twitter is exhibiting a deliberate strategy. Updates have centered on more visually appealing search results, and making finding and sharing content easier. Deals have been cut with some media entities so their content stands out. CEO Dick Costolo has said tweets aren’t the attraction, they’re what leads you to content. Twitter aims to be a key distributor of media and info. Add the addition of former News Corp. President Peter Chernin to the board, and their hashtag landing page experience for events, and their media behemoth ambitions get pretty clear. There are challenges ahead and Costolo has also laid those out; entry into China, figuring out how to have Twitter deliver both comprehensive and relevant, targeted experiences, and the visualization of big data. What does this mean for corporations? They can expect a more media-rich evolution and growing emphases on imagery. They can expect more opportunities to create great media content and leverage Twitter for its distribution. And they can expect new ways to surface in searches. Are brands diving in? 56% of customer tweets to companies get completely and totally ignored. Ugh. A study Twitter recently conducted with Compete shows people who see tweets from retailers are more likely to buy a product. And, the more retailer tweets they see, the more likely they are to purchase on the retail site. 
As more of those tweets point to engaging media content from the brand, the results should get even better. Twitter appears ready for 2013. Enterprise brands have some work to do. @mikestiles Photo: Stuart Miles, freedigitalphotos.net

    Read the article

  • OPN Specialized Latest News (15th November)

    - by swalker
    HELPING YOU TO SPECIALIZE WebCenter Implementation Specialist Exam Preparation Webcasts: WebCenter Content And WebCenter Portal Oracle Partner Network would like to invite you to refresher courses for WebCenter Content and WebCenter Portal, to help partners prepare for the WebCenter Implementation Specialist exams. This is a 3-hour intensive, partner-only refresher training session, providing attendees with an overview of WebCenter Content and WebCenter Portal functions and related topics. After the refresher part you will be able to take the relevant Implementation Specialist exam, depending on your personal focus. NOTE: This is only suitable for experienced WebCenter Content or WebCenter Portal practitioners. Who should attend? Partner consultants who want to become an Oracle WebCenter Content or WebCenter Portal Certified Implementation Specialist (or both), which will help them differentiate themselves in front of customers and support their companies in becoming Specialized. Webcast Details: Click here to read more... Specialized Partners Only! New Service to Promote Your Events The Partner Event Publisher has just been made available to all specialized partners in EMEA.  Partners now have the opportunity to publish their events to the Oracle.com/events site and spread the word on their upcoming live in-person and/or live webcast events. Click here to read more information and watch a short video demo. VADs Get Specialized Effective November 1, 2011, VADs with a valid Value Added Distributor Agreement will no longer be required to meet the customer reference requirements outlined in the business criteria section in order to become specialized. VADs must continue to meet all other business and competency criteria set forth in the applicable Knowledge Zone prior to specialization approval. New Certification: Pillar Axiom 600 Storage System Your opportunity to take the Pillar Axiom 600 Storage System Essentials (1Z0-581) exam is available now in beta. Pass the exam so you can become a Pillar Axiom 600 Storage Systems Implementation Specialist! Free vouchers are available for Oracle Partners! If you would like to receive a free beta exam voucher, please send your request to [email protected] and include your name, business email address, company, and the exam name (Pillar Axiom 600 Storage System Essentials beta exam). New Certification Available: Oracle Utilities Customer Care and Billing Oracle Utilities Customer Care and Billing 2 Essentials (1Z0-562) is a solution designed to help you meet market windows and regulatory deadlines while enjoying a low total cost of ownership and a high return on investment. Take the exam now to become an Oracle Utilities Customer Care and Billing 2 Essentials Implementation Specialist. MEASURING YOUR SUCCESS We had 1674 Specialized Partners covering 5364 Specializations. Please note that due to OPN contract renewals, at any given point in time there are valid Specialized Partners and Specializations which are temporarily not captured in the total statistics. An incremental 1961 individuals were accredited as Implementation Specialists, giving an EMEA cumulative total of 9598 Implementation Specialists. 26 ISVs obtained one or more Ready's, for a total of 53 Ready's. Don't forget! You can submit your own press releases to Oracle! Every time you achieve specialization we'd like to support you in getting the message out! Press guidelines and a submission link can be found on the OPN Portal here.

    Read the article

  • Optimal sprite size for rotations

    - by Panda Pajama
    I am making a sprite based game, and I have a bunch of images that I get in a ridiculously large resolution and I scale them to the desired sprite size (for example 64x64 pixels) before converting them to a game resource, so when I draw my sprite inside the game, I don't have to scale it. However, if I rotate this small sprite inside the game (engine agnostically), some destination pixels will get interpolated, and the sprite will look smudged. This is of course dependent on the rotation angle as well as the interpolation algorithm, but regardless, there is not enough data to correctly sample a specific destination pixel. So there are two solutions I can think of. The first is to use the original huge image, rotate it to the desired angles, and then downscale all the resulting variations, and put them in an atlas, which has the advantage of being quite simple to implement, but naively consumes twice as much sprite space for each rotation (each rotation must be inscribed in a circle whose diameter is the diagonal of the original sprite's rectangle, so its bounding square has twice the area of the original rectangle, supposing square sprites). It also has the disadvantage of only having a predefined set of rotations available, which may be okay or not depending on the game. So the other choice would be to store a larger image, and rotate and downscale while rendering, which leads to my question. What is the optimal size for this sprite? Optimal meaning that a larger image will have no effect on the resulting image. This is definitely dependent on the target sprite size and the number of desired rotations, with data loss kept below 1/256, which is the minimum representable color difference. I am looking for a theoretical, general answer to this problem, because trying a bunch of sizes may be okay, but is far from optimal.
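For reference, the bounding-box remark in the parenthesis above can be written out explicitly; this is plain geometry rather than an answer to the optimal-size question itself. For a w x h sprite rotated by an angle theta, the axis-aligned bounding box is

W(\theta) = w\,\lvert\cos\theta\rvert + h\,\lvert\sin\theta\rvert, \qquad H(\theta) = w\,\lvert\sin\theta\rvert + h\,\lvert\cos\theta\rvert

so for a square sprite of side s the worst case is \theta = 45^\circ, giving W = H = s\sqrt{2} and a bounding area of 2s^2, i.e. twice the original sprite's area.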

    Read the article

  • Paper-free Customer Engagement

    - by Michael Snow
    Appropriate repost from our friends at the AIIM blog: Digital Landfill -- John Mancini, supporting our mission of enabling customer engagement through better technology choices.  ---------- My wife didn't even give me a card for #wpfd - and they say husbands are bad at remembering anniversaries. Well, today is the third World Paper Free Day.  I just got off the Tweet Jam, and there was a host of ideas for getting rid of -- or at least reducing -- paper. When we first started talking about "paper-free" most of the reasons raised to pursue this direction were "green" reasons.  I'm glad to see that the thinking has moved on to questions about how getting rid of paper and digitizing processes helps improve customer engagement.  And the bottom line.  And process responsiveness.  Not that the "green" reasons have gone away, but it's nice to see a maturation in the BUSINESS reasons to get rid of paper. Our World Paper Free Handbook (do not, do not, do not print it!) looks at how less paper in the workplace delivers significant benefits. Key findings show eliminating paper from processes can improve the responsiveness of customer service by 300 percent. Removing paper from business processes and moving content to PCs and tablets has the added advantage of helping companies adopt mobile-enabled processes and eliminate elapsed time, lost forms, poor data and re-keying. To effectively mobile-enable processes and reduce reliance on paper, data should be captured as close to the point of origination as possible, which makes information easily available to whomever needs it, wherever they are, in the shortest time possible. This handbook summarizes the value of automating manual, paper-based processes. It then goes a step beyond to provide actionable steps that will set you on the path to productivity, profitability, and, yes, less paper.  Get your copy today and send the link around to your peers and colleagues.  Here's the link; please share it! http://www.aiim.org/Research-and-Publications/Research/AIIM-White-Papers/WPFD-Revolution-Handbook And don't miss out on the real world discussions about increasing engagement with WebCenter in new webinars being offered over the next couple of weeks:  October 30, 2012:  ResCare Solves Content Lifecycle Challenges with Oracle WebCenter November 1, 2012: WebCenter Content for Applications: Streamline Processes with Oracle WebCenter Content Management for Human Resources Applications Available On-Demand:  Using Oracle WebCenter to Content-Enable Your Business Applications

    Read the article

< Previous Page | 173 174 175 176 177 178 179 180 181 182 183 184  | Next Page >