Search Results

Search found 1046 results on 42 pages for 'forth'.

Page 27 of 42

  • Repeated installation of malicious software to do outbound DDoS attack [duplicate]

    - by user224294
    This question already has an answer here: How do I deal with a compromised server? 12 answers We have an Ubuntu Virtual Private Server hosted by a Canadian company. Our VPS was compromised and used to carry out an "outbound DDoS attack", as reported by the host's security team. There are 4 files in /boot that look like iptables binaries; note the capital letters "I" and "L". VPS:/boot# ls -lha total 1.8M drwx------ 2 root root 4.0K Jun 3 09:25 . drwxr-xr-x 22 root root 4.0K Jun 3 09:25 .. -r----x--x 1 root root 1.1M Jun 3 09:25 .IptabLes -r----x--x 1 root root 706K Jun 3 09:23 .IptabLex -r----x--x 1 root root 33 Jun 3 09:25 IptabLes -r----x--x 1 root root 33 Jun 3 09:23 IptabLex We deleted them, but after a few hours they reappeared and the attack resumed. We deleted them again; they resurfaced again, and so on and so forth. Finally we had to disable our VPS. How can we find the malicious script on the VPS that keeps reinstalling this attacking software? Thanks.

    Read the article

  • Should the MAC Tables on a switch Stack be the same between sessions?

    - by Kyle Brandt
    According to Cisco's documentation: "The MAC address tables on all stack members are synchronized. At any given time, each stack member has the same copy of the address tables for each VLAN." However, when logged into the switch I see the following: ny-swstack01#show mac ad | inc Total Total Mac Addresses for this criterion: 222 ny-swstack01#ses 2 ny-swstack01-2#show mac ad | inc Total Total Mac Addresses for this criterion: 229 ny-swstack01-2#exit ny-swstack01#ses 3 ny-swstack01-3#show mac ad | inc Total Total Mac Addresses for this criterion: 229 ny-swstack01-3#exit ny-swstack01#ses 4 ny-swstack01-4#show mac ad | inc Total Total Mac Addresses for this criterion: 235 ny-swstack01-4#exit ny-swstack01#show mac ad | inc Total Total Mac Addresses for this criterion: 222 Going back and forth, this isn't just the tables changing over time either; within certain sessions there are entries that I don't see from the master session. We are currently waiting to hear back from Cisco on this, but has anyone run into it before? I stumbled upon this when looking into Unicast flooding; one of the hosts that is a destination MAC of the flooding has a MAC entry that appears in session 3, but nowhere else. Also, I checked, and all sessions show the same aging time.

    Read the article

  • Designing a persistent asynchronous TCP protocol

    - by dogglebones
    I have got a collection of web sites that need to send time-sensitive messages to host machines all over my metro area, each on its own generally dynamic IP. Until now, I have been doing this the way of the script kiddie: Each host machine runs an (s)FTP server, or an HTTP(s) server, and correspondingly has a certain port opened up by its gateway. Each host machine runs a program that watches a certain folder and automatically opens or prints or exec()s when a new file of a given extension shows up. Dynamic IP addresses are accommodated using a dynamic DNS service. Each web site does cURL or fsockopen or whatever and communicates directly with its recipient as needed. This approach has been surprisingly reliable; however, obvious issues have come up and the situation needs to be addressed. As stated, these messages are time-sensitive and failures need to be detected within minutes of submission by end users. What I'm doing is building a messaging protocol. It will run on a machine and connection in my control. As far as the service is concerned, there is no distinction between web site and host machine -- there is only one device sending a message to another device. So that's where I'm at right now. I've got a skeleton server and a skeleton client. They can negotiate high-quality authentication and encryption. The (TCP) connection is persistent and asynchronous, and can handle delimited (i.e., read until \r\n or whatever) as well as length-prefixed (i.e., read exactly n bytes) messages. Unless somebody gives me a better idea, I think I'll handle messages as byte arrays. So I'm looking for suggestions on how to model the protocol itself -- at the application level. I'll mostly be transferring XML and DLM type files, as well as control messages for things like "handshake" and "is so-and-so online?" and so forth. Is there anything really stupid in my train of thought? Or anything I should read about before I get started? Stuff like that -- please and thanks.
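    As a rough illustration of the length-prefixed style of message the poster describes ("read exactly n bytes"), here is a minimal C# framing sketch. It is not the poster's protocol -- the 4-byte big-endian header and the Framing class name are assumptions made purely for illustration.

    using System;
    using System.IO;

    static class Framing
    {
        // Write one message: 4-byte big-endian length prefix, then the payload bytes.
        // (The stream would typically be the NetworkStream of a connected TcpClient.)
        public static void WriteMessage(Stream stream, byte[] payload)
        {
            byte[] header = BitConverter.GetBytes(payload.Length);
            if (BitConverter.IsLittleEndian) Array.Reverse(header); // network byte order
            stream.Write(header, 0, header.Length);
            stream.Write(payload, 0, payload.Length);
            stream.Flush();
        }

        // Read one message: read the 4-byte prefix, then exactly that many payload bytes.
        public static byte[] ReadMessage(Stream stream)
        {
            byte[] header = ReadExactly(stream, 4);
            if (BitConverter.IsLittleEndian) Array.Reverse(header);
            int length = BitConverter.ToInt32(header, 0);
            return ReadExactly(stream, length);
        }

        // Stream.Read may return fewer bytes than requested, so loop until all have arrived.
        private static byte[] ReadExactly(Stream stream, int count)
        {
            byte[] buffer = new byte[count];
            int offset = 0;
            while (offset < count)
            {
                int read = stream.Read(buffer, offset, count - offset);
                if (read == 0) throw new EndOfStreamException("Connection closed mid-message.");
                offset += read;
            }
            return buffer;
        }
    }

    Delimited messages would instead scan the incoming bytes for \r\n; either way the byte-array payload maps cleanly onto the XML/DLM files and control messages mentioned above.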

    Read the article

  • System Issues and Major Malfunctions after Failed Hibernation Exit

    - by Sarah Seguin
    I have a HP G71-340US that went into hibernation mode for a while, and when I tried coming out of it, I got an error message: Your computer cannot come out of hibernation. Status: 0xc000009a Info: A fatal error occurred processing the restoration data. File: \hiberfil.sys Any information that was not saved before the computer went into hibernation will be lost enter=continue So I hit continue and it ran extremely slowly. It was seriously crawling. Finally I gave up and turned it off manually (i.e. press and hold the button). It's been a week or two since then and EVERY SINGLE TIME I have tried to do ANYTHING it takes forever. When I say forever, I literally mean it takes 5-7 minutes to load the internet, then the page itself, then to click a link, and so on and so forth. Eventually everything just goes not responding and I have to give up (4-6 HOURS later). I also cannot access my thumb/jump drives once I've managed to load Windows. I was going to try running Malwarebytes in case of a virus, but Windows Explorer develops errors and goes not responding on me. Currently I'm running chkdsk and nearly every file is coming back unreadable. I let it run the last 2 hours straight in chkdsk and I'm only at 6 percent with around 500+ errors and still going. Yes, I've taken logs of the errors via cell phone camera and patience. A week or two prior to this happening I had to change out the hard drive due to blunt force trauma next to the mouse. OH! Running on Windows 7 :) And I've tried loading the computer in safe mode and it makes absolutely no difference. Any and all help would be appreciated. I really don't know what to do from here and I'm kind of freaking out. I've googled different parts of the error and things that I've done/seen, and there are so many different answers/topics that I thought it best to just post the question.

    Read the article

  • Understanding the MySQL Query Cache and when to implement it?

    - by Jeff
    On our current MySQL server the query cache is enabled. Qcache_hits: 31913 Qcache_inserts: 50959 Qcache_lowmem_prunes: 9320 Qcache_not_cached: 209320 Qcache_queries_in_cache: 986 Com_update: 0 Com_delete: 0 I do not fully understand the query cache - I am reading about it currently and trying to understand it. Our database holds inventory data, customer data, employee data, sales data and so forth. A given query is very rarely run more than once; the most likely case of a query running twice is someone viewing the same sales information twice. But basically everything in our system changes constantly. It is always being updated, deleted or inserted, and off the top of my head I can't picture users running the same query twice within a week. Do I even need to have the query cache enabled? I am guessing that the inserts mean 51k entries have been added, but only 986 of those are being stored? Would an idea be to flush the cache, watch it for a week, and check how many of the queries in the cache are actually accessed, to see if it is returning any benefit? Any help/guidance on this is appreciated, thanks
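    If the poster does decide to watch the counters over a week, a small C# sketch along these lines could sample them; the connection string is a placeholder, and the hit-rate formula (hits versus hits + inserts + not_cached) is only a rough rule of thumb rather than an official MySQL metric.

    using System;
    using MySql.Data.MySqlClient;   // MySQL Connector/NET

    class QcacheMonitor
    {
        static void Main()
        {
            // Placeholder connection string -- substitute real host and credentials.
            using (var conn = new MySqlConnection("server=localhost;uid=monitor;pwd=secret;"))
            {
                conn.Open();
                long hits = 0, inserts = 0, notCached = 0;
                var cmd = new MySqlCommand("SHOW GLOBAL STATUS LIKE 'Qcache%'", conn);
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        string name = reader.GetString(0);
                        long value = Convert.ToInt64(reader.GetString(1));
                        if (name == "Qcache_hits") hits = value;
                        else if (name == "Qcache_inserts") inserts = value;
                        else if (name == "Qcache_not_cached") notCached = value;
                    }
                }
                // Rough estimate of how much SELECT traffic is actually served from the cache.
                double hitRate = (double)hits / (hits + inserts + notCached);
                Console.WriteLine("Approximate query cache hit rate: {0:P1}", hitRate);
            }
        }
    }

    Sampling this once a day for a week would show whether the hit rate ever climbs above a few percent; with a workload that is mostly writes and unrepeated reads, it may not.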

    Read the article

  • How to export VARs from a subshell to a parent shell?

    - by webwesen
    I have a Korn shell script #!/bin/ksh # set the right ENV case $INPUT in abc) export BIN=${ABC_BIN} ;; def) export BIN=${DEF_BIN} ;; *) export BIN=${BASE_BIN} ;; esac # exit 0 <- bad idea for sourcing the file Now these VARs are export'ed only in a subshell, but I want them to be set in my parent shell as well, so that when I am back at the prompt those vars are still set correctly. I know about . .myscript.sh, but is there a way to do it without 'sourcing'? My users often forget to 'source'. EDIT1: removed the "exit 0" part - this was just me typing without thinking first EDIT2: to add more detail on why I need this: my developers write code for (for simplicity's sake) 2 apps: ABC & DEF. Each app is run in production by a separate user, usrabc and usrdef, and each has set up its own $BIN, $CFG, $ORA_HOME, whatever - specific to its app. So ABC's $BIN = /opt/abc/bin # $ABC_BIN in the above script DEF's $BIN = /opt/def/bin # $DEF_BIN etc. Now, on the dev box developers can develop both ABC and DEF at the same time under their own user account 'justin_case', and I make them source the file (above) so that they can switch their ENV var settings back and forth ($BIN should point to $ABC_BIN at one time, and then I need to switch to $BIN=$DEF_BIN). The script should also create new sandboxes for parallel development of the same app, which forces me to do it interactively, asking for a sandbox name, and so on. /home/justin_case/sandbox_abc_beta2 /home/justin_case/sandbox_abc_r1 /home/justin_case/sandbox_def_r1 The other option I have considered is writing aliases and adding them to every user's profile: alias 'setup_env=. .myscript.sh' and running it with setup_env parameter1 ... parameterX. This makes more sense to me now

    Read the article

  • How do I disable DirectDraw and Direct3D acceleration on Windows 8? [closed]

    - by Favourite Chigozie Onwuemene
    Some old games that I would really like to play run slowly on some graphics drivers when Direct3D acceleration is enabled. I have tried many suggestions but none seems to work. The only thing I have not tried is disabling Direct3D acceleration. Is it possible to disable DirectDraw and Direct3D acceleration on my Windows 8 PC? There are certain bad versions of GeForce drivers that cause this problem. This is a problem in the drivers themselves and is unfortunately completely outside our control. The recommended way to fix this problem is to update your graphics card drivers (go to NVIDIA's web site for this). Alternatively, there is a workaround that alleviates or solves the slowdown problem altogether. Try this: Right-click on your desktop and select "Properties". Go to the "Settings" tab and click on "Advanced...". Click on the "Troubleshooting" tab and move the slider to the left until it says that all DirectDraw and Direct3D accelerations have been disabled (around the middle of the range). Finally, click on "OK". Note that this workaround might cause other games on your computer to slow down, so you may have to switch back and forth between settings, but it's certainly worth a try if you can't obtain an updated graphics driver. -source: interactionstudios

    Read the article

  • Handling WCF Service Paths in Silverlight 4 – Relative Path Support

    - by dwahlin
    If you’re building Silverlight applications that consume data then you’re probably making calls to Web Services. We’ve been successfully using WCF along with Silverlight for several client Line of Business (LOB) applications and passing a lot of data back and forth. Due to the pain involved with updating the ServiceReferences.ClientConfig file generated by a Silverlight service proxy (see Tim Heuer’s post on that subject to see different ways to deal with it) we’ve been using our own technique to figure out the service URL. Going that route makes it a peace of cake to switch between development, staging and production environments. To start, we have a ServiceProxyBase class that handles identifying the URL to use based on the XAP file’s location (this assumes that the service is in the same Web project that serves up the XAP file). The GetServiceUrlBase() method handles this work: public class ServiceProxyBase { public ServiceProxyBase() { if (!IsDesignTime) { ServiceUrlBase = GetServiceUrlBase(); } } public string ServiceUrlBase { get; set; } public static bool IsDesignTime { get { return (Application.Current == null) || (Application.Current.GetType() == typeof (Application)); } } public static string GetServiceUrlBase() { if (!IsDesignTime) { string url = Application.Current.Host.Source.OriginalString; return url.Substring(0, url.IndexOf("/ClientBin", StringComparison.InvariantCultureIgnoreCase)); } return null; } } Silverlight 4 now supports relative paths to services which greatly simplifies things.  We changed the code above to the following: public class ServiceProxyBase { private const string ServiceUrlPath = "../Services/JobPlanService.svc"; public ServiceProxyBase() { if (!IsDesignTime) { ServiceUrl = ServiceUrlPath; } } public string ServiceUrl { get; set; } public static bool IsDesignTime { get { return (Application.Current == null) || (Application.Current.GetType() == typeof (Application)); } } public static string GetServiceUrl() { if (!IsDesignTime) { return ServiceUrlPath; } return null; } } Our ServiceProxy class derives from ServiceProxyBase and handles creating the ABC’s (Address, Binding, Contract) needed for a WCF service call. Looking through the code (mainly the constructor) you’ll notice that the service URI is created by supplying the base path to the XAP file along with the relative path defined in ServiceProxyBase:   public class ServiceProxy : ServiceProxyBase, IServiceProxy { private const string CompletedEventargs = "CompletedEventArgs"; private const string Completed = "Completed"; private const string Async = "Async"; private readonly CustomBinding _Binding; private readonly EndpointAddress _EndPointAddress; private readonly Uri _ServiceUri; private readonly Type _ProxyType = typeof(JobPlanServiceClient); public ServiceProxy() { _ServiceUri = new Uri(Application.Current.Host.Source, ServiceUrl); var elements = new BindingElementCollection { new BinaryMessageEncodingBindingElement(), new HttpTransportBindingElement { MaxBufferSize = 2147483647, MaxReceivedMessageSize = 2147483647 } }; // order of entries in collection is significant: dumb _Binding = new CustomBinding(elements); _EndPointAddress = new EndpointAddress(_ServiceUri); } #region IServiceProxy Members /// <summary> /// Used to call a WCF service operation. 
/// </summary> /// <typeparam name="T">The type of EventArgs that will be returned by the service operation.</typeparam> /// <param name="callback">The method to call once the WCF call returns (the callback).</param> /// <param name="parameters">Any parameters that the service operation expects.</param> public void CallService<T>(EventHandler<T> callback, params object[] parameters) where T : EventArgs { try { var proxy = new JobPlanServiceClient(_Binding, _EndPointAddress); string action = typeof (T).Name.Replace(CompletedEventargs, String.Empty); _ProxyType.GetEvent(action + Completed).AddEventHandler(proxy, callback); _ProxyType.InvokeMember(action + Async, BindingFlags.InvokeMethod, null, proxy, parameters); } catch (Exception exp) { MessageBox.Show("Unable to use ServiceProxy.CallService to retrieve data: " + exp.Message); } } #endregion } The relative path support for calling services in Silverlight 4 definitely simplifies code and is yet another good reason to move from Silverlight 3 to Silverlight 4.   For more information about onsite, online and video training, mentoring and consulting solutions for .NET, SharePoint or Silverlight please visit http://www.thewahlingroup.com.

    Read the article

  • TestDriven.Net 3.0 – All Systems Go

    - by Jamie Cansdale
    I’m pleased to announce that TestDriven.Net 3.0 is now available. Finally! I know many of you will already be using the Beta and RC versions, but if you look at the release notes you’ll see there’s been many refinements since then, so I highly recommend you install the RTM version. Here is a quick summary of a few new features: Visual Studio 2010 supports targeting multiple versions of the .NET framework (multi-targeting). This means you can easily upgrade your Visual Studio 2005/2008 solutions without necessarily converting them to use .NET 4.0. TestDriven.Net will execute your tests using the .NET version your test project is targeting (see ‘Properties > Application > Target framework’). There is now first class support for MSTest when using Visual Studio 2008 & 2010. Previous versions of TestDriven.Net had support for a limited number of MSTest attributes. This version supports virtually all MSTest unit testing related attributes, including support for deployment item and data driven test attributes. You should also find this test runner is quick. ;) There is a new ‘Go To Test/Code’ command on the code context menu. You can think of this as Ctrl-Tab for test driven developers; it will quickly flip back and forth between your tests and code under test. I recommend assigning a keyboard shortcut to the ‘TestDriven.NET.GoToTestOrCode’ command. NCover can now be used for code coverage on .NET 4.0. This is only officially supported since NCover 3.2 (your mileage may vary if you’re using the 1.5.8 version). Rather than clutter the ‘Output’ window, ignored or skipped tests will be placed on the ‘Task List’. You can double-click on these items to navigate to the offending test (or assign a keyboard shortcut to ‘View.NextTask’). If you’re using a Team, Premium or Ultimate edition of Visual Studio 2005-2010, a new ‘Test With > Performance’ command will be available. This command will perform instrumented performance profiling on your target code. A particular focus of this version has been to make it more keyboard friendly. Here’s a list of commands you will probably want to assign keyboard shortcuts to: Name Default What I use TestDriven.NET.RunTests Run tests in context   Alt + T TestDriven.NET.RerunTests Repeat test run   Alt + R TestDriven.NET.GoToTestOrCode Flip between tests and code   Alt + G TestDriven.NET.Debugger Run tests with debugger   Alt + D View.Output Show the ‘Output’ window Ctrl+ Alt + O   Edit.BreakLine Edit code in stack trace Enter   View.NextError Jump to next failed test Ctrl + Shift + F12   View.NextTask Jump to next skipped test   Alt + S   By default the ‘Output’ window will automatically activate when there is test output or a failed test (this is an option). The cursor will be positioned on the stack trace of the last failed test, ready for you to hit ‘Enter’ to jump to the fail point or ‘Esc’ to return to your source (assuming your ‘Output’ window is set to auto-hide).  If your ‘Output’ window isn’t set to auto-hide, you’ll need to hit ‘Ctrl + Alt + O’ then ‘Enter’. Alternatively you can use ‘Ctrl + Shift + F12’ (View.NextError) to navigate between all failed tests.   For more frequent updates or to give feedback, you can find me on twitter here. I hope you enjoy this version. Let me know how you get on. :)

    Read the article

  • Write, Read and Update Oracle CLOBs with PL/SQL

    - by robertphyatt
    Fun with CLOBS! If you are using Oracle, if you have to deal with text that is over 4000 bytes, you will probably find yourself dealing with CLOBs, which can go up to 4GB. They are pretty tricky, and it took me a long time to figure out these lessons learned. I hope they will help some down-trodden developer out there somehow. Here is my original code, which worked great on my Oracle Express Edition: (for all examples, the first one writes a new CLOB, the next one Updates an existing CLOB and the final one reads a CLOB back) CREATE OR REPLACE PROCEDURE PRC_WR_CLOB (        p_document      IN VARCHAR2,        p_id            OUT NUMBER) IS      lob_loc CLOB; BEGIN    INSERT INTO TBL_CLOBHOLDERDDOC (CLOBHOLDERDDOC)        VALUES (empty_CLOB())        RETURNING CLOBHOLDERDDOC, CLOBHOLDERDDOCID INTO lob_loc, p_id;    DBMS_LOB.WRITE(lob_loc, LENGTH(UTL_RAW.CAST_TO_RAW(p_document)), 1, UTL_RAW.CAST_TO_RAW(p_document)); END; / CREATE OR REPLACE PROCEDURE PRC_UD_CLOB (        p_document      IN VARCHAR2,        p_id            IN NUMBER) IS      lob_loc CLOB; BEGIN        SELECT CLOBHOLDERDDOC INTO lob_loc FROM TBL_CLOBHOLDERDDOC        WHERE CLOBHOLDERDDOCID = p_id FOR UPDATE;    DBMS_LOB.WRITE(lob_loc, LENGTH(UTL_RAW.CAST_TO_RAW(p_document)), 1, UTL_RAW.CAST_TO_RAW(p_document)); END; / CREATE OR REPLACE PROCEDURE PRC_RD_CLOB (    p_id IN NUMBER,    p_clob OUT VARCHAR2) IS    lob_loc  CLOB; BEGIN    SELECT CLOBHOLDERDDOC INTO lob_loc    FROM   TBL_CLOBHOLDERDDOC    WHERE  CLOBHOLDERDDOCID = p_id;    p_clob := UTL_RAW.CAST_TO_VARCHAR2(DBMS_LOB.SUBSTR(lob_loc, DBMS_LOB.GETLENGTH(lob_loc), 1)); END; / As you can see, I had originally been casting everything back and forth between RAW formats using the UTL_RAW.CAST_TO_VARCHAR2() and UTL_RAW.CAST_TO_RAW() functions all over the place, but it had the nasty side effect of working great on my Oracle express edition on my developer box, but having all the CLOBs above a certain size display garbage when read back on the Oracle test database server . So...I kept working at it and came up with the following, which ALSO worked on my Oracle Express Edition on my developer box:   CREATE OR REPLACE PROCEDURE PRC_WR_CLOB (     p_document      IN VARCHAR2,     p_id        OUT NUMBER) IS       lob_loc CLOB; BEGIN     INSERT INTO TBL_CLOBHOLDERDOC (CLOBHOLDERDOC)         VALUES (empty_CLOB())         RETURNING CLOBHOLDERDOC, CLOBHOLDERDOCID INTO lob_loc, p_id;     DBMS_LOB.WRITE(lob_loc, LENGTH(p_document), 1, p_document);   END; / CREATE OR REPLACE PROCEDURE PRC_UD_CLOB (     p_document      IN VARCHAR2,     p_id        IN NUMBER) IS       lob_loc CLOB; BEGIN     SELECT CLOBHOLDERDOC INTO lob_loc FROM TBL_CLOBHOLDERDOC     WHERE CLOBHOLDERDOCID = p_id FOR UPDATE;     DBMS_LOB.WRITE(lob_loc, LENGTH(p_document), 1, p_document); END; / CREATE OR REPLACE PROCEDURE PRC_RD_CLOB (     p_id IN NUMBER,     p_clob OUT VARCHAR2) IS     lob_loc  CLOB; BEGIN     SELECT CLOBHOLDERDOC INTO lob_loc     FROM   TBL_CLOBHOLDERDOC     WHERE  CLOBHOLDERDOCID = p_id;     p_clob := DBMS_LOB.SUBSTR(lob_loc, DBMS_LOB.GETLENGTH(lob_loc), 1); END; / Unfortunately, by changing my code to what you see above, even though it kept working on my Oracle express edition, everything over a certain size just started truncating after about 7950 characters on the test server! 
Here is what I came up with in the end, which is actually the simplest solution and this time worked on both my express edition and on the database server (note that only the read function was changed to fix the truncation issue, and that I had Oracle worry about converting the CLOB into a VARCHAR2 internally): CREATE OR REPLACE PROCEDURE PRC_WR_CLOB (        p_document      IN VARCHAR2,        p_id            OUT NUMBER) IS      lob_loc CLOB; BEGIN    INSERT INTO TBL_CLOBHOLDERDDOC (CLOBHOLDERDDOC)        VALUES (empty_CLOB())        RETURNING CLOBHOLDERDDOC, CLOBHOLDERDDOCID INTO lob_loc, p_id;    DBMS_LOB.WRITE(lob_loc, LENGTH(p_document), 1, p_document); END; / CREATE OR REPLACE PROCEDURE PRC_UD_CLOB (        p_document      IN VARCHAR2,        p_id            IN NUMBER) IS      lob_loc CLOB; BEGIN        SELECT CLOBHOLDERDDOC INTO lob_loc FROM TBL_CLOBHOLDERDDOC        WHERE CLOBHOLDERDDOCID = p_id FOR UPDATE;    DBMS_LOB.WRITE(lob_loc, LENGTH(p_document), 1, p_document); END; / CREATE OR REPLACE PROCEDURE PRC_RD_CLOB (    p_id IN NUMBER,    p_clob OUT VARCHAR2) IS BEGIN    SELECT CLOBHOLDERDDOC INTO p_clob    FROM   TBL_CLOBHOLDERDDOC    WHERE  CLOBHOLDERDDOCID = p_id; END; /   I hope that is useful to someone!

    Read the article

  • Apply Skins to Add Some Flair to Windows Media Player 12

    - by DigitalGeekery
    Tired of the same look and feel of Windows Media Player in Windows 7? We’ll show you how to inject new life into your media experience by applying skins in WMP 12. Adding Skins In Library view, click on View from the Menu and select Skin Chooser. By default, WMP 12 comes with only a couple of modest skins. When you select a skin from the left pane, a preview will be displayed to the right. To apply one of the skins, simply select it from the pane on the left and click Apply Skin. You can also switch to the currently selected skin in the Skin chooser by selecting Skin from the View menu, or by pressing Ctrl + 2. Media Player will open in Now Playing mode. Click on the Switch to Library button at the top left to return to Library view. Ok, so the included skins are a little boring. You can find additional skins by selecting Tools > Download > Skins. Or, by clicking on More Skins from within the Skin chooser. You will be taken to the Microsoft website where you can choose from dozens of skins to download and install. Select a skin you’d like to try and click the link to download. If prompted with a warning message about files containing scripts that access your library, click Yes. Note: These warning boxes may look a bit different depending on your browser. We are using Chrome for this example. Click on View Now. Your new skin will be on display. To get back to the Library mode, find and click the Return to Full Mode button. Some skins may launch video in a separate window. If you want to delete one of the skins, select it from the list within the Skin chooser and click the red “X.” You can also press the delete key on your keyboard. Then click Yes to confirm. Conclusion Using skins is a quick and easy way to add some style to Windows Media Player, and switching back and forth between skins is a breeze. Regardless of your interests, you are sure to find a skin that fits your tastes. You may find WMP skins on other sites, but sticking with Microsoft’s website will ensure maximum compatibility. Skins for Windows Media Player

    Read the article

  • Silverlight Cream for March 26, 2010 -- #821

    - by Dave Campbell
    In this Issue: Max Paulousky, Christian Schormann, John Papa, Phani Raj, David Anson(-2-, -3-), Brad Abrams(-2-), and Jeff Wilcox(-2-, -3-). Shoutouts: Jeff Wilcox posted his material from mix and some preview TestFramework bits: Unit Testing Silverlight & Windows Phone Applications – talk now online At MIX10, Jeff Wilcox demo'd an app called "Peppermint"... here's the bleeding edge demo: “Peppermint” MIX demo sources Erik Mork and Co. have put out their weekly This Week In Silverlight 3.25.2010 Brad Abrams has all his materials posted for his MIX10 session Mix2010: Search Engine Optimization (SEO) for Microsoft Silverlight... including play-by-play of the demo and all source. Do you use Rooler? Well you should! Watch a video Pete Brown did with Pete Blois on Expression Blend, Windows Phone, Rooler Interested in Silverlight and XNA for WP7? Me too! Michael Klucher has a post outlining the two: Silverlight and XNA Framework Game Development and Compatibility From SilverlightCream.com: Modularity in Silverlight Applications - An Issue With ModuleInitializeException Max Paulousky has a truly ugly error trace listed by way of not having a reference listed, and the obvious simple solution. Next time he'll talk about the difficult situations. Using SketchFlow to Prototype for Windows Phone Christian Schormann has a tutorial up on using Expression Blend to develop for WP7 ... who better than Christian for that task?? Silverlight TV 18: WCF RIA Services Validation John Papa held forth with Nikhil Kothari on WCF RIA Services and validation just prior to MIX10, and was posted yesterday. Building SL3 applications using OData client Library with Vs 2010 RC Phani Raj walks through building an OData consumer in SL3, the first problem you're going to hit, and the easy solution to it. Tip: When creating a DependencyProperty, follow the handy convention of "wrapper+register+static+virtual" David Anson has a couple more of his 'Tips' up... this first is about Dependency Properties again... having a good foundation for all your Dependency Properties is a great way to avoid problems. Tip: Do not assign DependencyProperty values in a constructor; it prevents users from overriding them In the next post, David Anson talks about not assigning Dependency Property values in a constructor and gives one of the two ways to get around doing so. Tip: Set DependencyProperty default values in a class's default style if it's more convenient In his latest post, David Anson gives the second way to avoid setting a Dependency Property value in the constructor. Silverlight 4 + RIA Services - Ready for Business: Search Engine Optimization (SEO) Brad Abrams Abrams adds SEO to the tutorial series he's doing. He begins with his PDC09 session material on the subject and then takes off on a great detailed tutorial all with source. Silverlight 4 + RIA Services - Ready for Business: Localizing Business Application Brad Abrams then discusses localization and Silverlight in another detailed tutorial with all code included. Silverlight Toolkit and the Windows Phone: WrapPanel, and a few others Jeff Wilcox has a few WP7 posts I'm going to push today. 
This first is from earlier this week and is about using the Toolkit in WP7 and better than that, he includes the bits you need if all you want is the WrapPanel Data binding user settings in Windows Phone applications In the next one from yesterday, Jeff Wilcox demonstrates saving some user info in Isolated Storage to improve the user experience, and shares all the necessary plumbing files, and other external links as well. Displaying 2D QR barcodes in Windows Phone applications In a post from today, Jeff Wilcox ported his Silverlight 2D QR Barcode app from last year into WP7 ... just very cool... get the source and display your Microsoft Tag. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone    MIX10

    Read the article

  • Keep a Window on top with a handy AutoHotkey script

    - by Matthew Guay
    Are you tired of shuffling back and forth between windows to get your work done?  Here’s a handy tool that lets you keep any window always on top when you need it. There are many ways to use multiple windows efficiently, but sometimes it seems you need to keep a smaller one in front of a larger window and they never quite fit right.  Whether you’re trying to use Calculator and a web form at the same time, or see what music is playing while you’re catching up on your news, there’s many scenarios where it can be useful to keep one window always on top.  There are many utilities to do this, but they are often needlessly complicated and bloated.  Here we look at a better solution from Amit, our friend at Digital Inspiration. Always on Top Thanks to AutoHotkey, you can easily always keep any window on top of all the others on your screen.  You can download this as a small exe and run it directly, or can create it with a simple script in AutoHotkey.  For simplicity, we simply downloaded the application and ran it directly. To do this, download Always on Top (link below), and unzip the file. Once you’ve launched it, simply select the window you want to keep on top and press Ctrl+Space.  This program will now stay in front, even when it is not the active window.  Here’s a screenshot of a Hotmail signup dialog in Chrome with Notepad kept on top.  Notice Notepad isn’t the active application, but it is still on top. If you wish to un-pin the window from being on top, simply select the window and press Ctrl+space again.  You can keep multiple windows pinned at once, too, though you may clutter your desktop quickly! Always on Top will keep running in your system tray, and you can exit or suspend it by right-clicking on its tray icon and selecting exit or suspend, respectively. Create Your Own Always on Top Utility with AutoHotkey If you’re a fan of AutoHotkey, you can create your own AutoHotkey script to keep windows on top simply and easily with only one line of code: ^SPACE:: Winset, Alwaysontop, , A Simply create a new file, insert the code, and save it as plaintext with the .ahk file extension.  If you have AutoHotkey installed, simply double-click this file for the exact same functionality as the premade version. Conclusion This is a great way to keep a window handy, and it can be beneficial in many scenarios.  For instance you can use it to copy data from a PDF or image into a form or spreadsheet, and it saves a lot of clicks and time.  
    Links: Download Always on Top from Digital Inspiration Download AutoHotkey if you want to make it yourself

    Read the article

  • Visual Studio 2010 Beta 2 Startup Failures

    - by Rick Strahl
    I’ve been working with VS 2010 Beta 2 for a while now and while it works Ok most of the time it seems the environment is very, very fragile when it comes to crashes and installed packages. Specifically I’ve been working just fine for days, then when VS 2010 crashes it will not re-start. Instead I get the good old Application cannot start dialog: Other failures I’ve seen bring forth other just as useful dialogs with information overload like Operation cannot be performed which for me specifically happens when trying to compile any project. After a bit of digging around and a post to Microsoft Connect the solution boils down to resetting the VS.NET environment. The Application Cannot Start issue stems from a package load failure of some sort, so the work around for this is typically: c:\program files\Visual Studio 2010\Common7\IDE\devenv.exe /ResetSkipPkgs In most cases that should do the trick. If it doesn’t and the error doesn’t go away the more drastic: c:\program files\Visual Studio 2010\Common7\IDE\devenv.exe /ResetSettings is required which resets all settings in VS to its installation defaults. Between these two I’ve always been able to get VS to startup and run properly. BTW it’s handy to keep a list of command line options for Visual Studio around: http://msdn.microsoft.com/en-us/library/xee0c8y7%28VS.100%29.aspx Note that the /? option in VS 2010 doesn’t display all the options available but rather displays the ‘demo version’ message instead, so the above should be helpful. Also note that unless you install Visual C++ the Visual Studio Command Prompt icon is not automatically installed so you may have to navigate manually to the appropriate folder above. Cannot Build Failures If you get the Cannot compile error dialog, there is another thing that have worked for me: Change your project build target from Debug to Release (or whatever – just change it) and compile again. If that doesn’t work doing the reset steps above will do it for me. It appears this failure comes from some sort of interference of other versions of Visual Studio installed on the system and running another version first. Resetting the build target explicitly seems to reset the build providers to a normalized state so that things work in many cases. But not all. Worst case – resetting settings will do it. The bottom line for working in VS 2010 has been – don’t get too attached to your custom settings as they will get blown away quite a bit. I’ve probably been through 20 or more of these VS resets although I’ve been working with it quite a bit on an internal project. It’s kind of frustrating to see this kind of high level instability in a Beta 2 product which is supposedly the last public beta they will put out. On the other hand this beta has been otherwise rather stable and performance is roughly equivalent to VS 2008. Although I mention the crash above – crashes I’ve seen have been relatively rare and no more frequent than in VS 2008 it seems. Given the drastic UI changes in VS 2010 (using WPF for the shell and editor) I’m actually impressed that the product is as stable as it is at this point. Also I was seriously worried about text quality going to a WPF model, but thankfully WPF 4.0 addresses the blurry text issue with native font rendering to render text on non-cleartype enabled systems crisply. Anyway I hope that these notes are helpful to some of you playing around with the beta and running into problems. Hopefully you won’t need them :-}© Rick Strahl, West Wind Technologies, 2005-2010

    Read the article

  • Converting Openfire IM datetime values in SQL Server to / from VARCHAR(15) and DATETIME data types

    - by Brian Biales
    A client is using Openfire IM for their users, and would like some custom queries to audit user conversations (which are stored by Openfire in tables in the SQL Server database). Because Openfire supports multiple database servers and multiple platforms, the designers chose to store all date/time stamps in the database as 15 character strings, which get converted to Java Date objects in their code (Openfire is written in Java).  I did some digging around, and, so I don't forget and in case someone else will find this useful, I will put the simple algorithms here for converting back and forth between SQL DATETIME and the Java string representation. The Java string representation is the number of milliseconds since 1/1/1970.  SQL Server's DATETIME is actually represented as a float, the value being the number of days since 1/1/1900, the portion after the decimal point representing the hours/minutes/seconds/milliseconds... as a fractional part of a day.  Try this and you will see this is true:     SELECT CAST(0 AS DATETIME) and you will see it returns the date 1/1/1900. The difference in days between SQL Server's 0 date of 1/1/1900 and the Java representation's 0 date of 1/1/1970 is found easily using the following SQL:   SELECT DATEDIFF(D, '1900-01-01', '1970-01-01') which returns 25567.  There are 25567 days between these dates. So to convert from the Java string to SQL Server's date time, we need to convert the number of milliseconds to a floating point representation of the number of days since 1/1/1970, then add the 25567 to change this to the number of days since 1/1/1900.  To convert to days, you need to divide the number by 1000 ms/s, then by  60 seconds/minute, then by 60 minutes/hour, then by 24 hours/day.  Or simply divide by 1000*60*60*24, or 86400000.   So, to summarize, we need to cast this string as a float, divide by 86400000 milliseconds/day, then add 25567 days, and cast the resulting value to a DateTime.  Here is an example:   DECLARE @tmp as VARCHAR(15)   SET @tmp = '1268231722123'   SELECT @tmp as JavaTime, CAST((CAST(@tmp AS FLOAT) / 86400000) + 25567 AS DATETIME) as SQLTime   To convert from SQL datetime back to the Java time format is not quite as simple, I found, because floats of that size do not convert nicely to strings, they end up in scientific notation using the CONVERT function or CAST function.  But I found a couple ways around that problem. You can convert a date to the number of  seconds since 1/1/1970 very easily using the DATEDIFF function, as this value fits in an Int.  If you don't need to worry about the milliseconds, simply cast this integer as a string, and then concatenate '000' at the end, essentially multiplying this number by 1000, and making it milliseconds since 1/1/1970.  If, however, you do care about the milliseconds, you will need to use DATEPART to get the milliseconds part of the date, cast this integer to a string, and then pad zeros on the left to make sure this is three digits, and concatenate these three digits to the number of seconds string above.  And finally, I discovered by casting to DECIMAL(15,0) then to VARCHAR(15), I avoid the scientific notation issue.  So here are all my examples, pick the one you like best... 
    First, here is the simple approach if you don't care about the milliseconds:   DECLARE @tmp as VARCHAR(15)   DECLARE @dt as DATETIME   SET @dt = '2010-03-10 14:35:22.123'   SET @tmp = CAST(DATEDIFF(s, '1970-01-01 00:00:00' , @dt) AS VARCHAR(15)) + '000'   SELECT @tmp as JavaTime, @dt as SQLTime If you want to keep the milliseconds:   DECLARE @tmp as VARCHAR(15)   DECLARE @dt as DATETIME   DECLARE @ms as int   SET @dt = '2010-03-10 14:35:22.123'   SET @ms = DATEPART(ms, @dt)   SET @tmp = CAST(DATEDIFF(s, '1970-01-01 00:00:00' , @dt) AS VARCHAR(15))           + RIGHT('000' + CAST(@ms AS VARCHAR(3)), 3)   SELECT @tmp as JavaTime, @dt as SQLTime Or, in one fell swoop:   DECLARE @dt as DATETIME   SET @dt = '2010-03-10 14:35:22.123'   SELECT @dt as SQLTime     , CAST(DATEDIFF(s, '1970-01-01 00:00:00' , @dt) AS VARCHAR(15))           + RIGHT('000' + CAST( DATEPART(ms, @dt) AS VARCHAR(3)), 3) as JavaTime   And finally, a way to simply reverse the math used converting from Java date to SQL date. Note the parentheses - watch out for operator precedence, you want to subtract, then multiply:   DECLARE @dt as DATETIME   SET @dt = '2010-03-10 14:35:22.123'   SELECT @dt as SQLTime     , CAST(CAST((CAST(@dt as Float) - 25567.0) * 86400000.0 as DECIMAL(15,0)) as VARCHAR(15)) as JavaTime Interestingly, I found that converting back to a SQL datetime can lose some accuracy: when I converted the time above to Java time and then converted that back to DATETIME, the number of milliseconds was 120, not 123.  As I am not interested in the milliseconds, this is ok for me.  But you may want to look into using DateTime2 in SQL Server 2008 for more accuracy.
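    For readers cross-checking the arithmetic from managed code, here is a small C# sketch of the same 1/1/1970 offset; it is not part of the original post, just the epoch math expressed in .NET terms.

    using System;

    class EpochDemo
    {
        static void Main()
        {
            // Openfire stores the number of milliseconds since 1/1/1970 (the Java epoch).
            DateTime javaEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

            long javaTime = 1268231722123;                  // the sample value used above
            DateTime asDateTime = javaEpoch.AddMilliseconds(javaTime);

            // And back again: total milliseconds elapsed since the epoch.
            long backToJava = (long)(asDateTime - javaEpoch).TotalMilliseconds;

            Console.WriteLine("{0} -> {1:u} -> {2}", javaTime, asDateTime, backToJava);
        }
    }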

    Read the article

  • Christian Radio Locator iPhone app

    - by Tim Hibbard
    For the last three months or so I've been working on an iPhone (and iPad) app in my spare time. It all started when I took the kids to Minneapolis and had a hard time finding radio stations to listen to on the trip. I looked in the App Store for an app that would use my GPS to show me Christian radio stations nearby, but there wasn't one. So I decided to build my own. Using public information from the FCC and a few other sources, I built a database in Google docs that contains the frequency for all Christian radio stations, where the tower is located and how far the tower can reach. I also included any streaming audio information and other contact information like Facebook or Twitter that I could find. Google spreadsheets publish in JSON format (yes, really) and Xcode can automatically deserialize JSON into a properly formatted entity. This is one area that Xcode is far superior to C#. In a just a few lines of code, I can have a list of in-memory strongly typed objects from a web-based JSON feed. To accomplish the same thing natively in .NET would be much more work and wouldn't feel nearly as clean when it was said and done. The snazzy icon shown above was built by my very talented wife. She hasn't yet provided any feedback on the app's user interface, which is why it is so plain and boring. I used a navigation view controller and EGO pull to refresh table view to construct the main window. Pulling down to refresh initiates a GPS lookup, which queries the database for radio stations in range (yes, you can pass parameters to Google spreadsheets and get a subset back in JSON). Pulling up on the table extends the range of the search and includes stations that may not be close enough to get clear audio. This feature is not that intuitive and the next version contains an update to that functionality. Tapping a cell will show a detail view that displays additional information about the station. The user can click to view the station on a map, click to listen to an online stream (if available) or click to see the station's Facebook or Twitter pages. Swiping back and forth on the table changes the information that is displayed on the right hand side of the table cell. It scrolls through the city where the tower is located, how far the phone is from the tower, the range of the tower and in the next version a signal strength indicator. This was pretty easy to implement once I figured out how to assign the gesture recognizer delegate.  Tapping and holding on a cell will jump the user to the map view screen. Which is pretty cool, but very hard for even a power user to discover. To tackle the issue of discoverability, the next version has a series of instructions displayed at the bottom of the screen to show the user the various shortcuts. Once the user has performed the swipes and long holds, the instructions disappear. I've learned a lot developing this app. Spending over a decade exclusively in .NET made the learning curve a bit steep, but once I learned the structure and syntax of Objective-C, I've learned to appreciate the power and simplicity of it. Here are a few screenshots. I would really appreciate any feedback and especially iTunes reviews. Technically it is open source and a smart googler could probably find it. I just haven't promoted it as open source.     Cross posted from timhibbard.com
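    Since the post contrasts Xcode's JSON-to-entity deserialization with what the equivalent would take in .NET, here is a rough C# sketch of that .NET side for comparison; the Station class, its fields, and the feed URL are invented for illustration and are not the app's actual entities.

    using System.Collections.Generic;
    using System.IO;
    using System.Net;
    using System.Runtime.Serialization;
    using System.Runtime.Serialization.Json;

    [DataContract]
    public class Station
    {
        [DataMember(Name = "frequency")] public double Frequency { get; set; }
        [DataMember(Name = "latitude")]  public double Latitude { get; set; }
        [DataMember(Name = "longitude")] public double Longitude { get; set; }
        [DataMember(Name = "stream")]    public string StreamUrl { get; set; }
    }

    public static class StationFeed
    {
        // Download a JSON array of station records and deserialize it into typed objects.
        public static List<Station> Load(string url)
        {
            using (WebClient client = new WebClient())
            using (Stream stream = client.OpenRead(url))
            {
                var serializer = new DataContractJsonSerializer(typeof(List<Station>));
                return (List<Station>)serializer.ReadObject(stream);
            }
        }
    }

    // Usage (the URL is a placeholder, not the app's real feed):
    // List<Station> stations = StationFeed.Load("https://example.com/stations?output=json");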

    Read the article

  • Silverlight Recruiting Application Part 5 - Jobs Module / View

    Now we starting getting into a more code-heavy portion of this series, thankfully though this means the groundwork is all set for the most part and after adding the modules we will have a complete application that can be provided with full source. The Jobs module will have two concerns- adding and maintaining jobs that can then be broadcast out to the website. How they are displayed on the site will be handled by our admin system (which will just poll from this common database), so we aren't too concerned with that, but rather with getting the information into the system and allowing the backend administration/HR users to keep things up to date. Since there is a fair bit of information that we want to display, we're going to move editing to a separate view so we can get all that information in an easy-to-use spot. With all the files created for this module, the project looks something like this: And now... on to the code. XAML for the Job Posting View All we really need for the Job Posting View is a RadGridView and a few buttons. This will let us both show off records and perform operations on the records without much hassle. That XAML is going to look something like this: 01.<Grid x:Name="LayoutRoot" 02.Background="White"> 03.<Grid.RowDefinitions> 04.<RowDefinition Height="30" /> 05.<RowDefinition /> 06.</Grid.RowDefinitions> 07.<StackPanel Orientation="Horizontal"> 08.<Button x:Name="xAddRecordButton" 09.Content="Add Job" 10.Width="120" 11.cal:Click.Command="{Binding AddRecord}" 12.telerik:StyleManager.Theme="Windows7" /> 13.<Button x:Name="xEditRecordButton" 14.Content="Edit Job" 15.Width="120" 16.cal:Click.Command="{Binding EditRecord}" 17.telerik:StyleManager.Theme="Windows7" /> 18.</StackPanel> 19.<telerikGrid:RadGridView x:Name="xJobsGrid" 20.Grid.Row="1" 21.IsReadOnly="True" 22.AutoGenerateColumns="False" 23.ColumnWidth="*" 24.RowDetailsVisibilityMode="VisibleWhenSelected" 25.ItemsSource="{Binding MyJobs}" 26.SelectedItem="{Binding SelectedJob, Mode=TwoWay}" 27.command:SelectedItemChangedEventClass.Command="{Binding SelectedItemChanged}"> 28.<telerikGrid:RadGridView.Columns> 29.<telerikGrid:GridViewDataColumn Header="Job Title" 30.DataMemberBinding="{Binding JobTitle}" 31.UniqueName="JobTitle" /> 32.<telerikGrid:GridViewDataColumn Header="Location" 33.DataMemberBinding="{Binding Location}" 34.UniqueName="Location" /> 35.<telerikGrid:GridViewDataColumn Header="Resume Required" 36.DataMemberBinding="{Binding NeedsResume}" 37.UniqueName="NeedsResume" /> 38.<telerikGrid:GridViewDataColumn Header="CV Required" 39.DataMemberBinding="{Binding NeedsCV}" 40.UniqueName="NeedsCV" /> 41.<telerikGrid:GridViewDataColumn Header="Overview Required" 42.DataMemberBinding="{Binding NeedsOverview}" 43.UniqueName="NeedsOverview" /> 44.<telerikGrid:GridViewDataColumn Header="Active" 45.DataMemberBinding="{Binding IsActive}" 46.UniqueName="IsActive" /> 47.</telerikGrid:RadGridView.Columns> 48.</telerikGrid:RadGridView> 49.</Grid> I'll explain what's happening here by line numbers: Lines 11 and 16: Using the same type of click commands as we saw in the Menu module, we tie the button clicks to delegate commands in the viewmodel. Line 25: The source for the jobs will be a collection in the viewmodel. Line 26: We also bind the selected item to a public property from the viewmodel for use in code. Line 27: We've turned the event into a command so we can handle it via code in the viewmodel. 
So those first three probably make sense to you as far as Silverlight/WPF binding magic is concerned, but for line 27... This actually comes from something I read onDamien Schenkelman's blog back in the day for creating an attached behavior from any event. So, any time you see me using command:Whatever.Command, the backing for it is actually something like this: SelectedItemChangedEventBehavior.cs: 01.public class SelectedItemChangedEventBehavior : CommandBehaviorBase<Telerik.Windows.Controls.DataControl> 02.{ 03.public SelectedItemChangedEventBehavior(DataControl element) 04.: base(element) 05.{ 06.element.SelectionChanged += new EventHandler<SelectionChangeEventArgs>(element_SelectionChanged); 07.} 08.void element_SelectionChanged(object sender, SelectionChangeEventArgs e) 09.{ 10.// We'll only ever allow single selection, so will only need item index 0 11.base.CommandParameter = e.AddedItems[0]; 12.base.ExecuteCommand(); 13.} 14.} SelectedItemChangedEventClass.cs: 01.public class SelectedItemChangedEventClass 02.{ 03.#region The Command Stuff 04.public static ICommand GetCommand(DependencyObject obj) 05.{ 06.return (ICommand)obj.GetValue(CommandProperty); 07.} 08.public static void SetCommand(DependencyObject obj, ICommand value) 09.{ 10.obj.SetValue(CommandProperty, value); 11.} 12.public static readonly DependencyProperty CommandProperty = 13.DependencyProperty.RegisterAttached("Command", typeof(ICommand), 14.typeof(SelectedItemChangedEventClass), new PropertyMetadata(OnSetCommandCallback)); 15.public static void OnSetCommandCallback(DependencyObject dependencyObject, DependencyPropertyChangedEventArgs e) 16.{ 17.DataControl element = dependencyObject as DataControl; 18.if (element != null) 19.{ 20.SelectedItemChangedEventBehavior behavior = GetOrCreateBehavior(element); 21.behavior.Command = e.NewValue as ICommand; 22.} 23.} 24.#endregion 25.public static SelectedItemChangedEventBehavior GetOrCreateBehavior(DataControl element) 26.{ 27.SelectedItemChangedEventBehavior behavior = element.GetValue(SelectedItemChangedEventBehaviorProperty) as SelectedItemChangedEventBehavior; 28.if (behavior == null) 29.{ 30.behavior = new SelectedItemChangedEventBehavior(element); 31.element.SetValue(SelectedItemChangedEventBehaviorProperty, behavior); 32.} 33.return behavior; 34.} 35.public static SelectedItemChangedEventBehavior GetSelectedItemChangedEventBehavior(DependencyObject obj) 36.{ 37.return (SelectedItemChangedEventBehavior)obj.GetValue(SelectedItemChangedEventBehaviorProperty); 38.} 39.public static void SetSelectedItemChangedEventBehavior(DependencyObject obj, SelectedItemChangedEventBehavior value) 40.{ 41.obj.SetValue(SelectedItemChangedEventBehaviorProperty, value); 42.} 43.public static readonly DependencyProperty SelectedItemChangedEventBehaviorProperty = 44.DependencyProperty.RegisterAttached("SelectedItemChangedEventBehavior", 45.typeof(SelectedItemChangedEventBehavior), typeof(SelectedItemChangedEventClass), null); 46.} These end up looking very similar from command to command, but in a nutshell you create a command based on any event, determine what the parameter for it will be, then execute. It attaches via XAML and ties to a DelegateCommand in the viewmodel, so you get the full event experience (since some controls get a bit event-rich for added functionality). Simple enough, right? 
Viewmodel for the Job Posting View The Viewmodel is going to need to handle all events going back and forth, maintaining interactions with the data we are using, and both publishing and subscribing to events. Rather than breaking this into tons of little pieces, I'll give you a nice view of the entire viewmodel and then hit up the important points line-by-line: 001.public class JobPostingViewModel : ViewModelBase 002.{ 003.private readonly IEventAggregator eventAggregator; 004.private readonly IRegionManager regionManager; 005.public DelegateCommand<object> AddRecord { get; set; } 006.public DelegateCommand<object> EditRecord { get; set; } 007.public DelegateCommand<object> SelectedItemChanged { get; set; } 008.public RecruitingContext context; 009.private QueryableCollectionView _myJobs; 010.public QueryableCollectionView MyJobs 011.{ 012.get { return _myJobs; } 013.} 014.private QueryableCollectionView _selectionJobActionHistory; 015.public QueryableCollectionView SelectedJobActionHistory 016.{ 017.get { return _selectionJobActionHistory; } 018.} 019.private JobPosting _selectedJob; 020.public JobPosting SelectedJob 021.{ 022.get { return _selectedJob; } 023.set 024.{ 025.if (value != _selectedJob) 026.{ 027._selectedJob = value; 028.NotifyChanged("SelectedJob"); 029.} 030.} 031.} 032.public SubscriptionToken editToken = new SubscriptionToken(); 033.public SubscriptionToken addToken = new SubscriptionToken(); 034.public JobPostingViewModel(IEventAggregator eventAgg, IRegionManager regionmanager) 035.{ 036.// set Unity items 037.this.eventAggregator = eventAgg; 038.this.regionManager = regionmanager; 039.// load our context 040.context = new RecruitingContext(); 041.this._myJobs = new QueryableCollectionView(context.JobPostings); 042.context.Load(context.GetJobPostingsQuery()); 043.// set command events 044.this.AddRecord = new DelegateCommand<object>(this.AddNewRecord); 045.this.EditRecord = new DelegateCommand<object>(this.EditExistingRecord); 046.this.SelectedItemChanged = new DelegateCommand<object>(this.SelectedRecordChanged); 047.SetSubscriptions(); 048.} 049.#region DelegateCommands from View 050.public void AddNewRecord(object obj) 051.{ 052.this.eventAggregator.GetEvent<AddJobEvent>().Publish(true); 053.} 054.public void EditExistingRecord(object obj) 055.{ 056.if (_selectedJob == null) 057.{ 058.this.eventAggregator.GetEvent<NotifyUserEvent>().Publish("No job selected."); 059.} 060.else 061.{ 062.this._myJobs.EditItem(this._selectedJob); 063.this.eventAggregator.GetEvent<EditJobEvent>().Publish(this._selectedJob); 064.} 065.} 066.public void SelectedRecordChanged(object obj) 067.{ 068.if (obj.GetType() == typeof(ActionHistory)) 069.{ 070.// event bubbles up so we don't catch items from the ActionHistory grid 071.} 072.else 073.{ 074.JobPosting job = obj as JobPosting; 075.GrabHistory(job.PostingID); 076.} 077.} 078.#endregion 079.#region Subscription Declaration and Events 080.public void SetSubscriptions() 081.{ 082.EditJobCompleteEvent editComplete = eventAggregator.GetEvent<EditJobCompleteEvent>(); 083.if (editToken != null) 084.editComplete.Unsubscribe(editToken); 085.editToken = editComplete.Subscribe(this.EditCompleteEventHandler); 086.AddJobCompleteEvent addComplete = eventAggregator.GetEvent<AddJobCompleteEvent>(); 087.if (addToken != null) 088.addComplete.Unsubscribe(addToken); 089.addToken = addComplete.Subscribe(this.AddCompleteEventHandler); 090.} 091.public void EditCompleteEventHandler(bool complete) 092.{ 093.if (complete) 094.{ 095.JobPosting thisJob = 
_myJobs.CurrentEditItem as JobPosting; 096.this._myJobs.CommitEdit(); 097.this.context.SubmitChanges((s) => 098.{ 099.ActionHistory myAction = new ActionHistory(); 100.myAction.PostingID = thisJob.PostingID; 101.myAction.Description = String.Format("Job '{0}' has been edited by {1}", thisJob.JobTitle, "default user"); 102.myAction.TimeStamp = DateTime.Now; 103.eventAggregator.GetEvent<AddActionEvent>().Publish(myAction); 104.} 105., null); 106.} 107.else 108.{ 109.this._myJobs.CancelEdit(); 110.} 111.this.MakeMeActive(this.regionManager, "MainRegion", "JobPostingsView"); 112.} 113.public void AddCompleteEventHandler(JobPosting job) 114.{ 115.if (job == null) 116.{ 117.// do nothing, new job add cancelled 118.} 119.else 120.{ 121.this.context.JobPostings.Add(job); 122.this.context.SubmitChanges((s) => 123.{ 124.ActionHistory myAction = new ActionHistory(); 125.myAction.PostingID = job.PostingID; 126.myAction.Description = String.Format("Job '{0}' has been added by {1}", job.JobTitle, "default user"); 127.myAction.TimeStamp = DateTime.Now; 128.eventAggregator.GetEvent<AddActionEvent>().Publish(myAction); 129.} 130., null); 131.} 132.this.MakeMeActive(this.regionManager, "MainRegion", "JobPostingsView"); 133.} 134.#endregion 135.public void GrabHistory(int postID) 136.{ 137.context.ActionHistories.Clear(); 138._selectionJobActionHistory = new QueryableCollectionView(context.ActionHistories); 139.context.Load(context.GetHistoryForJobQuery(postID)); 140.} Taking it from the top, we're injecting an Event Aggregator and Region Manager for use down the road and also have the public DelegateCommands (just like in the Menu module). We also grab a reference to our context, which we'll obviously need for data, then set up a few fields with public properties tied to them. We're also setting subscription tokens, which we have not yet seen but I will get into below. The AddNewRecord (50) and EditExistingRecord (54) methods should speak for themselves for functionality, the one thing of note is we're sending events off to the Event Aggregator which some module, somewhere will take care of. Since these aren't entirely relying on one another, the Jobs View doesn't care if anyone is listening, but it will publish AddJobEvent (52), NotifyUserEvent (58) and EditJobEvent (63)regardless. Don't mind the GrabHistory() method so much, that is just grabbing history items (visibly being created in the SubmitChanges callbacks), and adding them to the database. Every action will trigger a history event, so we'll know who modified what and when, just in case. ;) So where are we at? Well, if we click to Add a job, we publish an event, if we edit a job, we publish an event with the selected record (attained through the magic of binding). Where is this all going though? To the Viewmodel, of course! 
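The event types themselves never show up in these listings; they are presumably just empty Prism event classes sitting in the shared Infrastructure project. A minimal sketch of what they would look like, with payload types inferred from the Publish calls above and assuming Prism v2 / the Composite Application Library (which matches the cal: namespace used in the XAML):
using System.Collections.Generic;
using Microsoft.Practices.Composite.Presentation.Events;

// Payloads inferred from how each event is published and handled in the viewmodels.
public class AddJobEvent : CompositePresentationEvent<bool> { }
public class EditJobEvent : CompositePresentationEvent<JobPosting> { }
public class NotifyUserEvent : CompositePresentationEvent<string> { }
public class AddJobCompleteEvent : CompositePresentationEvent<JobPosting> { }
public class EditJobCompleteEvent : CompositePresentationEvent<bool> { }
public class AddActionEvent : CompositePresentationEvent<ActionHistory> { }
public class DisplayValidationErrorsEvent : CompositePresentationEvent<List<string>> { }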
XAML for the AddEditJobView This is pretty straightforward except for one thing, noted below: 001.<Grid x:Name="LayoutRoot" 002.Background="White"> 003.<Grid x:Name="xEditGrid" 004.Margin="10" 005.validationHelper:ValidationScope.Errors="{Binding Errors}"> 006.<Grid.Background> 007.<LinearGradientBrush EndPoint="0.5,1" 008.StartPoint="0.5,0"> 009.<GradientStop Color="#FFC7C7C7" 010.Offset="0" /> 011.<GradientStop Color="#FFF6F3F3" 012.Offset="1" /> 013.</LinearGradientBrush> 014.</Grid.Background> 015.<Grid.RowDefinitions> 016.<RowDefinition Height="40" /> 017.<RowDefinition Height="40" /> 018.<RowDefinition Height="40" /> 019.<RowDefinition Height="100" /> 020.<RowDefinition Height="100" /> 021.<RowDefinition Height="100" /> 022.<RowDefinition Height="40" /> 023.<RowDefinition Height="40" /> 024.<RowDefinition Height="40" /> 025.</Grid.RowDefinitions> 026.<Grid.ColumnDefinitions> 027.<ColumnDefinition Width="150" /> 028.<ColumnDefinition Width="150" /> 029.<ColumnDefinition Width="300" /> 030.<ColumnDefinition Width="100" /> 031.</Grid.ColumnDefinitions> 032.<!-- Title --> 033.<TextBlock Margin="8" 034.Text="{Binding AddEditString}" 035.TextWrapping="Wrap" 036.Grid.Column="1" 037.Grid.ColumnSpan="2" 038.FontSize="16" /> 039.<!-- Data entry area--> 040. 041.<TextBlock Margin="8,0,0,0" 042.Style="{StaticResource LabelTxb}" 043.Grid.Row="1" 044.Text="Job Title" 045.VerticalAlignment="Center" /> 046.<TextBox x:Name="xJobTitleTB" 047.Margin="0,8" 048.Grid.Column="1" 049.Grid.Row="1" 050.Text="{Binding activeJob.JobTitle, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 051.Grid.ColumnSpan="2" /> 052.<TextBlock Margin="8,0,0,0" 053.Grid.Row="2" 054.Text="Location" 055.d:LayoutOverrides="Height" 056.VerticalAlignment="Center" /> 057.<TextBox x:Name="xLocationTB" 058.Margin="0,8" 059.Grid.Column="1" 060.Grid.Row="2" 061.Text="{Binding activeJob.Location, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 062.Grid.ColumnSpan="2" /> 063. 064.<TextBlock Margin="8,11,8,0" 065.Grid.Row="3" 066.Text="Description" 067.TextWrapping="Wrap" 068.VerticalAlignment="Top" /> 069. 070.<TextBox x:Name="xDescriptionTB" 071.Height="84" 072.TextWrapping="Wrap" 073.ScrollViewer.VerticalScrollBarVisibility="Auto" 074.Grid.Column="1" 075.Grid.Row="3" 076.Text="{Binding activeJob.Description, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 077.Grid.ColumnSpan="2" /> 078.<TextBlock Margin="8,11,8,0" 079.Grid.Row="4" 080.Text="Requirements" 081.TextWrapping="Wrap" 082.VerticalAlignment="Top" /> 083. 084.<TextBox x:Name="xRequirementsTB" 085.Height="84" 086.TextWrapping="Wrap" 087.ScrollViewer.VerticalScrollBarVisibility="Auto" 088.Grid.Column="1" 089.Grid.Row="4" 090.Text="{Binding activeJob.Requirements, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 091.Grid.ColumnSpan="2" /> 092.<TextBlock Margin="8,11,8,0" 093.Grid.Row="5" 094.Text="Qualifications" 095.TextWrapping="Wrap" 096.VerticalAlignment="Top" /> 097. 098.<TextBox x:Name="xQualificationsTB" 099.Height="84" 100.TextWrapping="Wrap" 101.ScrollViewer.VerticalScrollBarVisibility="Auto" 102.Grid.Column="1" 103.Grid.Row="5" 104.Text="{Binding activeJob.Qualifications, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}" 105.Grid.ColumnSpan="2" /> 106.<!-- Requirements Checkboxes--> 107. 
108.<CheckBox x:Name="xResumeRequiredCB" Margin="8,8,8,15" 109.Content="Resume Required" 110.Grid.Row="6" 111.Grid.ColumnSpan="2" 112.IsChecked="{Binding activeJob.NeedsResume, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/> 113. 114.<CheckBox x:Name="xCoverletterRequiredCB" Margin="8,8,8,15" 115.Content="Cover Letter Required" 116.Grid.Column="2" 117.Grid.Row="6" 118.IsChecked="{Binding activeJob.NeedsCV, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/> 119. 120.<CheckBox x:Name="xOverviewRequiredCB" Margin="8,8,8,15" 121.Content="Overview Required" 122.Grid.Row="7" 123.Grid.ColumnSpan="2" 124.IsChecked="{Binding activeJob.NeedsOverview, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/> 125. 126.<CheckBox x:Name="xJobActiveCB" Margin="8,8,8,15" 127.Content="Job is Active" 128.Grid.Column="2" 129.Grid.Row="7" 130.IsChecked="{Binding activeJob.IsActive, Mode=TwoWay, NotifyOnValidationError=True, ValidatesOnExceptions=True}"/> 131. 132.<!-- Buttons --> 133. 134.<Button x:Name="xAddEditButton" Margin="8,8,0,10" 135.Content="{Binding AddEditButtonString}" 136.cal:Click.Command="{Binding AddEditCommand}" 137.Grid.Column="2" 138.Grid.Row="8" 139.HorizontalAlignment="Left" 140.Width="125" 141.telerik:StyleManager.Theme="Windows7" /> 142. 143.<Button x:Name="xCancelButton" HorizontalAlignment="Right" 144.Content="Cancel" 145.cal:Click.Command="{Binding CancelCommand}" 146.Margin="0,8,8,10" 147.Width="125" 148.Grid.Column="2" 149.Grid.Row="8" 150.telerik:StyleManager.Theme="Windows7" /> 151.</Grid> 152.</Grid> The 'validationHelper:ValidationScope' line may seem odd. This is a handy little trick for catching current and would-be validation errors when working in this whole setup. This all comes from an approach found on theJoy Of Code blog, although it looks like the story for this will be changing slightly with new advances in SL4/WCF RIA Services, so this section can definitely get an overhaul a little down the road. The code is the fun part of all this, so let us see what's happening under the hood. Viewmodel for the AddEditJobView We are going to see some of the same things happening here, so I'll skip over the repeat info and get right to the good stuff: 001.public class AddEditJobViewModel : ViewModelBase 002.{ 003.private readonly IEventAggregator eventAggregator; 004.private readonly IRegionManager regionManager; 005. 006.public RecruitingContext context; 007. 008.private JobPosting _activeJob; 009.public JobPosting activeJob 010.{ 011.get { return _activeJob; } 012.set 013.{ 014.if (_activeJob != value) 015.{ 016._activeJob = value; 017.NotifyChanged("activeJob"); 018.} 019.} 020.} 021. 022.public bool isNewJob; 023. 024.private string _addEditString; 025.public string AddEditString 026.{ 027.get { return _addEditString; } 028.set 029.{ 030.if (_addEditString != value) 031.{ 032._addEditString = value; 033.NotifyChanged("AddEditString"); 034.} 035.} 036.} 037. 038.private string _addEditButtonString; 039.public string AddEditButtonString 040.{ 041.get { return _addEditButtonString; } 042.set 043.{ 044.if (_addEditButtonString != value) 045.{ 046._addEditButtonString = value; 047.NotifyChanged("AddEditButtonString"); 048.} 049.} 050.} 051. 052.public SubscriptionToken addJobToken = new SubscriptionToken(); 053.public SubscriptionToken editJobToken = new SubscriptionToken(); 054. 055.public DelegateCommand<object> AddEditCommand { get; set; } 056.public DelegateCommand<object> CancelCommand { get; set; } 057. 
058.private ObservableCollection<ValidationError> _errors = new ObservableCollection<ValidationError>(); 059.public ObservableCollection<ValidationError> Errors 060.{ 061.get { return _errors; } 062.} 063. 064.private ObservableCollection<ValidationResult> _valResults = new ObservableCollection<ValidationResult>(); 065.public ObservableCollection<ValidationResult> ValResults 066.{ 067.get { return this._valResults; } 068.} 069. 070.public AddEditJobViewModel(IEventAggregator eventAgg, IRegionManager regionmanager) 071.{ 072.// set Unity items 073.this.eventAggregator = eventAgg; 074.this.regionManager = regionmanager; 075. 076.context = new RecruitingContext(); 077. 078.AddEditCommand = new DelegateCommand<object>(this.AddEditJobCommand); 079.CancelCommand = new DelegateCommand<object>(this.CancelAddEditCommand); 080. 081.SetSubscriptions(); 082.} 083. 084.#region Subscription Declaration and Events 085. 086.public void SetSubscriptions() 087.{ 088.AddJobEvent addJob = this.eventAggregator.GetEvent<AddJobEvent>(); 089. 090.if (addJobToken != null) 091.addJob.Unsubscribe(addJobToken); 092. 093.addJobToken = addJob.Subscribe(this.AddJobEventHandler); 094. 095.EditJobEvent editJob = this.eventAggregator.GetEvent<EditJobEvent>(); 096. 097.if (editJobToken != null) 098.editJob.Unsubscribe(editJobToken); 099. 100.editJobToken = editJob.Subscribe(this.EditJobEventHandler); 101.} 102. 103.public void AddJobEventHandler(bool isNew) 104.{ 105.this.activeJob = null; 106.this.activeJob = new JobPosting(); 107.this.activeJob.IsActive = true; // We assume that we want a new job to go up immediately 108.this.isNewJob = true; 109.this.AddEditString = "Add New Job Posting"; 110.this.AddEditButtonString = "Add Job"; 111. 112.MakeMeActive(this.regionManager, "MainRegion", "AddEditJobView"); 113.} 114. 115.public void EditJobEventHandler(JobPosting editJob) 116.{ 117.this.activeJob = null; 118.this.activeJob = editJob; 119.this.isNewJob = false; 120.this.AddEditString = "Edit Job Posting"; 121.this.AddEditButtonString = "Edit Job"; 122. 123.MakeMeActive(this.regionManager, "MainRegion", "AddEditJobView"); 124.} 125. 126.#endregion 127. 128.#region DelegateCommands from View 129. 130.public void AddEditJobCommand(object obj) 131.{ 132.if (this.Errors.Count > 0) 133.{ 134.List<string> errorMessages = new List<string>(); 135. 136.foreach (var valR in this.Errors) 137.{ 138.errorMessages.Add(valR.Exception.Message); 139.} 140. 141.this.eventAggregator.GetEvent<DisplayValidationErrorsEvent>().Publish(errorMessages); 142. 143.} 144.else if (!Validator.TryValidateObject(this.activeJob, new ValidationContext(this.activeJob, null, null), _valResults, true)) 145.{ 146.List<string> errorMessages = new List<string>(); 147. 148.foreach (var valR in this._valResults) 149.{ 150.errorMessages.Add(valR.ErrorMessage); 151.} 152. 153.this._valResults.Clear(); 154. 155.this.eventAggregator.GetEvent<DisplayValidationErrorsEvent>().Publish(errorMessages); 156.} 157.else 158.{ 159.if (this.isNewJob) 160.{ 161.this.eventAggregator.GetEvent<AddJobCompleteEvent>().Publish(this.activeJob); 162.} 163.else 164.{ 165.this.eventAggregator.GetEvent<EditJobCompleteEvent>().Publish(true); 166.} 167.} 168.} 169. 170.public void CancelAddEditCommand(object obj) 171.{ 172.if (this.isNewJob) 173.{ 174.this.eventAggregator.GetEvent<AddJobCompleteEvent>().Publish(null); 175.} 176.else 177.{ 178.this.eventAggregator.GetEvent<EditJobCompleteEvent>().Publish(false); 179.} 180.} 181. 
182.#endregion 183.} 184.} We start seeing something new on line 103- the AddJobEventHandler will create a new job and set that to the activeJob item on the ViewModel. When this is all set, the view calls that familiar MakeMeActive method to activate itself. I made a bit of a management call on making views self-activate like this, but I figured it works for one reason. As I create this application, views may not exist that I have in mind, so after a view receives its 'ping' from being subscribed to an event, it prepares whatever it needs to do and then goes active. This way if I don't have 'edit' hooked up, I can click as the day is long on the main view and won't get lost in an empty region. Total personal preference here. :) Everything else should again be pretty straightforward, although I do a bit of validation checking in the AddEditJobCommand, which can either fire off an event back to the main view/viewmodel if everything is a success or sent a list of errors to our notification module, which pops open a RadWindow with the alerts if any exist. As a bonus side note, here's what my WCF RIA Services metadata looks like for handling all of the validation: private JobPostingMetadata() { } [StringLength(2500, ErrorMessage = "Description should be more than one and less than 2500 characters.", MinimumLength = 1)] [Required(ErrorMessage = "Description is required.")] public string Description; [Required(ErrorMessage="Active Status is Required")] public bool IsActive; [StringLength(100, ErrorMessage = "Posting title must be more than 3 but less than 100 characters.", MinimumLength = 3)] [Required(ErrorMessage = "Job Title is required.")] public bool JobTitle; [Required] public string Location; public bool NeedsCV; public bool NeedsOverview; public bool NeedsResume; public int PostingID; [Required(ErrorMessage="Qualifications are required.")] [StringLength(2500, ErrorMessage="Qualifications should be more than one and less than 2500 characters.", MinimumLength=1)] public string Qualifications; [StringLength(2500, ErrorMessage = "Requirements should be more than one and less than 2500 characters.", MinimumLength = 1)] [Required(ErrorMessage="Requirements are required.")] public string Requirements;   The RecruitCB Alternative See all that Xaml I pasted above? Those are now two pieces sitting in the JobsView.xaml file now. The only real difference is that the xEditGrid now sits in the same place as xJobsGrid, with visibility swapping out between the two for a quick switch. I also took out all the cal: and command: command references and replaced Button events with clicks and the Grid selection command replaced with a SelectedItemChanged event. Also, at the bottom of the xEditGrid after the last button, I add a ValidationSummary (with Visibility=Collapsed) to catch any errors that are popping up. 
Simple as can be, and leads to this being the single code-behind file: 001.public partial class JobsView : UserControl 002.{ 003.public RecruitingContext context; 004.public JobPosting activeJob; 005.public bool isNew; 006.private ObservableCollection<ValidationResult> _valResults = new ObservableCollection<ValidationResult>(); 007.public ObservableCollection<ValidationResult> ValResults 008.{ 009.get { return this._valResults; } 010.} 011.public JobsView() 012.{ 013.InitializeComponent(); 014.this.Loaded += new RoutedEventHandler(JobsView_Loaded); 015.} 016.void JobsView_Loaded(object sender, RoutedEventArgs e) 017.{ 018.context = new RecruitingContext(); 019.xJobsGrid.ItemsSource = context.JobPostings; 020.context.Load(context.GetJobPostingsQuery()); 021.} 022.private void xAddRecordButton_Click(object sender, RoutedEventArgs e) 023.{ 024.activeJob = new JobPosting(); 025.isNew = true; 026.xAddEditTitle.Text = "Add a Job Posting"; 027.xAddEditButton.Content = "Add"; 028.xEditGrid.DataContext = activeJob; 029.HideJobsGrid(); 030.} 031.private void xEditRecordButton_Click(object sender, RoutedEventArgs e) 032.{ 033.activeJob = xJobsGrid.SelectedItem as JobPosting; 034.isNew = false; 035.xAddEditTitle.Text = "Edit a Job Posting"; 036.xAddEditButton.Content = "Edit"; 037.xEditGrid.DataContext = activeJob; 038.HideJobsGrid(); 039.} 040.private void xAddEditButton_Click(object sender, RoutedEventArgs e) 041.{ 042.if (!Validator.TryValidateObject(this.activeJob, new ValidationContext(this.activeJob, null, null), _valResults, true)) 043.{ 044.List<string> errorMessages = new List<string>(); 045.foreach (var valR in this._valResults) 046.{ 047.errorMessages.Add(valR.ErrorMessage); 048.} 049.this._valResults.Clear(); 050.ShowErrors(errorMessages); 051.} 052.else if (xSummary.Errors.Count > 0) 053.{ 054.List<string> errorMessages = new List<string>(); 055.foreach (var err in xSummary.Errors) 056.{ 057.errorMessages.Add(err.Message); 058.} 059.ShowErrors(errorMessages); 060.} 061.else 062.{ 063.if (this.isNew) 064.{ 065.context.JobPostings.Add(activeJob); 066.context.SubmitChanges((s) => 067.{ 068.ActionHistory thisAction = new ActionHistory(); 069.thisAction.PostingID = activeJob.PostingID; 070.thisAction.Description = String.Format("Job '{0}' has been edited by {1}", activeJob.JobTitle, "default user"); 071.thisAction.TimeStamp = DateTime.Now; 072.context.ActionHistories.Add(thisAction); 073.context.SubmitChanges(); 074.}, null); 075.} 076.else 077.{ 078.context.SubmitChanges((s) => 079.{ 080.ActionHistory thisAction = new ActionHistory(); 081.thisAction.PostingID = activeJob.PostingID; 082.thisAction.Description = String.Format("Job '{0}' has been added by {1}", activeJob.JobTitle, "default user"); 083.thisAction.TimeStamp = DateTime.Now; 084.context.ActionHistories.Add(thisAction); 085.context.SubmitChanges(); 086.}, null); 087.} 088.ShowJobsGrid(); 089.} 090.} 091.private void xCancelButton_Click(object sender, RoutedEventArgs e) 092.{ 093.ShowJobsGrid(); 094.} 095.private void ShowJobsGrid() 096.{ 097.xAddEditRecordButtonPanel.Visibility = Visibility.Visible; 098.xEditGrid.Visibility = Visibility.Collapsed; 099.xJobsGrid.Visibility = Visibility.Visible; 100.} 101.private void HideJobsGrid() 102.{ 103.xAddEditRecordButtonPanel.Visibility = Visibility.Collapsed; 104.xJobsGrid.Visibility = Visibility.Collapsed; 105.xEditGrid.Visibility = Visibility.Visible; 106.} 107.private void ShowErrors(List<string> errorList) 108.{ 109.string nm = "Errors received: \n"; 110.foreach (string anerror in 
errorList) 111.nm += anerror + "\n"; 112.RadWindow.Alert(nm); 113.} 114.} The first 39 lines should be pretty familiar, not doing anything too unorthodox to get this up and running. Once we hit the xAddEditButton_Click on line 40, we're still doing pretty much the same things except instead of checking the ValidationHelper errors, we both run a check on the current activeJob object as well as check the ValidationSummary errors list. Once that is set, we again use the callback of context.SubmitChanges (lines 68 and 78) to create an ActionHistory which we will use to track these items down the line. That's all? Essentially... yes. If you look back through this post, most of the code and adventures we have taken were just to get things working in the MVVM/Prism setup. Since I have the whole 'module' self-contained in a single JobView+code-behind setup, I don't have to worry about things like sending events off into space for someone to pick up, communicating through an Infrastructure project, or even re-inventing events to be used with attached behaviors. Everything just kinda works, and again with much less code. Here's a picture of the MVVM and Code-behind versions on the Jobs and AddEdit views, but since the functionality is the same in both apps you still cannot tell them apart (for two-strike): Looking ahead, the Applicants module is effectively the same thing as the Jobs module, so most of the code is being cut-and-pasted back and forth with minor tweaks here and there. So that one is being taken care of by me behind the scenes. Next time, we get into a new world of fun- the interview scheduling module, which will pull from available jobs and applicants for each interview being scheduled, tying everything together with RadScheduler to the rescue. Did you know that DotNetSlackers also publishes .net articles written by top known .net Authors? We already have over 80 articles in several categories including Silverlight. Take a look: here.

    Read the article

  • Backing up my Windows Home Server to the Cloud…

    - by eddraper
    Ok, here’s my scenario: a Windows Home Server with a little over 3TB of storage.  This includes many years of our home network’s PC backups, music, videos, etcetera. I’d like to get a backup off-site, and the existing APIs and apps such as the CloudBerry Labs WHS Backup service make it easy.  Now it all comes down to the vendor and the cost of the actual storage.  So, I thought I’d take a lazy Saturday morning, do some research on this, and get the ball rolling.  What I discovered stunned me…
    First off, the pricing for just about everything was loaded with complexity.  I learned that it wasn’t just about storage… it was about network usage, requests, sites, replication, and on and on. I really don’t see this as rocket science.  I have a disk image.  I want to put it in the cloud.  I’m not going to be using it but once daily for incremental backups.  Sounds like a common scenario.  Yes, if “things get real” and my server goes down, I will need to bring down a lot of data and utilize a fair amount of vendor infrastructure.  However, this may never happen.  Offsite storage is an insurance policy.
    The complexity of the cost structures, perhaps by design, creates an environment where it’s incredibly hard to model bottom-line costs and compare vendor all-up pricing.  As it is a “lazy Saturday morning,” I’m not in the mood for such antics, so I shirked that endeavor entirely, fired up calc.exe, and did a simple arithmetic model based on price per GB.  I shuddered at the results.  Certainly something was wrong… did I misplace a decimal point?  Then I discovered CloudBerry’s own calculator.  Nope, I hadn’t misplaced those decimals after all.  Check it out (pricing based on 3174 GB):
    Amazon S3: $398.00 per month, $4,776.00 per year
    Azure: $396.75 per month, $4,761.00 per year
    Google: $380.88 per month, $4,570.56 per year
    Conclusion: Rampant crack smoking at vendors.  Seriously.  Out. Of. Their. Minds. Now, to Amazon’s credit, vision, and outright common sense, they had one offering which directly addresses my scenario:
    Amazon Glacier: $31.74 per month, $380.88 per year
    hmmm… It’s on the table.  Let’s see what it would cost to just buy some drives and an enclosure and cart them over to a friend’s house.
    2 x 2TB drives from NewEgg.com: $199.99
    Enclosure: $39.99
    Total: $239.98
    Carting the drives back and forth to a friend’s house within walking distance: pain
    Leaving the drive unplugged at the friend’s: $0 for electricity
    Possible data loss: no way I can come and go every day
    I think I’ll think on this a bit more…
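    The napkin math above boils down to a simple per-GB model. Here is a quick sketch of that arithmetic; the per-GB rates are back-calculated from the totals above, so treat them as illustrative rather than as anyone's current price sheet:
    // Rough per-GB model behind the numbers above; rates are approximations
    // inferred from the quoted monthly totals, not live vendor pricing.
    class BackupCostModel
    {
        const double SizeGb = 3174;

        static double Monthly(double perGbPerMonth)
        {
            return SizeGb * perGbPerMonth;
        }

        static void Main()
        {
            double glacierMonthly = Monthly(0.01);    // ~ $31.74
            double s3ClassMonthly = Monthly(0.125);   // ~ $396.75 (S3/Azure/Google territory)

            System.Console.WriteLine("Glacier:  {0:C}/month, {1:C}/year", glacierMonthly, glacierMonthly * 12);
            System.Console.WriteLine("S3-class: {0:C}/month, {1:C}/year", s3ClassMonthly, s3ClassMonthly * 12);
        }
    }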

    Read the article

  • Dude, what’s up with POP Forums vNext?

    - by Jeff
    Yeah, it has been awhile. I posted v9.2 back in January, about five months ago. That’s a real change from the release pace I had there for awhile. Let me explain what’s going on. First off, in the interim, I re-launched CoasterBuzz, which required a lot of my attention for about two of those months. That’s a good thing though, because that site is just about the best test bed I could ask for. The other thing is that I committed to make the next version use ASP.NET MVC 4, which is now at the RC stage. I didn’t think much about when they’d hit their RTW point, but RC is good enough for me. To that end, there is enough change in the next version that I recently decided to make it a major version upgrade, and finish up the loose ends and science projects to make it whole. Here’s what’s in store… Mobile views: I sat on this or a long time. Originally, I was going to use jQuery Mobile, and waited and waited for a new release, but in the end, decided against using it. Sometimes buttons would unexplainably not work, I felt like I was fighting it at times, and the CSS just felt too heavy. I rolled my own mobile sugar at a fraction of the size, and I think you’ll find it easy to modify. And it’s Metro-y, of course! Re-do of background services: A number of things run in the background, and I did quite a bit of “reimagining” of that code. It’s the weirdness of running services in a Web site context, because so many folks can’t run a bona fide service on their host’s box. The biggest change here is that these service no longer start up by default. You’ll need to call a new method from global.asax called PopForumsActivation.StartServices(). This is also a precursor to running the app in a Web farm (new data layer and caching is the second part of that). I learned about this the hard way when I had three apps using the forum library code but only one was actually the forum. The services were all running three times as often with race conditions and hits on the same data. That was particularly bad for e-mail. CSS clean up: It’s still not ideal, but it’s getting better. That’s one of those things that comes with integrating to a real site… you discover all of the dumb things you did. The mobile CSS is particularly easier to live with. Bug fixes: There are a whole lot of them. Most were minor, but it’s feeling pretty solid now. So that’s where I am. I’m going to call it v10.0, and I’m going to really put forth some effort toward finishing the mobile experience and getting through the remaining bugs. The roadmap beyond that will likely not be feature oriented, but rather work on some other things, like making it run in Azure, perhaps using SQL CE, a better install experience, etc. As usual, I’ll post the latest here. Stay tuned!
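    For anyone upgrading, the StartServices() call mentioned above ends up being a one-liner in global.asax. A sketch of the wiring (the PopForums namespace is an assumption on my part; PopForumsActivation.StartServices() is the method named in the post):
    using System;
    using PopForums;   // assumed namespace for the forum library

    public class Global : System.Web.HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            // v10 no longer starts the background services by default,
            // so kick them off once when the application starts.
            PopForumsActivation.StartServices();
        }
    }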

    Read the article

  • SQL SERVER – Understanding XML – Contest Win Joes 2 Pros Combo (USD 198) – Day 5 of 5

    - by pinaldave
    August 2011 we ran a contest where every day we give away one book for an entire month. The contest had extreme success. Lots of people participated and lots of give away. I have received lots of questions if we are doing something similar this month. Absolutely, instead of running a contest a month long we are doing something more interesting. We are giving away USD 198 worth gift every day for this week. We are giving away Joes 2 Pros 5 Volumes (BOOK) SQL 2008 Development Certification Training Kit every day. One copy in India and One in USA. Total 2 of the giveaway (worth USD 198). All the gifts are sponsored from the Koenig Training Solution and Joes 2 Pros. The books are available here Amazon | Flipkart | Indiaplaza How to Win: Read the Question Read the Hints Answer the Quiz in Contact Form in following format Question Answer Name of the country (The contest is open for USA and India residents only) 2 Winners will be randomly selected announced on August 20th. Question of the Day: Is following XML a well formed XML Document? <?xml version=”1.0″?> <address> <firstname>Pinal</firstname> <lastname>Dave</lastname> <title>Founder</title> <company>SQLAuthority.com</company> </address> a) Yes b) No c) I do not know Query Hints: BIG HINT POST A common observation by people seeing an XML file for the first time is that it looks like just a bunch of data inside a text file. XML files are text-based documents, which makes them easy to read.  All of the data is literally spelled out in the document and relies on a just a few characters (<, >, =) to convey relationships and structure of the data.  XML files can be used by any commonly available text editor, like Notepad. Much like a book’s Table of Contents, your first glance at well-formed XML will tell you the subject matter of the data and its general structure. Hints appearing within the data help you to quickly identify the main theme (similar to book’s subject), its headers (similar to chapter titles or sections of a book), data elements (similar to a book’s characters or chief topics), and so forth. We’ll learn to recognize and use the structural “hints,” which are XML’s markup components (e.g., XML tags, root elements). The XML Raw and Auto modes are great for displaying data as all attributes or all elements – but not both at once. If you want your XML stream to have some of its data shown in attributes and some shown as elements, then you can use the XML Path mode. If you are using an XML Path stream, then by default all values will be shown as elements. However, it is possible to pick one or more elements to be shown with an attribute(s) as well. Additional Hints: I have previously discussed various concepts from SQL Server Joes 2 Pros Volume 5. SQL Joes 2 Pros Development Series – OpenXML Options SQL Joes 2 Pros Development Series – Preparing XML in Memory SQL Joes 2 Pros Development Series – Shredding XML SQL Joes 2 Pros Development Series – Using Root With Auto XML Mode SQL Joes 2 Pros Development Series – Using Root With Auto XML Mode SQL Joes 2 Pros Development Series – What is XML? SQL Joes 2 Pros Development Series – What is XML? – 2 Next Step: Answer the Quiz in Contact Form in following format Question - Answer Name of the country (The contest is open for USA and India) Bonus Winner Leave a comment with your favorite article from the “additional hints” section and you may be eligible for surprise gift. There is no country restriction for this Bonus Contest. 
Do mention why you liked any particular blog post, and I will announce the winner of the same along with the main contest. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Building Extensions Using E-Business Suite SDK for Java

    - by Sara Woodhull
    We’ve just released Version 2.0.1 of Oracle E-Business Suite SDK for Java.  This new version has several great enhancements added after I wrote about the first version of the SDK in 2010.  In addition to the AppsDataSource and Java Authentication and Authorization Service (JAAS) features that are in the first version, the Oracle E-Business Suite SDK for Java now provides: Session management APIs, so you can share session information with Oracle E-Business Suite Setup script for UNIX/Linux for AppsDataSource and JAAS on Oracle WebLogic Server APIs for Message Dictionary, User Profiles, and NLS Javadoc for the APIs (included with the patch) Enhanced documentation included with Note 974949.1 These features can be used with either Release 11i or Release 12.  References AppsDataSource, Java Authentication and Authorization Service, and Utilities for Oracle E-Business Suite (Note 974949.1) FAQ for Integration of Oracle E-Business Suite and Oracle Application Development Framework (ADF) Applications (Doc ID 1296491.1) What's new in those references? Note 974949.1 is the place to look for the latest information as we come out with new versions of the SDK.  The patch number changes for each release.  Version 2.0.1 is contained in Patch 13882058, which is for both Release 11i and Release 12.  Note 974949.1 includes the following topics: Applying the latest patch Using Oracle E-Business Suite Data Sources Oracle E-Business Suite Implementation of Java Authentication and Authorization Service (JAAS) Utilities Error loggingSession management  Message Dictionary User profiles Navigation to External Applications Java EE Session Management Tutorial For those of you using the SDK with Oracle ADF, besides some Oracle ADF-specific documentation in Note 974949.1, we also updated the ADF Integration FAQ as well. EBS SDK for Java Use Cases The uses of the Oracle E-Business Suite SDK for Java fall into two general scenarios for integrating external applications with Oracle E-Business Suite: Application sharing a session with Oracle E-Business Suite Independent application (not shared session) With an independent application, the external application accesses Oracle E-Business  Suite data and server-side APIs, but it has a completely separate user interface. The external application may also launch pages from the Oracle E-Business Suite home page, but after the initial launch there is no further communication with the Oracle E-Business Suite user interface. Shared session integration means that the external application uses an Oracle E-Business Suite session (ICX session), shares session context information with Oracle E-Business Suite, and accesses Oracle E-Business Suite data. The external application may also launch pages from the Oracle E-Business Suite home page, or regions or pages from the external application may be embedded as regions within Oracle Application Framework pages. Both shared session applications and independent applications use the AppsDataSource feature of the Oracle E-Business Suite SDK for Java. Independent applications may also use the Java Authentication and Authorization (JAAS) and logging features of the SDK. Applications that are sharing the Oracle E-Business Suite session use the session management feature (instead of the JAAS feature), and they may also use the logging, profiles, and Message Dictionary features of the SDK.  
The session management APIs allow you to create, retrieve, validate and cancel an Oracle E-Business Suite session (ICX session) from your external application.  Session information and context can travel back and forth between Oracle E-Business Suite and your application, allowing you to share session context information across applications. Note: Generally you would use the Java Authentication and Authorization (JAAS) feature of the SDK or the session management feature, but not both together. Send us your feedback Since the Oracle E-Business Suite SDK for Java is still pretty new, we’d like to know about who is using it and what you are trying to do with it.  We’d like to get this type of information: customer name and brief use case configuration and technologies (Oracle WebLogic Server or OC4J, plain Java, ADF, SOA Suite, and so on) project status (proof of concept, development, production) any other feedback you have about the SDK You can send me your feedback directly at Sara dot Woodhull at Oracle dot com, or you can leave it in the comments below.  Please keep in mind that we cannot answer support questions, so if you are having specific issues, please log a service request with Oracle Support. Happy coding! Related Articles New Whitepaper: Extending E-Business Suite 12.1.3 using Oracle Application Express To Customize or Not to Customize? New Whitepaper: Upgrading your Customizations to Oracle E-Business Suite Release 12 ATG Live Webcast: Upgrading your EBS 11i Customizations to Release 12

    Read the article

  • Having fun with Reflection

    - by Nick Harrison
    I was once asked in a technical interview what I could tell them about Reflection.  My response, while a little tongue in cheek, was that "I can tell you it is one of my favorite topics to talk about."  I did get a laugh out of that, and it was a great ice breaker.  Reflection may not be the answer for everything, but it often can be, or maybe even should be.
    I have posted in the past about my favorite CopyTo method.  It can come in several forms and is often very useful, and I explain it further and expand on the basic idea here.  The basic idea is to let reflection loop through the properties of two objects and synchronize the ones they have in common.  I love this approach for data binding and for passing data across the layers in an application.
    Recently I have been working on a project leveraging Data Transfer Objects to pass data through WCF calls.  We won't go into how the architecture got this way, but in essence there is a partial duplicate inheritance hierarchy where there is a related Domain Object for each Data Transfer Object.  The matching objects do not share a common ancestor or a common interface, but they will have the same properties in common.  Setting aside the problems with this approach, let's talk about how Reflection and our friendly CopyTo could make the most of this bad situation without having to change too much.
    One of the problems is keeping the two sets of objects in synch.  For this particular project, the DO has all of the functionality and the DTO is used simply to transfer data back and forth.  Both sets of objects have parallel hierarchies with the same properties being defined at the corresponding levels.  So we end up with BaseDO, BaseDTO, GenericDO, GenericDTO, ProcessAreaDO, ProcessAreaDTO, SpecializedProcessAreaDO, SpecializedProcessAreaDTO, TableDO, TableDTO, and so on.
    Without using Reflection and a CopyTo function, tremendous care and effort must be taken to keep the corresponding objects in synch.  New properties can be added at any level in the inheritance and must be kept in synch at all subsequent layers.  For this project we have come up with a clever approach of calling a base GetDto or UpdateDto, making sure that the same method at each level of inheritance is called.  Each level is responsible for updating the properties at that level. This is a lot of work, and not keeping it in synch can create all manner of problems, some of which are very difficult to track down.  The other problem is the type of code that these methods tend to wind up with. You end up with code like this:
    Transferable dto = new Transferable();
    base.GetDto(dto);
    dto.OfficeCode = GetDtoNullSafe(officeCode);
    dto.AccessIndicator = GetDtoNullSafe(accessIndicator);
    dto.CaseStatus = GetDtoNullSafe(caseStatus);
    dto.CaseStatusReason = GetDtoNullSafe(caseStatusReason);
    dto.LevelOfService = GetDtoNullSafe(levelOfService);
    dto.ReferralComments = referralComments;
    dto.Designation = GetDtoNullSafe(designation);
    dto.IsGoodCauseClaimed = GetDtoNullSafe(isGoodCauseClaimed);
    dto.GoodCauseClaimDate = goodCauseClaimDate;
    One obvious problem is that this is tedious code.  It is error-prone code.  Adding helper functions like GetDtoNullSafe helps out immensely, but there is still an easier way. We can bypass the tedious code, bypass the complex inheritance tricks, and reduce all of this to a single method in the base class:
    TransferObject dto = new TransferObject();
    CopyTo(this, dto);
    return dto;
    In the case of this one project, such a change eliminated the need for 20% of the total code base and a whole class of unit test cases that made sure that all of the properties were in synch. The impact of such a change also needs to include the ongoing time savings and the improvements in quality that can arise from it.  Developers who are not worried about keeping the properties in synch across mirrored object hierarchies are freed to worry about more important things, like implementing business requirements.
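    For reference, the guts of such a CopyTo are only a few lines of reflection. This is a minimal sketch of the idea, with illustrative names; a production version would want caching of the PropertyInfo lookups, smarter type-conversion rules, and so forth:
    using System;
    using System.Reflection;

    public static class ObjectMapper
    {
        // Copies every readable public property on source to a writable
        // property of the same name and a compatible type on target.
        public static void CopyTo(object source, object target)
        {
            if (source == null || target == null)
                return;

            PropertyInfo[] sourceProps = source.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance);

            foreach (PropertyInfo sourceProp in sourceProps)
            {
                PropertyInfo targetProp = target.GetType().GetProperty(sourceProp.Name);
                if (targetProp == null || !targetProp.CanWrite || !sourceProp.CanRead)
                    continue;
                if (!targetProp.PropertyType.IsAssignableFrom(sourceProp.PropertyType))
                    continue;

                targetProp.SetValue(target, sourceProp.GetValue(source, null), null);
            }
        }
    }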

    Read the article

  • MIX 2010 Covert Operations Day 3

    - by GeekAgilistMercenary
    I rolled over to the Mandalay for breakfast.  There I met a couple guys that were really excited about the new Windows 7 Phone.  They, as I, are also hopeful that the phone really gets a big push and some penetration into the market.  Not because we don’t like any other of the phones, but because this phone is so much better in many ways.  From a developer's perspective creating applications in Windows 7 Mobile will be vastly superior in ease, capabilities, and other aspects.  The architectural, existing code base, examples, and provisions to create things on the 7 Mobile Device are already existing as of RIGHT NOW.  There is no reason, except for fickle market conditions, for this phone to not just explode onto the market.  But alas, I won't hold my breath. Day three keynote had a whole new slew of things provided.  It also seemed that things got a lot more technical on this second keynote.  The oData was one of the very technical bits, yet it included almost no code.  Starting with a Netflix example and all the way to the Codename "Dallas" effort the oData Services provide some expansive possibilities. A mash up going 4 ways was then shown for finding a movie, finding local places to have a viewing, and information about the movie and were to prospectively find and buy additional movie bits.  The display was of course, in a Windows 7 Mobile device with literally a click to view each set of data.  The backend and the front end of this was beautifully smooth. The Dallas Project has a lot of potential for analytics in dashboard and scorecard creation also.  If there is a need or reason to provide data to a vast and wide range of clients, Dallas is a prime example of how to do that. Azure Clouds After the main keynote I checked out (while developing a working WPF & Silverlight Application for work) the session on deploying ASP.NET Applications, services, etc, into the cloud.  The session was pretty good, but I'll admit I got a little unfocused from it a few times.  It is after all hard to do two things at one time. I did take note that the cloud still is a multiple step process for deploying to.  This is a good thing and a bad thing.  There needs to be more checks and verifications when deploying something into the cloud just for technical reasons.  However, I feel that there should be some streamlining to the process.  Going back and forth between web and Visual Studio as the interface also seems kind of clunky.  Deployment should be able to be completed from within Visual Studio in my perspective.  Overall, the cloud is getting more and more impressive in function as well as theory. That's it from me so far on the third day of MIX.  I'll be note taking and studying hard to have more good tidbits to provide. Thanks for reading, if you're curious about more of my writing, check out this original entry at my other blog Agilist Mercenary.

    Read the article

  • Get to Know a Candidate (8 of 25): Rocky Anderson–Justice Party

    - by Brian Lanham
    DISCLAIMER: This is not a post about “Romney” or “Obama”. This is not a post for whom I am voting. Information sourced for Wikipedia. Ross Carl “Rocky” Anderson served two terms as the 33rd mayor of Salt Lake City, Utah, between 2000 and 2008.  He is the Executive Director of High Road for Human Rights.  Prior to serving as Mayor, he practiced law for 21 years in Salt Lake City, during which time he was listed in Best Lawyers in America, was rated A-V (highest rating) by Martindale-Hubbell, served as Chair of the Utah State Bar Litigation Section[4] and was Editor-in-Chief of, and a contributor to, Voir Dire legal journal. As mayor, Anderson rose to nationwide prominence as a champion of several national and international causes, including climate protection, immigration reform, restorative criminal justice, LGBT rights, and an end to the "war on drugs". Before and after the invasion by the U.S. of Iraq in 2003, Anderson was a leading opponent of the invasion and occupation of Iraq and related human rights abuses. Anderson was the only mayor of a major U.S. city who advocated for the impeachment of President George W. Bush, which he did in many venues throughout the United States. Anderson's work and advocacy led to local, national, and international recognition in numerous spheres, including being named by Business Week as one of the top twenty activists in the world on climate change,serving on the Newsweek Global Environmental Leadership Advisory Board, and being recognized by the Human Rights Campaign as one of the top ten straight advocates in the United States for LGBT equality. He has also received numerous awards for his work, including the EPA Climate Protection Award, the Sierra Club Distinguished Service Award, the Respect the Earth Planet Defender Award, the National Association of Hispanic Publications Presidential Award, The Drug Policy Alliance Richard J. Dennis Drugpeace Award, the Progressive Democrats of America Spine Award, the League of United Latin American Citizens Profile in Courage Award, the Bill of Rights Defense Committee Patriot Award, the Code Pink (Salt Lake City) Pink Star honor, the Morehouse University Gandhi, King, Ikeda Award, and the World Leadership Award for environmental programs. Formerly a member of the Democratic Party, Anderson expressed his disappointment with that Party in 2011, stating, “The Constitution has been eviscerated while Democrats have stood by with nary a whimper. It is a gutless, unprincipled party, bought and paid for by the same interests that buy and pay for the Republican Party." Anderson announced his intention to run for President in 2012 as a candidate for the newly-formed Justice Party. Although founded by Rocky Anderson of Utah, the Justice Party was first recognized by Mississippi and describes itself as advocating economic justice through measures such as green jobs and a right to organize, environment justice through enforcing employee safeguards in trade agreements, and social and civic justice through universal health care. In its first press release, the Utah Justice Party set forth its goals for justice in the economic, environmental, social and civic realms, along with a call to rid the corrupting influence of big money from government, to reverse the erosion of rights guaranteed by the Constitution, and to stop draining American resources to support illegal wars of aggression. 
Its press release says its grassroots supporters believe that now is the time for all to "shed their skeptical view that their voices don't matter", that "our 2-party system is a 'duopoly' controlled by the same corporate and military interests", and that the people must act to ensure "that our nation will achieve a brighter, sustainable future.” Anderson has ballot access in CO, CT, FL, ID, LA, MI, MN, MS, NJ, NM, OR, RI, TN, UT, VT, WA (152 electoral votes) and has write-in access in AL, AK, DE, GA, IL, IO, KS, MD, MO, NE, NH, NY, PA, TX Learn more about Rocky Anderson and Justice Party on Wikipedia.

    Read the article

  • Get Your Enterprise Working With Oracle On Track Communication 1.0

    - by Josh Lannin
    The On Track Development team is very pleased to announce that today On Track is available for our customers to download and evaluate.  To learn more about what On Track does start with our whitepaper and datasheet.   If you are a developer, take a look at our documentation and samples posted to our OTN page. For this first blog post, I’ll be speaking to several notable points about our product. Graceful Escalation via Conversations: On Track addresses the “Collaboration Problem” through a single guiding principle – graceful escalation – within the construct of a Conversation. In On Track, collaboration is based on a context (called a “Conversation”) that gracefully escalates in form, structure, and content, as dictated by the particular needs of a given collaboration.  Within that context, On Track provides a rich set of tools to choose from.  These tools provide for communication, coordination, content management, organization, decision making, and analysis -- all essential aspects of collaboration, but not all of them are essential all of the time.  Every collaborative interaction will evolve differently.  Some will evolve to represent work spreading over the course of years and involving a large, distributed team, while others may involve few people and not evolve at all.  Regardless, all collaborative contexts are built from the same parts, utilize the same concepts, and start the same way.  The principle of graceful escalation is that you only use the tools and structure you need; so you only incur the complexity you need. Purposeful Collaboration: Through application integration, On Track Conversations bring enterprise application users the communication and collaboration capabilities required to complete business process.  By association with specific processes or business objects conversations extend the possible interactions and broaden participation to internal or external non-application users and provide a sophisticated interaction experience, all the while enhancing the data set within the owning application.  Purposeful collaboration not only needs to happen in the context of applications, it must support a full range of real-time and long-running interactions to provide the greatest value. Multi Client, Multi Modal: This On Track 1.0 product release includes the same day availability of  multiple clients, including iPhone and iPad applications which are now available on the Apple Store, a fully capable and accessible Outlook Add-In, along with our browser web client.  With each client we have sought to leverage the strengths of each unique device- our iPhone client supports picture and voice posts, the iPad is optimized for meeting room situations and document viewing, and our Outlook add-in allows you to take emails in context and bring them into On Track.  In addition to supporting a diverse array of clients, On Track provides a unified multi modal experience support starting with basic messages moving through to integrated documents with live annotations, snapshots, application sharing, and voice. Next Generation Web Architecture: We believe On Track will help move the bar higher for what users can expect from all web applications, most notably ones that involve real-time activity.  On Track is built from the ground up with an innovative, real-time architecture that leverages the extensive push capabilities of our server.  
Whether you are receiving a new message, viewing where crowds of people are collaborating, or doing live annotation on a document with a set of people, that information comes to you immediately without refreshes or moving back and forth between pages.  We’ve leveraged this core architecture across the product experience and raised the user experience bar for this type of application.  As well these capabilities are based on open standards and protocols, and are fully extensible by anyone- enabling sophisticated integrations to be created with a wide variety of both legacy and next-generation applications. Agile Product Development: As a product team we operate using continuous feedback and modified agile development methodologies.  We have thousands of active internal Oracle users who have helped pilot our product for critical business functions, and the On Track product development team uses our product as our primary vehicle for all our collaboration.  Additionally we been working with early access customers who are adopting our technology and providing us valuable feedback - which our process has rapidly realized in improvements to our software.  On Track agility extends to our server as well, which is built to scale, and is very simple to install and configure. We are pleased to make this product announcement and encourage you to join us on Facebook or follow us on Twitter, as well as checking back here for the latest product information.

    Read the article

< Previous Page | 23 24 25 26 27 28 29 30 31 32 33 34  | Next Page >