Search Results

Search found 8397 results on 336 pages for 'kendo ui'.


  • Coping with infrastructure upgrades

    - by Fatherjack
    A common topic for questions on SQL Server forums is how to plan and implement upgrades to SQL Server: moving from old to new hardware, or moving from one version of SQL Server to another. There are other circumstances where upgrades of other systems affect SQL Server DBAs. For example, where I work at the moment there is a Microsoft Exchange (email) server upgrade in progress. It is being handled by a different team so I'm not wholly sure of the details, but we are in a situation where there are currently 2 Exchange email servers – the old one and the new one. Users' mailboxes are being transferred in a planned process, but as we approach the old server being turned off we also have to make sure that our SQL Servers get updated to use the new SMTP server for all of the SQL Agent notifications, SSIS packages etc. My servers have a number of profiles so that various jobs can send emails on behalf of various departments and different systems. This means there are lots of places where the old server name needs to be replaced by the new one.

    Anyone who has set up DBMail and enjoyed the click-tastic odyssey of screens to create Profiles and Accounts and so on and so forth ought to seek some professional help, in my opinion. It's a nightmare of back-and-forth settings changes and it stinks. I wasn't looking forward to heading into this mess of a UI and changing the old Exchange server name for the new one on all my SQL instances for all of the accounts I have set up. So I did what any Englishman with a shed would do: I decided to take it apart and see if I could fix it another way. I took a guess that we were going to be working in MSDB, and Books OnLine was remarkably helpful and, amongst a lot of information, told me about a couple of procedures that can be used to interrogate DBMail settings.

        USE [msdb] -- It's where all the good stuff is kept
        GO
        EXEC dbo.sysmail_help_profile_sp;
        EXEC dbo.sysmail_help_account_sp;

    Both of these procedures take optional parameters with the same name – ID and Name. If you provide an ID or a name then the results you get back are for that specific Profile or Account. Otherwise you get details of all Profiles and Accounts on the server you are connected to. As you can see (click for a bigger image), the Account has the SMTP server information in the servername column. We want to change that value to NewSMTP.Contoso.com. Now, it appears that the procedure we are looking at gets its data from the sysmail_account and sysmail_server tables; you can get the results the stored procedure provides if you run the code below.

        SELECT [account_id], [name], [description], [email_address], [display_name], [replyto_address], [last_mod_datetime], [last_mod_user]
        FROM dbo.sysmail_account AS sa;

        SELECT [account_id], [servertype], [servername], [port], [username], [credential_id], [use_default_credentials], [enable_ssl], [flags], [last_mod_datetime], [last_mod_user], [timeout]
        FROM dbo.sysmail_server AS sms;

    Now, we have no real idea how these tables are linked and whether making an update directly to one or other of them is going to do what we want, or whether it will entirely cripple our ability to send email from SQL Server, so we won't touch those tables with any UPDATE TSQL. So, back to Books OnLine then, and we find sysmail_update_account_sp. It's exactly what we need. The examples in BOL take the form (as below) of having every parameter explicitly defined.
    Not wanting to totally obliterate the existing values by not passing values in all of the parameters, I set to writing some code to gather the existing data from the tables, re-write the SMTP server name, and then execute the resulting TSQL.

        IF OBJECT_ID('tempdb..#sysmailprofiles') IS NOT NULL
            DROP TABLE #sysmailprofiles
        GO
        CREATE TABLE #sysmailprofiles
            (
              account_id INT ,
              [name] VARCHAR(50) ,
              [description] VARCHAR(500) ,
              email_address VARCHAR(500) ,
              display_name VARCHAR(500) ,
              replyto_address VARCHAR(500) ,
              servertype VARCHAR(10) ,
              servername VARCHAR(100) ,
              port INT ,
              username VARCHAR(100) ,
              use_default_credentials VARCHAR(1) ,
              ENABLE_ssl VARCHAR(1)
            )

        INSERT  [#sysmailprofiles]
                ( [account_id], [name], [description], [email_address], [display_name], [replyto_address],
                  [servertype], [servername], [port], [username], [use_default_credentials], [ENABLE_ssl] )
                EXEC [dbo].[sysmail_help_account_sp]

        DECLARE @TSQL NVARCHAR(1000)

        SELECT TOP 1
                @TSQL = 'EXEC [dbo].[sysmail_update_account_sp] @account_id = '
                + CAST([s].[account_id] AS VARCHAR(20))
                + ', @account_name = ''' + [s].[name] + ''''
                + ', @email_address = N''' + [s].[email_address] + ''''
                + ', @display_name = N''' + [s].[display_name] + ''''
                + ', @replyto_address = N''' + s.replyto_address + ''''
                + ', @description = N''' + [s].[description] + ''''
                + ', @mailserver_name = ''NEWSMTP.contoso.com'''
                + ', @mailserver_type = ' + [s].[servertype]
                + ', @port = ' + CAST([s].[port] AS VARCHAR(20))
                + ', @username = ' + COALESCE([s].[username], '''''')
                + ', @use_default_credentials = ' + CAST(s.[use_default_credentials] AS VARCHAR(1))
                + ', @enable_ssl = ' + [s].[ENABLE_ssl]
        FROM    [#sysmailprofiles] AS s
        WHERE   [s].[servername] = 'SMTP.Contoso.com'

        SELECT @tsql
        EXEC [sys].[sp_executesql] @tsql

    This worked well for me, and testing the email function with EXEC dbo.sp_send_dbmail afterwards showed that the settings were indeed using our new Exchange server. It was only later, in writing this blog, that I tried running the sysmail_update_account_sp procedure with only the SMTP server name parameter value specified. Despite what Books OnLine might intimate, you can do this, and only the values for the parameters specified get changed. If a parameter is not specified in the execution of the procedure then its value remains unchanged. This renders most of the above script unnecessary, as I could have simply specified the account_id that I want to amend and the new value for the parameter I want to update.

        EXEC sysmail_update_account_sp @account_id = 1, @mailserver_name = 'NEWSMTP.Contoso.com'

    This wasn't going to be the main reason for this post; it was meant to describe how to capture values from a stored procedure and use them in dynamic TSQL, but instead we are here and (re)learning the fact that Books Online is a little flawed in places. It is a fantastic resource for anyone working with SQL Server, but the reader must adopt an enquiring frame of mind and use a little curiosity to try simple variations on examples to fully understand the code they are working with. I think the author(s) of this part of Books OnLine missed an opportunity to include a third example that had fewer than all parameters specified, to give a lead to this method existing.
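    If you would rather script that one-off change from a small .NET utility than from SSMS, a minimal C# sketch is below. The connection string, account_id and server name are assumptions carried over from the example above, not something prescribed by the post.

        using System;
        using System.Data;
        using System.Data.SqlClient;

        class UpdateDbMailSmtpServer
        {
            static void Main()
            {
                // Hypothetical connection string - point it at the instance whose DBMail account you want to change.
                const string connectionString = "Server=MYSQLSERVER;Database=msdb;Integrated Security=true";

                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand("dbo.sysmail_update_account_sp", connection))
                {
                    command.CommandType = CommandType.StoredProcedure;

                    // Only the parameters supplied here are changed; every other account setting keeps its current value.
                    command.Parameters.AddWithValue("@account_id", 1);
                    command.Parameters.AddWithValue("@mailserver_name", "NEWSMTP.Contoso.com");

                    connection.Open();
                    command.ExecuteNonQuery();
                }
            }
        }

    Because only @account_id and @mailserver_name are supplied, every other setting on the account is left exactly as it was, which is the behaviour described above.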

    Read the article

  • Can not print after upgrading from 12.x to 14.04

    - by user318889
    After upgrading from V12.04 to V14.04 I am not able to print. I am using an HP LaserJet 400 M451dn. The printer troubleshooter told me that there is no solution to the problem. This is the output of the advanced diagnositc output. (Due to limited space I cut the output!) Can anybody tell me what is going wrong. I am using the printer via USB ? Page 1 (Scheduler not running?): {'cups_connection_failure': False} Page 2 (Is local server publishing?): {'local_server_exporting_printers': False} Page 3 (Choose printer): {'cups_dest': , 'cups_instance': None, 'cups_queue': u'HP-LaserJet-400-color-M451dn', 'cups_queue_listed': True} Page 4 (Check printer sanity): {'cups_device_uri_scheme': u'hp', 'cups_printer_dict': {'device-uri': u'hp:/usb/HP_LaserJet_400_color_M451dn?serial=CNFF308670', 'printer-info': u'Hewlett-Packard HP LaserJet 400 color M451dn', 'printer-is-shared': True, 'printer-location': u'Pinatubo', 'printer-make-and-model': u'HP LJ 300-400 color M351-M451 Postscript (recommended)', 'printer-state': 4, 'printer-state-message': u'', 'printer-state-reasons': [u'none'], 'printer-type': 8556636, 'printer-uri-supported': u'ipp://localhost:631/printers/HP-LaserJet-400-color-M451dn'}, 'cups_printer_remote': False, 'hplip_output': (['', '\x1b[01mHP Linux Imaging and Printing System (ver. 3.14.6)\x1b[0m', '\x1b[01mDevice Information Utility ver. 5.2\x1b[0m', '', 'Copyright (c) 2001-13 Hewlett-Packard Development Company, LP', 'This software comes with ABSOLUTELY NO WARRANTY.', 'This is free software, and you are welcome to distribute it', 'under certain conditions. See COPYING file for more details.', '', '', '\x1b[01mhp:/usb/HP_LaserJet_400_color_M451dn?serial=CNFF308670\x1b[0m', '', '\x1b[01mDevice Parameters (dynamic data):\x1b[0m', '\x1b[01m Parameter Value(s) \x1b[0m', ' ---------------------------- ----------------------------------------------------------', ' back-end hp ', " cups-printers ['HP-LaserJet-400-color-M451dn'] ", ' cups-uri hp:/usb/HP_LaserJet_400_color_M451dn?serial=CNFF308670 ', ' dev-file ', ' device-state -1 ', ' device-uri hp:/usb/HP_LaserJet_400_color_M451dn?serial=CNFF308670 ', ' deviceid ', ' error-state 101 ', ' host ', ' is-hp True ', ' panel 0 ', ' panel-line1 ', ' panel-line2 ', ' port 1 ', ' serial CNFF308670 ', ' status-code 5002 ', ' status-desc ', '\x1b[01m', 'Model Parameters (static data):\x1b[0m', '\x1b[01m Parameter Value(s) \x1b[0m', ' ---------------------------- ----------------------------------------------------------', ' align-type 0 ', ' clean-type 0 ', ' color-cal-type 0 ', ' copy-type 0 ', ' embedded-server-type 0 ', ' fax-type 0 ', ' fw-download False ', ' icon hp_color_laserjet_cp2025.png ', ' io-mfp-mode 1 ', ' io-mode 1 ', ' io-support 6 ', ' job-storage 0 ', ' linefeed-cal-type 0 ', ' model HP_LaserJet_400_color_M451dn ', ' model-ui HP LaserJet 400 Color m451dn ', ' model1 HP LaserJet 400 Color M451dn ', ' monitor-type 0 ', ' panel-check-type 0 ', ' pcard-type 0 ', ' plugin 0 ', ' plugin-reason 0 ', ' power-settings 0 ', ' ppd-name lj_300_400_color_m351_m451 ', ' pq-diag-type 0 ', ' r-type 0 ', ' r0-agent1-kind 4 ', ' r0-agent1-sku CE410A/CE410X ', ' r0-agent1-type 1 ', ' r0-agent2-kind 4 ', ' r0-agent2-sku CE411A ', ' r0-agent2-type 4 ', ' r0-agent3-kind 4 ', ' r0-agent3-sku CE413A ', ' r0-agent3-type 5 ', ' r0-agent4-kind 4 ', ' r0-agent4-sku CE412A ', ' r0-agent4-type 6 ', ' scan-src 0 ', ' scan-type 0 ', ' status-battery-check 0 ', ' status-dynamic-counters 0 ', ' status-type 3 ', ' support-released True ', ' support-subtype 2202411 
', ' support-type 2 ', ' support-ver 3.12.2 ', " tech-class ['Postscript'] ", " tech-subclass ['Normal'] ", ' tech-type 4 ', ' usb-pid 3882 ', ' usb-vid 1008 ', ' wifi-config 0 ', '\x1b[01m', 'Status History (most recent first):\x1b[0m', '\x1b[01m Date/Time Code Status Description User Job ID \x1b[0m', ' -------------------- ----- ---------------------------------------- -------- --------', ' 08/21/14 00:07:25 5012 Device communication error richard 0 ', ' 08/20/14 13:42:44 500 Started a print job richard 4214 ', '', '', 'Done.', ''], ['\x1b[35;01mwarning: No display found.\x1b[0m', '\x1b[31;01merror: hp-info -u/--gui requires Qt4 GUI support. Entering interactive mode.\x1b[0m', '\x1b[31;01merror: Unable to communicate with device (code=12): hp:/usb/HP_LaserJet_400_color_M451dn?serial=CNFF308670\x1b[0m', '\x1b[31;01merror: Error opening device (Device not found).\x1b[0m', ''], 0), 'is_cups_class': False, 'local_cups_queue_attributes': {'charset-configured': u'utf-8', 'charset-supported': [u'us-ascii', u'utf-8'], 'color-supported': True, 'compression-supported': [u'none', u'gzip'], 'copies-default': 1, 'copies-supported': (1, 9999), 'cups-version': u'1.7.2', 'device-uri': u'hp:/usb/HP_LaserJet_400_color_M451dn?serial=CNFF308670', 'document-format-default': u'application/octet-stream', 'document-format-supported': [u'application/octet-stream', u'application/pdf', u'application/postscript', u'application/vnd.adobe-reader-postscript', u'application/vnd.cups-command', u'application/vnd.cups-pdf', u'application/vnd.cups-pdf-banner', u'application/vnd.cups-postscript', u'application/vnd.cups-raw', u'application/vnd.samsung-ps', u'application/x-cshell', u'application/x-csource', u'application/x-perl', u'application/x-shell', u'image/gif', u'image/jpeg', u'image/png', u'image/tiff', u'image/urf', u'image/x-bitmap', u'image/x-photocd', u'image/x-portable-anymap', u'image/x-portable-bitmap', u'image/x-portable-graymap', u'image/x-portable-pixmap', u'image/x-sgi-rgb', u'image/x-sun-raster', u'image/x-xbitmap', u'image/x-xpixmap', u'image/x-xwindowdump', u'text/css', u'text/html', u'text/plain'], 'finishings-default': 3, 'finishings-supported': [3], 'generated-natural-language-supported': [u'en-us'], 'ipp-versions-supported': [u'1.0', u'1.1', u'2.0', u'2.1'], 'ippget-event-life': 15, 'job-creation-attributes-supported': [u'copies', u'finishings', u'ipp-attribute-fidelity', u'job-hold-until', u'job-name', u'job-priority', u'job-sheets', u'media', u'media-col', u'multiple-document-handling', u'number-up', u'output-bin', u'orientation-requested', u'page-ranges', u'print-color-mode', u'print-quality', u'printer-resolution', u'sides'], 'job-hold-until-default': u'no-hold', 'job-hold-until-supported': [u'no-hold', u'indefinite', u'day-time', u'evening', u'night', u'second-shift', u'third-shift', u'weekend'], 'job-ids-supported': True, 'job-k-limit': 0, 'job-k-octets-supported': (0, 470914416), 'job-page-limit': 0, 'job-priority-default': 50, 'job-priority-supported': [100], 'job-quota-period': 0, 'job-settable-attributes-supported': [u'copies', u'finishings', u'job-hold-until', u'job-name', u'job-priority', u'media', u'media-col', u'multiple-document-handling', u'number-up', u'output-bin', u'orientation-requested', u'page-ranges', u'print-color-mode', u'print-quality', u'printer-resolution', u'sides'], 'job-sheets-default': (u'none', u'none'), 'job-sheets-supported': [u'none', u'classified', u'confidential', u'form', u'secret', u'standard', u'topsecret', u'unclassified'], 'jpeg-k-octets-supported': (0, 
470914416), 'jpeg-x-dimension-supported': (0, 65535), 'jpeg-y-dimension-supported': (1, 65535), 'marker-change-time': 0, 'media-bottom-margin-supported': [423], 'media-col-default': u'(unknown IPP value tag 0x34)', 'media-col-supported': [u'media-bottom-margin', u'media-left-margin', u'media-right-margin', u'media-size', u'media-source', u'media-top-margin', u'media-type'], 'media-default': u'iso_a4_210x297mm', 'media-left-margin-supported': [423], 'media-right-margin-supported': [423],

    Read the article

  • Custom Configuration Section Handlers

    Most .NET developers who need to store something in configuration tend to use appSettings for this purpose, in my experience. More recently, the framework itself has helped things by adding the <connectionStrings /> section, so at least these are in their own section and not adding to the appSettings clutter that pollutes most apps. I recommend avoiding appSettings for several reasons. In addition to those listed there, I would add that strong typing and validation are additional reasons to go the custom configuration section route.

    For my ASP.NET Tips and Tricks talk, I use the following example, which is a simple DemoSettings class that includes two fields. The first is an integer representing how many attendees are present for the talk, and the second is the title of the talk. The setup in web.config is as follows:

        <configSections>
          <section name="DemoSettings" type="ASPNETTipsAndTricks.Code.DemoSettings" />
        </configSections>

        <DemoSettings sessionAttendees="100" title="ASP.NET Tips and Tricks DevConnections Spring 2010" />

    Referencing the values in code is strongly typed and straightforward. Here I have a page that exposes two properties which internally get their values from the configuration section handler:

        public partial class CustomConfig1 : System.Web.UI.Page
        {
            public string SessionTitle
            {
                get { return DemoSettings.Settings.Title; }
            }

            public int SessionAttendees
            {
                get { return DemoSettings.Settings.SessionAttendees; }
            }
        }

    Note that the settings are only read from the config file once; after that they are cached, so there is no need to be concerned about excessive file access.

    Now we've seen how to set it up in the config file and how to refer to the settings in code. All that remains is to see the file itself:

        public class DemoSettings : ConfigurationSection
        {
            private static DemoSettings settings =
                ConfigurationManager.GetSection("DemoSettings") as DemoSettings;

            public static DemoSettings Settings
            {
                get { return settings; }
            }

            [ConfigurationProperty("sessionAttendees", DefaultValue = 200, IsRequired = false)]
            [IntegerValidator(MinValue = 1, MaxValue = 10000)]
            public int SessionAttendees
            {
                get { return (int)this["sessionAttendees"]; }
                set { this["sessionAttendees"] = value; }
            }

            [ConfigurationProperty("title", IsRequired = true)]
            [StringValidator(InvalidCharacters = "~!@#$%^&*()[]{}/;\"|\\")]
            public string Title
            {
                get { return (string)this["title"]; }
                set { this["title"] = value; }
            }
        }

    The class is pretty straightforward, but there are some important components to note. First, it must inherit from System.Configuration.ConfigurationSection. Next, as a convention I like to have a static settings member that is responsible for pulling out the section when the class is first referenced, and further to expose this collection via a static read-only property, Settings. Note that the types of both of these are the type of my class, DemoSettings.

    The properties of the class, SessionAttendees and Title, should map to the attributes of the config element in the XML file. The [ConfigurationProperty] attribute allows you to map the attribute name to the property name (thus using both XML standard naming conventions and C# naming conventions). In addition, you can specify a default value to use if nothing is specified in the config file, and whether or not the setting must be provided (IsRequired). If it is required, then it doesn't make sense to include a default value.
    Beyond defaults and required, you can specify more advanced validation rules for the configuration values using additional C# attributes, such as [IntegerValidator] and [StringValidator]. Using these, you can declaratively specify that your configuration values be in a given range, or omit certain forbidden characters, for instance. Of course you can write your own custom validation attributes, and there are others specified in System.Configuration. Individual sections can also be loaded from separate files, using syntax like this:

        <DemoSettings configSource="demosettings.config" />

    Summary

    Using a custom configuration section handler is not hard. If your application or component requires configuration, I recommend creating a custom configuration handler dedicated to your app or component. Doing so will reduce the clutter in appSettings, will provide you with strong typing and validation, and will make it much easier for other developers or system administrators to locate and understand the various configuration values that are necessary for a given application.
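    As a footnote to the custom validation point above, here is a minimal, hypothetical sketch of a hand-rolled validator and its companion attribute. The AbsoluteUrlValidator name and the rule it enforces are my own invention, not part of the original article; it simply plugs into the same System.Configuration machinery as [IntegerValidator] and [StringValidator].

        using System;
        using System.Configuration;

        // Rejects any string that is not an absolute URL.
        public class AbsoluteUrlValidator : ConfigurationValidatorBase
        {
            public override bool CanValidate(Type type)
            {
                return type == typeof(string);
            }

            public override void Validate(object value)
            {
                if (!Uri.IsWellFormedUriString((string)value, UriKind.Absolute))
                {
                    throw new ConfigurationErrorsException("The value must be an absolute URL.");
                }
            }
        }

        // Applied to a ConfigurationProperty just like the built-in validator attributes.
        [AttributeUsage(AttributeTargets.Property)]
        public class AbsoluteUrlValidatorAttribute : ConfigurationValidatorAttribute
        {
            public override ConfigurationValidatorBase ValidatorInstance
            {
                get { return new AbsoluteUrlValidator(); }
            }
        }

    A hypothetical ServiceUrl property on DemoSettings could then be decorated with [ConfigurationProperty("serviceUrl")] and [AbsoluteUrlValidator], and an invalid value would fail when the section is loaded, exactly as the built-in validators do.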

    Read the article

  • GameplayScreen does not contain a definition for GraphicsDevice

    - by Dave Voyles
    Long story short: I'm trying to intergrate my game with Microsoft's Game State Management. In doing so I've run into some errors, and the latest one is in the title. I'm not able to display my HUD for the reasons listed above. Previously, I had much of my code in my Game.cs class, but the GSM has a bit of it in Game1, and most of what you have drawn for the main screen in your GameplayScreen class, and that is what is causing confusion on my part. I've created an instance of the GameplayScreen class to be used in the HUD class (as you can see below). Before integrating with the GSM however, I created an instance of my Game class, and all worked fine. It seems that I need to define my graphics device somewhere, but I am not sure of where exactly. I've left some code below to help you understand. public class GameStateManagementGame : Microsoft.Xna.Framework.Game { #region Fields GraphicsDeviceManager graphics; ScreenManager screenManager; // Creates a new intance, which is used in the HUD class public static Game Instance; // By preloading any assets used by UI rendering, we avoid framerate glitches // when they suddenly need to be loaded in the middle of a menu transition. static readonly string[] preloadAssets = { "gradient", }; #endregion #region Initialization /// <summary> /// The main game constructor. /// </summary> public GameStateManagementGame() { Content.RootDirectory = "Content"; graphics = new GraphicsDeviceManager(this); graphics.PreferredBackBufferWidth = 1280; graphics.PreferredBackBufferHeight = 720; graphics.IsFullScreen = false; graphics.ApplyChanges(); // Create the screen manager component. screenManager = new ScreenManager(this); Components.Add(screenManager); // Activate the first screens. screenManager.AddScreen(new BackgroundScreen(), null); //screenManager.AddScreen(new MainMenuScreen(), null); screenManager.AddScreen(new PressStartScreen(), null); } namespace Pong { public class HUD { public void Update(GameTime gameTime) { // Used in the Draw method titleSafeRectangle = new Rectangle (GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.X, GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.Y, GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.Width, GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.Height); } } } class GameplayScreen : GameScreen { #region Fields ContentManager content; public static GameStates gamestate; private GraphicsDeviceManager graphics; public int screenWidth; public int screenHeight; private Texture2D backgroundTexture; private SpriteBatch spriteBatch; private Menu menu; private SpriteFont arial; private HUD hud; Animation player; // Creates a new intance, which is used in the HUD class public static GameplayScreen Instance; public GameplayScreen() { TransitionOnTime = TimeSpan.FromSeconds(1.5); TransitionOffTime = TimeSpan.FromSeconds(0.5); } protected void Initialize() { lastScored = false; menu = new Menu(); resetTimer = 0; resetTimerInUse = true; ball = new Ball(content, new Vector2(screenWidth, screenHeight)); SetUpMulti(); input = new Input(); hud = new HUD(); // Places the powerup animation inside of the surrounding box // Needs to be cleaned up, instead of using hard pixel values player = new Animation(content.Load<Texture2D>(@"gfx/powerupSpriteSheet"), new Vector2(103, 44), 64, 64, 4, 5); // Used by for the Powerups random = new Random(); vec = new Vector2(100, 50); vec2 = new Vector2(100, 100); promptVec = new Vector2(50, 25); timer = 10000.0f; // Starting value for the cooldown 
for the powerup timer timerVector = new Vector2(10, 10); //JEP - one time creation of powerup objects playerOnePowerup = new Powerup(); playerOnePowerup.Activated += PowerupActivated; playerOnePowerup.Deactivated += PowerupDeactivated; playerTwoPowerup = new Powerup(); playerTwoPowerup.Activated += PowerupActivated; playerTwoPowerup.Deactivated += PowerupDeactivated; //JEP - moved from events since these only need set once activatedVec = new Vector2(100, 125); deactivatedVec = new Vector2(100, 150); powerupReady = false; }
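    One way to make the error go away, sketched below on the assumption that this is the standard Game State Management sample (where GameScreen exposes a ScreenManager property and ScreenManager, being a DrawableGameComponent, exposes GraphicsDevice), is to stop reaching for a static GameplayScreen.Instance from the HUD and instead hand the HUD the GraphicsDevice it needs:

        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        public class HUD
        {
            private readonly GraphicsDevice graphicsDevice; // injected rather than pulled from a static instance
            private Rectangle titleSafeRectangle;

            public HUD(GraphicsDevice graphicsDevice)
            {
                this.graphicsDevice = graphicsDevice;
            }

            public void Update(GameTime gameTime)
            {
                // Used in the Draw method
                titleSafeRectangle = graphicsDevice.Viewport.TitleSafeArea;
            }
        }

    Inside GameplayScreen.Initialize the HUD would then be created with hud = new HUD(ScreenManager.GraphicsDevice); so GameplayScreen never needs a GraphicsDevice property of its own.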

    Read the article

  • AppHarbor - Azure Done Right AKA Heroku for .NET

    - by Robz / Fervent Coder
    Easy and Instant deployments and instant scale for .NET? Awhile back a few of us were looking at Ruby Gems as the answer to package management for .NET. The gems platform supported the concept of DLLs as packages although some changes would have needed to happen to have long term use for the entire community. From that we formed a partnership with some folks at Microsoft to make v2 into something that would meet wider adoption across the community, which people now call NuGet. So now we have the concept of package management. What comes next? Heroku Instant deployments and instant scaling. Stupid simple API. This is Heroku. It doesn’t sound like much, but when you think of how fast you can go from an idea to having someone else tinker with it, you can start to see its power. In literally seconds you can be looking at your rails application deployed and online. Then when you are ready to scale, you can do that. This is power. Some may call this “cloud-computing” or PaaS (Platform as a Service). I first ran into Heroku back in July when I met Nick of RubyGems.org. At the time there was no alternative in the .NET-o-sphere. I don’t count Windows Azure, mostly because it is not simple and I don’t believe there is a free version. Heroku itself would not lend itself well to .NET due to the nature of platforms and each language’s specific needs (solution stack).  So I tucked the idea in the back of my head and moved on. AppHarbor Enters The Scene I’m not sure when I first heard about AppHarbor as a possible .NET version of Heroku. It may have been in November, but I didn’t actually try it until January. I was instantly hooked. AppHarbor is awesome! It still has a ways to go to be considered Heroku for .NET, but it already has a growing community. I created a video series (at the bottom of this post) that really highlights how fast you can get a product onto the web and really shows the power and simplicity of AppHarbor. Deploying is as simple as a git/hg push to appharbor. From there they build your code, run any unit tests you have and deploy it if everything succeeds. The screen on the right shows a simple and elegant UI to getting things done. The folks at AppHarbor graciously gave me a limited number of invites to hand out. If you are itching to try AppHarbor then navigate to: https://appharbor.com/account/new?inviteCode=ferventcoder  After playing with it, send feedback if you want more features. Go vote up two features I want that will make it more like Heroku. Disclaimer: I am in no way affiliated with AppHarbor and have not received any funds or favors from anyone at AppHarbor. I just think it is awesome and I want others to know about it. From Zero To Deployed in 15 Minutes (Or Less) Now I have a challenge for you. I created a video series showing how fast I could go from nothing to a deployed application. It could have been from Zero to Deployed in Less than 5 minutes, but I wanted to show you the tools a little more and give you an opportunity to beat my time. And that’s the challenge. Beat my time and show it in a video response. The video series is below (at least one of the videos has to be watched on YouTube). The person with the best time by March 15th @ 11:59PM CST will receive a prize. 
    Ground rules:
    - .NET Application with a valid database connection
    - Start from Zero
    - Deployed with AppHarbor or an alternative
    - A timer displayed in the video that runs during the entire process
    - Video response published on YouTube or an acceptable alternative
    - Video(s) must be published by March 15th at 11:59PM CST. Either post the link here as a comment or on YouTube as a response (also by 11:59PM CST March 15th)

    From Zero To Deployed In 15 Minutes (Or Less) Part 1
    From Zero To Deployed In 15 Minutes (Or Less) Part 2
    From Zero To Deployed In 15 Minutes (Or Less) Part 3

    Read the article

  • How many developers before continuous integration becomes effective for us?

    - by Carnotaurus
    There is an overhead associated with continuous integration, e.g., set up, re-training, awareness activities, stoppage to fix "bugs" that turn out to be data issues, enforced separation-of-concerns programming styles, etc. At what point does continuous integration pay for itself?

    EDIT: These were my findings. The set-up was CruiseControl.Net with Nant, reading from VSS or TFS. Here are a few reasons for failure, which have nothing to do with the setup:

    - Cost of investigation: the time spent investigating whether a red light is due to a genuine logical inconsistency in the code, data quality, or another source such as an infrastructure problem (e.g., a network issue, a timeout reading from source control, a third-party server being down, etc.).
    - Political costs over infrastructure: I considered performing an "infrastructure" check for each method in the test run. I had no solution to the timeout except to replace the build server. Red tape got in the way and there was no server replacement.
    - Cost of fixing unit tests: a red light due to a data quality issue could be an indicator of a badly written unit test. So, data-dependent unit tests were re-written to reduce the likelihood of a red light due to bad data. In many cases, necessary data was inserted into the test environment to be able to accurately run its unit tests. It makes sense to say that by making the data more robust, the test becomes more robust if it is dependent on this data. Of course, this worked well!
    - Cost of coverage, i.e., writing unit tests for already existing code: there was the problem of unit test coverage. There were thousands of methods that had no unit tests, so a sizeable number of man-days would be needed to create those. As this would be too difficult to provide a business case for, it was decided that unit tests would be used for any new public method going forward. Those that did not have a unit test were termed 'potentially infra red'. An interesting point here is that static methods were a moot point, since it was not clear how it would be possible to uniquely determine how a specific static method had failed.
    - Cost of bespoke releases: Nant scripts only go so far. They are not that useful for, say, CMS-dependent builds for EPiServer CMS, or any UI-oriented database deployment.

    These are the types of issues that occurred on the build server for hourly test runs and overnight QA builds. I maintain that these are unnecessary, as a build master can perform these tasks manually at the time of release, especially with a one-man band and a small build. So, single-step builds have not justified the use of CI in my experience. What about the more complex, multi-step builds? These can be a pain to build, especially without a Nant script. So, even having created one, these were no more successful. The costs of fixing the red-light issues outweighed the benefits. Eventually, developers lost interest and questioned the validity of the red light. Having given it a fair try, I believe that CI is expensive and there is a lot of working around the edges instead of just getting the job done. It's more cost effective to employ experienced developers who do not make a mess of large projects than to introduce and maintain an alarm system. This is the case even if those developers leave. It doesn't matter if a good developer leaves, because the processes that he follows would ensure that he writes requirement specs, design specs, sticks to the coding guidelines, and comments his code so that it is readable. All this is reviewed.
    If this is not happening then his team leader is not doing his job, which should be picked up by his manager, and so on. For CI to work, it is not enough to just write unit tests, attempt to maintain full coverage, and ensure a working infrastructure for sizable systems. The bottom line: one might question whether fixing as many bugs as possible before release is even desirable from a business perspective. CI involves a lot of work to capture a handful of bugs that the customer could identify in UAT, or that the company could get paid for fixing as part of a client service agreement once the warranty period expires anyway.

    Read the article

  • Microsoft BUILD 2013 Day 1&ndash;Keynote

    - by Tim Murphy
    Originally posted on: http://geekswithblogs.net/tmurphy/archive/2013/06/27/microsoft-build-2013-day-1ndashkeynote.aspx

    This one is going to be a little long because the keynote was jam-packed, so bear with me. The keynote for the first day of BUILD 2013 was kicked off by Steve Ballmer. He made it very clear that Microsoft's focus is on accelerating its time to market with products and product updates. His quote was that "rapid release" is the new norm. He continued by showing off several new Lumias that have been buzzing around the internet for a while and announced that Sprint will now be carrying the HTC 8XT and Samsung ATIV.

    Ballmer is known for repeating words or phrases for effect. This time it was "Rapid release, rapid release" and "Touch, touch, touch, touch, touch, …". This was fun, but even more fun was when he announced that all attendees would receive an Acer Iconia 8" tablet. SCORE!

    The next subject Ballmer focused on was new apps. The three new ones were Flipboard, Facebook and NFL Fantasy Football. I liked the first two because these are ones that people coming from other platforms are missing. The NFL app is great just because it targets a demographic that can be fanatical. If these types of apps keep coming then the missing-app argument goes away.

    While many Negative Nancys are describing Windows 8.1 as Windows 180, Steve Ballmer chose to call it a "refined blend", as in a coffee that has been improved with a new mix. This includes more multi-tasking options and leveraging Bing throughout the entire ecosystem. He ended this first section by explaining that this will also bring more Bing development opportunities to the community.

    Steve Ballmer was followed by Julie Larson-Green, who spent her time on stage selling us on Windows 8 all over again, from my point of view. Something that I would not have thought was needed until I had listened to some other attendees who had a number of concerns and complaints. She showed a number of new gestures that will come with Windows 8.1, and while they were cool I was left wondering if they really improved the experience. I guess only time will tell.

    I did like the fact that the UI implementation for bringing up "All Apps" now mirrors that of Windows Phone. The consistency is a big step forward that I hope to see continue. The cool factor went up from there as she swiped content from a desktop (mega-tablet) to the Xbox One. This seamless experience, I believe, is what is really needed for any future platform to be relevant.

    I was much more enthused by the presentation of Antoine Leblond, who humbled us by letting us know that there are 5,000 new APIs. How that can be, or how anyone would ever use all of them, is another question. His announcement was that the Visual Studio 2013 preview would be available today along with the Windows 8.1 bits. One of the features of VS2013 that he demonstrated is the power consumption profiler. With battery life being a key factor for consumer consumption devices, this is a welcome addition.

    He didn't limit his presentation to VS2013 features though. He showed how the Store has been redesigned to enable better search and discoverability of apps, and how Windows 8.1 can automatically perform multiple screen scales depending on the resolution of the device. The last feature he demoed was the real-time video streaming API, which he made sure we understood by attaching a Surface to a little robot. Oh, but there was one more thing.
    Antoine and Julie announced that all attendees would also be getting Surface Pros. BONUS! How much more could there be? Gurdeep Singh Pall was about to pile on. He introduced us to Bing as a platform (BaaP?). He said that if they (Microsoft) could do something good with an API, then third-party developers could do something dynamite, and he showed us some of the tools they had produced. These included natural user interface improvements such as voice commands that looked to put Siri to shame. Add to that 3D, OCR and translation capabilities, and the future looks to be full of opportunities.

    Ballmer then came out to show us one last thing. Project Spark is a game design environment that will be available for Windows 8.1, Xbox 360 and Xbox One. All I can say is that if my kids get their hands on this they are going to be able to learn some of what dad does in a much more enjoyable way.

    At the end of it all I was both exhausted and energized by what I saw. What could they have possibly left for the day 2 keynote? I hear it will feature Scott Hanselman. If that is right we are in for a treat. See you there.

    del.icio.us Tags: BUILD 2013, Windows 8.1, Windows Phone, XAML, Keynote, Bing, Visual Studio 2013, Project Spark

    Read the article

  • On Reflector Pricing

    - by Nick Harrison
    I have heard a lot of outrage over Red Gate's decision to charge for Reflector. In the interest of full disclosure, I am a fan of Red Gate. I have worked with them on several usability tests. They also sponsor Simple Talk where I publish articles. They are a good company. I am also a BIG fan of Reflector. I have used it since Lutz originally released it. I have written my own add-ins. I have written code to host reflector and use its object model in my own code. Reflector is a beautiful tool. The care that Lutz took to incorporate extensibility is amazing. I have never had difficulty convincing my fellow developers that it is a wonderful tool. Almost always, once anyone sees it in action, it becomes their favorite tool. This wide spread adoption and usability has made it an icon and pivotal pillar in the DotNet community. Even folks with the attitude that if it did not come out of Redmond then it must not be any good, still love it. It is ironic to hear everyone clamoring for it to be released as open source. Reflector was never open source, it was free, but you never were able to peruse the source code and contribute your own changes. You could not even use Reflector to view the source code. From the very beginning, it was never anyone's intention for just anyone to examine the source code and make their own contributions aside from the add-in model. Lutz chose to hand over the reins to Red Gate because he believed that they would be able to build on his original vision and keep the product viable and effective. He did not choose to make it open source, hoping that the community would be up to the challenge. The simplicity and elegance may well have been lost with the "design by committee" nature of open source. Despite being a wonderful and beloved tool, Reflector cannot be an easy tool to maintain. Maybe because it is so wonderful and beloved, it is even more difficult to maintain. At any rate, we have high expectations. Reflector must continue to be able to reasonably disassemble every language construct that the framework and core languages dream up. We want it to be fast, and we also want it to continue to be simple to use. No small order. Red Gate tried to keep the core product free. Sadly there was not enough interest in the Pro version to subsidize the rest of the expenses. $35 is a reasonable cost, more than reasonable. I have read the blog posts and forum posts complaining about the time associated with getting the expense approved. I have heard people complain about the cost being unreasonable if you are a developer from certain countries. Let's do the math. How much of a productivity boost is Reflector? How many hours do you think it saves you in a typical project? The next question is a little easier if you are a contractor or a consultant, but what is your hourly rate? If you are not a contractor, you can probably figure out an hourly rate. How long does it take to get a return on your investment? The value added proposition is not a difficult one to make. I have read people clamoring that Red Gate sucks and is evil. They complain about broken promises and conflicts of interest. Relax! Red Gate is not evil. The world is not coming to an end. The sun will come up tomorrow. I am sure that Red Gate will come up with options for volume licensing or site licensing for companies that want to get a licensed copy for their entire team. Don't panic, and I am sure that many great improvements are on the horizon. 
Switching the UI to WPF and including a tabbed interface opens up lots of possibilities.
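    Coming back to the value-for-money arithmetic above, a quick worked example (the figures are purely illustrative assumptions, not anything from Red Gate): at a billing rate of $50 an hour, a $35 licence is covered the first time Reflector saves you roughly 42 minutes of digging through undocumented assemblies, so a tool you reach for every week pays for itself almost immediately.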

    Read the article

  • Adaptive Layout for ADF Faces on Tablets

    - by Shay Shmeltzer
    In the 11.1.16 version of Oracle ADF we started adding specific features to the ADF Faces components so they'll work better on iPad tablets. In this entry I'm going to highlight some new capabilities that we have added to the 11.1.2.3 release. (note if you are still on the 11.1.1.* branch - you'll need to wait for 11.1.1.7 to get the features discussed here). The two key additions in the 11.1.2.3 version compared to the 11.1.1.6 features for iPad support include: pagination for tables and adaptive flow layout. The pagination for table is self explanatory, basically since iPad don't support scroll bars, we automatically switch the table component to render with a pagination toolbar that allow you to scroll set of records or directly jump to a specific set. See the image below. The adaptive flow layout takes a bit more explanation. On regular desktops the UI that you usually build for ADF Faces screens is going to use stretch layout - meaning that it stretches to fill the whole area of the browser window. If you resize the browser windoe, the ADF Faces page resizes with it. If your browser window is too small, scroll bars will appear to allow you to scroll to areas that are "hidden". However on an iPad, this is probably not the type of layout you want - you would rather have a flow layout that eliminates scroll bars and instead allows you to scroll down the page. Basically your want the page to be sized based on its content, rather then based on the browser window size. In ADF Faces terminology this can be done with the dimensionsFrom property set to "children". And here comes the tricky part, since in the past(and also today) when you create an ADF Faces page and add a stretchable component to it, the dimensionsFrom property is set to parent by default. This will be true to other layout components you'll add as well. At this point you might be wondering "Does this mean I'll need to go to each of the layout components in my page and modify the dimensionsFrom property value to be children?" ADF Faces to the rescue... To eliminate the need to do this tedious manual changes, we introduced a new web.xml parameter "oracle.adf.view.rich.geometry.DEFAULT_DIMENSIONS" You'll basically add the following to your web.xml <context-param>    <description>      This parameter controls the default value for component geometry on the page.      Supported values are:        legacy - component attributes use the default values as specified for the attributes                 in the tag documentation (default value)        auto   - component attributes use the correct default value given the value of their                 parent component. For example, with this setting, the panelStretchLayout                 will use "auto" as the default value for its "dimensionsFrom" attribute                 instead of "parent".    </description>    <param-name>oracle.adf.view.rich.geometry.DEFAULT_DIMENSIONS</param-name>    <param-value>auto</param-value>  </context-param> Once you set this parameter, you only need to set the dimensionsFrom attribute for the top level layout component on your page, and the rest of the components will adjust accordingly. One trick that you can use, and that is used in the demo below, is to have the dimensionsFrom property depend on the type of client that access your application. This way you can switch between stretch or flow layout based on the device accessing your application. 
    For example, I use the following in my page:

        <af:panelStretchLayout topHeight="70px" startWidth="0px" endWidth="0px"
                               dimensionsFrom="#{adfFacesContext.agent.capabilities['touchScreen'] eq 'none' ? 'parent' : 'children' }">

    which results in a flow layout for iPads and a stretch layout for regular browsers. Check out the result in the demo below.

    Read the article

  • ADF version of "Modern" dialog windows

    - by Martin Deh
    It is no surprise with the popularity of the i-devices (iphone, ipad), that many of the iOS UI based LnF (look and feel) would start to inspire web designers to incorporate the same LnF into their web sites.  Take for example, a normal dialog popup.  In the iOS world, the LnF becomes a bit more elegant by add just a simple element as a "floating" close button: In this blog post, I will describe how this can be accomplished using OOTB ADF components and CSS3 style elements. There are two ways that this can be achieved.  The easiest way is to simply replace the default image, which looks like this, and adjust the af|panelWindow:close-icon-style skin selector.   Using this simple technique, you can come up with this: The CSS code to produce this effect is pretty straight forward: af|panelWindow.test::close-icon-style{    background-image: url("../popClose.gif");    line-height: 10px;    position: absolute;    right: -10px;    top: -10px;    height:38px;    width:38px;    outline:none; } You can see from the CSS, the position of the region, which holds the image, is relocated based on the position based attributes.  Also, the addition of the "outline" attribute removes the border that is visible in Chrome and IE.  The second example, is based on not having an image to produce the close button.  Like the previous sample, I will use the OOTB panelWindow.  However, this time I will use a OOTB commandButton to replace the image.  The construct of the components looks like this: The commandButton is positioned first in the hierarchy making the re-positioning easier.  The commandButton will also need a style class assigned to it (i.e. closeButton), which will allow for the positioning and the over-riding of the default skin attributes of a default button.  In addition, the closeIconVisible property is set to false, since the default icon is no longer needed.  Once this is done, the rest is in the CSS.  Here is the sample that I created that was used for an actual customer POC: The CSS code for the button: af|commandButton.closeButton, af|commandButton.closeButton af|commandButton:text-only{     line-height: 10px;     position: absolute;     right: -10px;     top: -10px;     -webkit-border-radius: 70px;     -moz-border-radius: 70px;     -ms-border-radius: 70px;     border-radius: 70px;     background-image:none;     border:#828c95 1px solid;     background-color:black;     font-weight: bold;     text-align: center;     text-decoration: none;     color:white;     height:30px;     width:30px;     outline:none; } The CSS uses the border radius to create the round effect on the button (in IE 8, since border-radius is not supported, this will only work with some added code). Also, I add the box-shadow attribute to the panelWindow style class to give it a nice shadowing effect.

    Read the article

  • ADF Reusable Artefacts

    - by Arda Eralp
    Primary reusable ADF Business Component: Entity Objects (EOs) View Objects (VOs) Application Modules (AMs) Framework Extensions Classes Primary reusable ADF Controller: Bounded Task Flows (BTFs) Task Flow Templates Primary reusable ADF Faces: Page Templates Skins Declarative Components Utility Classes Certain components will often be used more than once. Whether the reuse happens within the same application, or across different applications, it is often advantageous to package these reusable components into a library that can be shared between different developers, across different teams, and even across departments within an organization. In the world of Java object-oriented programming, reusing classes and objects is just standard procedure. With the introduction of the model-view-controller (MVC) architecture, applications can be further modularized into separate model, view, and controller layers. By separating the data (model and business services layers) from the presentation (view and controller layers), you ensure that changes to any one layer do not affect the integrity of the other layers. You can change business logic without having to change the UI, or redesign the web pages or front end without having to recode domain logic. Oracle ADF and JDeveloper support the MVC design pattern. When you create an application in JDeveloper, you can choose many application templates that automatically set up data model and user interface projects. Because the different MVC layers are decoupled from each other, development can proceed on different projects in parallel and with a certain amount of independence. ADF Library further extends this modularity of design by providing a convenient and practical way to create, deploy, and reuse high-level components. When you first design your application, you design it with component reusability in mind. If you created components that can be reused, you can package them into JAR files and add them to a reusable component repository. If you need a component, you may look into the repository for those components and then add them into your project or application. For example, you can create an application module for a domain and package it to be used as the data model project in several different applications. Or, if your application will be consuming components, you may be able to load a page template component from a repository of ADF Library JARs to create common look and feel pages. Then you can put your page flow together by stringing together several task flow components pulled from the library. An ADF Library JAR contains ADF components and does not, and cannot, contain other JARs. It should not be confused with the JDeveloper library, Java EE library, or Oracle WebLogic shared library. Reusable Component Description Data Control Any data control can be packaged into an ADF Library JAR. Some of the data controls supported by Oracle ADF include application modules, Enterprise JavaBeans, web services, URL services, JavaBeans, and placeholder data controls. Application Module When you are using ADF Business Components and you generate an application module, an associated application module data control is also generated. When you package an application module data control, you also package up the ADF Business Components associated with that application module. The relevant entity objects, view objects, and associations will be a part of the ADF Library JAR and available for reuse. 
Business Components Business components are the entity objects, view objects, and associations used in the ADF Business Components data model project. You can package business components by themselves or together with an application module. Task Flows & Task Flow Templates Task flows can be packaged into an ADF Library JAR for reuse. If you drop a bounded task flow that uses page fragments, JDeveloper adds a region to the page and binds it to the dropped task flow. ADF bounded task flows built using pages can be dropped onto pages. The drop will create a link to call the bounded task flow. A task flow call activity and control flow will automatically be added to the task flow, with the view activity referencing the page. If there is more than one existing task flow with a view activity referencing the page, it will prompt you to select the one to automatically add a task flow call activity and control flow. If an ADF task flow template was created in the same project as the task flow, the ADF task flow template will be included in the ADF Library JAR and will be reusable. Page Templates You can package a page template and its artifacts into an ADF Library JAR. If the template uses image files and they are included in a directory within your project, these files will also be available for the template during reuse. Declarative Components You can create declarative components and package them for reuse. The tag libraries associated with the component will be included and loaded into the consuming project. You can also package up projects that have several different reusable components if you expect that more than one component will be consumed. For example, you can create a project that has both an application module and a bounded task flow. When this ADF Library JAR file is consumed, the application will have both the application module and the task flow available for use. You can package multiple components into one JAR file, or you can package a single component into a JAR file. Oracle ADF and JDeveloper give you the option and flexibility to create reusable components that best suit you and your organization. You create a reusable component by using JDeveloper to package and deploy the project that contains the components into a ADF Library JAR file. You use the components by adding that JAR to the consuming project. At design time, the JAR is added to the consuming project's class path and so is available for reuse. At runtime, the reused component runs from the JAR file by reference.

    Read the article

  • Clone a VirtualBox Machine

    I just installed VirtualBox, which I want to try out based on recommendations from peers for running a server from within my Windows 7 x64 OS. I've never used VirtualBox, so I'm certainly no expert at it, but I did want to share my experience with it thus far. Specifically, my intention is to create a couple of virtual machines. One I intend to use as a build server, for which a virtual machine makes sense because I can easily move it around as needed if there are hardware issues (it's worth noting my need for setting up a build server at the moment is a result of a disk failure on the old build server). The other VM I want to set up will act as a proxy server for the issue tracking system we're using at Code Project, Axosoft OnTime. They have a Remote Server application for this purpose, and since the OnTime install is 300 miles away from my location, the Remote Server should speed up my use of the OnTime client by limiting the chattiness with the database (at least, that's the hope).

    So, I need two VMs, and I'm lazy. I don't want to have to install the OS and such twice. No problem, it should be simple to clone a VirtualBox machine, or clone a VirtualBox hard drive, right? Well, unfortunately, if you look at the UI for VirtualBox, there's no such command. You're left wondering "How do I clone a VirtualBox machine?" or the slightly related "How do I clone a VirtualBox hard drive?" If you've used VirtualPC, then you know that it's actually pretty easy to copy and move around those VMs. Not quite so easy with VirtualBox. Finding the files is easy; they're located in your user folder within the .VirtualBox folder (possibly within a HardDisks folder). The disks have a .vdi extension and will be pretty large if you've installed anything. The one shown here has just Windows Server 2008 R2 installed on it, nothing else.

    If you copy the .vdi file and rename it, you can use the Virtual Media Manager to view it, and you can create a new machine and choose the new drive to attach to. Unfortunately, if you simply make a copy of the drive, this won't work and you'll get an error that says something to the effect of:

        Cannot register the hard disk PATH with UUID {id goes here} because a hard disk PATH2 with UUID {same id goes here} already exists in the media registry (PATH to XML file).

    There are command line tools you can use to do this in a way that avoids this error. Specifically, the c:\Program Files\Sun\VirtualBox\VBoxManage.exe program is used for all command line access to VirtualBox, and to copy a virtual disk (.vdi file) you would call something like this:

        VBoxManage clonehd Disk1.vdi Disk1_Copy.vdi

    However, in my case this didn't work. I got basically the same error I showed above, along with some debug information for line 628 of VBoxManageDisk.cpp. As my main task was not to debug the C++ code used to write VirtualBox, I continued looking for a simple way to clone a virtual drive. I found it in this blog post.

    The Secret setvdiuuid Command

    VBoxManage has a whole bunch of commands you can use with it; just pass it /? to see the list. However, it also has a special command called internalcommands that opens up access to even more commands. The one that's interesting for us here is the setvdiuuid command. By calling this command and passing in the file path to your .vdi file, it will reset the UUID to a new (random, apparently) UUID. This then allows the Virtual Media Manager to cope with the file, and lets you set up new machines that reference the newly UUID'd virtual drive.
    The full command line would be:

        VBoxManage internalcommands setvdiuuid MyCopy.vdi

    The following screenshot shows the error when trying clonehd as well as the successful use of setvdiuuid.

    Summary

    Now that I can clone machines easily, it's a simple matter to set up base builds of any OS I might need, and then fork from there as needed. Hopefully the GUI for VirtualBox will be improved to include better support for copying machines/disks, as this is, I'm sure, a very common scenario.
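    Since the clone-then-reset sequence is only two commands, it is also easy to script from .NET. Below is a minimal, hypothetical C# sketch; the VBoxManage path and the disk file names come from the post above, but the wrapper itself is just an illustration, not something the author provides.

        using System.Diagnostics;

        class CloneVdi
        {
            static void Main()
            {
                // Install location used in the post; adjust for your machine.
                const string vboxManage = @"c:\Program Files\Sun\VirtualBox\VBoxManage.exe";

                // Step 1: copy the virtual disk.
                Run(vboxManage, "clonehd Disk1.vdi Disk1_Copy.vdi");

                // Step 2: give the copy a fresh UUID so the Virtual Media Manager will accept it.
                Run(vboxManage, "internalcommands setvdiuuid Disk1_Copy.vdi");
            }

            static void Run(string fileName, string arguments)
            {
                using (var process = Process.Start(new ProcessStartInfo
                {
                    FileName = fileName,
                    Arguments = arguments,
                    UseShellExecute = false
                }))
                {
                    process.WaitForExit();
                }
            }
        }

    Running it once produces the renamed, re-UUID'd copy that the Virtual Media Manager is then happy to register.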

    Read the article

  • UK OUG Conference Highlights and Insights

    - by Richard Bingham
    As per my preemptive post, this was the first time the annual conference organized by the UK Oracle User Group (UKOUG) was split into two events, one for Oracle Applications and another in December for Oracle Technology. Apps13, as it was branded, was hailed as a success, with over 1000 registered attendees and three days of sessions, exhibition, round-tables and many other types of content. As this poster on their stand illustrates, the UKOUG is a strong community with popular participants from both big and small Oracle partners and customers. The venue was also a more intimate setting than in previous years, allowing everyone to casually bump into the people they hoped to. It gave a real feeling of an Apps Community. The main themes over the days were CRM and Customer Experience, HCM, and FIN/SCM, which allowed people to attend just one focused day if they wanted. In addition the Apps Transformation stream ran across all three days, offering insights, advice, and details on the newer product solutions like Fusion Applications.
    Here are some of the key take-aways I got from the conference, specific to my role in Fusion Applications Developer Relations:
    User Experience continues to be a significant reason for adopting some of the newer application products available, with immediately obvious gains in user productivity and satisfaction reported by customers. This doesn't stop with the baked-in UX either, with the Design Patterns proving popular and currently being extended to cover things like extending with ADF Mobile and customizing the Simplified UI. More on this to come from us soon.
    The executive sessions emphasized the "it's a journey" phrase, illustrating that modern business applications are powered by technologies such as Cloud, Mobile, Social and Big Data, and that these can be harnessed to help propel your organization forward. Indeed the emphasis is away from the traditional vendor-prescribed linear applications road map, and towards plotting a course based on business priorities supported by a broad range of integrated solutions. To help with this, several conference sessions demoed the new "Applications Navigator" tool, developed in partnership with OUG members, which offers a visual framework to help organizations plan their Oracle Applications investments around business and technology imperatives. Initial reaction was positive, especially as it means customers do not need to decipher Oracle's huge product catalog, and it embeds the best blend of proven and integrated applications solutions. We'll share more on this when it is generally available.
    Several sessions focused on explanations and interpretation of Oracle OpenWorld 2013, helping highlight the key Oracle Applications messages and directions. With a relatively small percentage of conference attendees also at OpenWorld (from a show of hands), this was a popular way to distill the available information down into specific items of interest for the community. Please note the original OpenWorld 2013 content is still available for download but will not remain available forever (via the Oracle website OpenWorld Content Catalog > pick a session > see the PDF download).
    With the release of E-Business Suite 12.2, the move to develop and deploy on the Fusion Middleware stack becomes a reality for many Oracle Applications customers.
    This, coupled with recent E-Business Suite features such as the Integrated SOA Gateway and the E-Business Suite SDK for Java, illustrates how the gap between the technologies and techniques involved in extending E-Business Suite and Fusion Applications is quickly narrowing. We'll see this merging continue to evolve going forwards.
    Getting started with Oracle Cloud Applications is actually easier than many customers expected, with a broad selection of both large and medium-sized organizations explaining how they added new features to their existing Oracle Applications portfolios. New functionality available from Fusion HCM and CX makes for popular extensions that do not have to disrupt those core business services. Coexistence is the buzzword here, and the available integration is also simpler than many expected, commonly involving an initial setup data load followed by regular incremental synchronizations, often without a need for constant real-time communication between systems. With much of this pre-built already, the implementation process is also quite rapid.
    With most people dressed in suits, we wanted to get the conversations going without the traditional English reserve, so we decided to make ourselves a bit more obvious, as the photo below shows. This seemed to be quite successful and helped those interested identify and approach us. Keep a lookout for something similar again. In fact, if you're in the UK there is an "Apps Transformation Day" planned by the UKOUG for the 19th March 2014, with more details to follow. Again, something we'll be sure to participate in.
    I am hoping to attend the second half of the UKOUG annual conference, Tech13, which focuses more on Oracle technology and where there is likely to be a larger attendance of those interested in the lower-level aspects of applications customization and development. If you're going, let me know and maybe we can meet up.

    Read the article

  • Building a Repository Pattern against an EF 5 EDMX Model - Part 1

    - by Juan
    I am part of a year-long-plus project that is re-writing an existing application for a client. We have decided to develop the project using Visual Studio 2012 and .NET 4.5. The project will be using a number of technologies and patterns, including Entity Framework 5, WCF services, and WPF for the client UI. This is my attempt at documenting some of the successes and failures that I will be coming across in the development of the application.
    In building the data access layer we have to access a database that has already been designed by a dedicated DBA. The DBA insists on using stored procedures, which has made the use of EF a little more difficult. He will not allow direct table access, but we did manage to get him to allow us to use views. Since EF 5 does not have good support for Code First with stored procedures, my option was to create a model (EDMX) against the existing database views. I then had to select each entity and map the Insert/Update/Delete functions to their respective stored procedures.
    The next step, after I had completed mapping the stored procedures to the entities in the EDMX model, was to figure out how to build a generic repository that would work well with Entity Framework 5. After reading the blog posts below, I adopted much of their code with some changes to allow for the use of Ninject for dependency injection.
    http://www.tcscblog.com/2012/06/22/entity-framework-generic-repository/
    http://www.tugberkugurlu.com/archive/generic-repository-pattern-entity-framework-asp-net-mvc-and-unit-testing-triangle
    IRepository.cs
    public interface IRepository<T> : IDisposable where T : class
    {
        void Add(T entity);
        void Update(T entity, int id);
        T GetById(object key);
        IQueryable<T> Query(Expression<Func<T, bool>> predicate);
        IQueryable<T> GetAll();
        int SaveChanges();
        int SaveChanges(bool validateEntities);
    }
    GenericRepository.cs
    public abstract class GenericRepository<T> : IRepository<T> where T : class
    {
        public abstract void Add(T entity);
        public abstract void Update(T entity, int id);
        public abstract T GetById(object key);
        public abstract IQueryable<T> Query(Expression<Func<T, bool>> predicate);
        public abstract IQueryable<T> GetAll();

        public int SaveChanges()
        {
            return SaveChanges(true);
        }

        public abstract int SaveChanges(bool validateEntities);
        public abstract void Dispose();
    }
    One of the issues I ran into was trying to do an update. I kept receiving errors, so I posted a question on Stack Overflow http://stackoverflow.com/questions/12585664/an-object-with-the-same-key-already-exists-in-the-objectstatemanager-the-object and came up with the following hack. If someone has a better way, please let me know.
    DbContextRepository.cs
    public class DbContextRepository<T> : GenericRepository<T> where T : class
    {
        protected DbContext Context;
        protected DbSet<T> DbSet;

        public DbContextRepository(DbContext context)
        {
            if (context == null) throw new ArgumentException("context");
            Context = context;
            DbSet = Context.Set<T>();
        }

        public override void Add(T entity)
        {
            if (entity == null) throw new ArgumentException("Cannot add a null entity.");
            DbSet.Add(entity);
        }

        public override void Update(T entity, int id)
        {
            if (entity == null) throw new ArgumentException("Cannot update a null entity.");
            var entry = Context.Entry(entity);
            if (entry.State == EntityState.Detached)
            {
                var attachedEntity = DbSet.Find(id); // Need to have access to key
                if (attachedEntity != null)
                {
                    var attachedEntry = Context.Entry(attachedEntity);
                    attachedEntry.CurrentValues.SetValues(entity);
                }
                else
                {
                    entry.State = EntityState.Modified; // This should attach entity
                }
            }
        }

        public override T GetById(object key)
        {
            return DbSet.Find(key);
        }

        public override IQueryable<T> Query(Expression<Func<T, bool>> predicate)
        {
            return DbSet.Where(predicate);
        }

        public override IQueryable<T> GetAll()
        {
            return Context.Set<T>();
        }

        public override int SaveChanges(bool validateEntities)
        {
            Context.Configuration.ValidateOnSaveEnabled = validateEntities;
            return Context.SaveChanges();
        }

        #region IDisposable implementation
        public override void Dispose()
        {
            if (Context != null)
            {
                Context.Dispose();
                GC.SuppressFinalize(this);
            }
        }
        #endregion IDisposable implementation
    }
    At this point I am able to start creating the individual repositories that are needed and add a Unit of Work. Stay tuned for the next installment in my path to creating a Repository Pattern against EF5.
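    As a small preview of where this is heading, here is a rough sketch of what a concrete repository and its Ninject registration might look like. The Product entity, the MyEntities context and the module name below are placeholders for illustration only, not code from the actual project:
    using System.Data.Entity;
    using Ninject.Modules;

    // A concrete repository simply closes the generic type over an entity from the EDMX model.
    public class ProductRepository : DbContextRepository<Product>
    {
        public ProductRepository(DbContext context) : base(context) { }
    }

    // Ninject module wiring the context and repository together.
    public class RepositoryModule : NinjectModule
    {
        public override void Load()
        {
            // MyEntities would be the DbContext generated from the EDMX model.
            Bind<DbContext>().To<MyEntities>();
            Bind<IRepository<Product>>().To<ProductRepository>();
        }
    }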

    Read the article

  • Adaptive Connections For ADFBC

    - by Duncan Mills
    Some time ago I wrote an article on Adaptive Bindings showing how the pageDef for an ADF UI does not have to be wedded to a fixed data control or collection / View Object. This article has proved pretty popular, so as a follow-up I wanted to cover another "Adaptive" feature of your ADF applications: the ability to make multiple different connections from an Application Module, at runtime. Now, I'm sure you'll be aware that if you define your application to use a data-source rather than a hard-coded JDBC connection string, then you have the ability to change the target of that data-source after deployment to point to a different database. So that's great, but the reality of that is that this single connection is effectively fixed within the application, right? Well no, this, it turns out, is a common misconception. To be clear, yes, a single instance of an ADF Application Module is associated with a single connection, but there is nothing to stop you from creating multiple instances of the same Application Module within the application, all pointing at different connections. In fact this has been possible for a long time using a custom extension point with code that extends oracle.jbo.http.HttpSessionCookieFactory. This approach, however, involves writing code, and no-one likes to write any more code than they need to, so is there an easier way? Yes indeed. It is in fact a little-publicized feature that's available in all versions of 11g: the ELEnvInfoProvider.
    What Does it Do?
    The ELEnvInfoProvider is a pre-existing class (the full path is oracle.jbo.client.ELEnvInfoProvider) which you can plug into your ApplicationModule configuration using the jbo.envinfoprovider property. Visually you can set this in the editor, or you can also set it directly in the bc4j.xcfg (see below for an example). Once you have plugged in this envinfoprovider, here's the fun bit: rather than defining the hard-coded name of a data source, you can plug in an EL expression for the connection to use. So what's the benefit of that? Well, it allows you to defer the selection of a connection until the point in time that you instantiate the AM. To define the expression itself you'll need to do a couple of things: First of all you'll need a managed bean of some sort, e.g. a session-scoped bean defined in your ViewController project. This will need a getter method that returns the name of the connection. Now this connection itself needs to be defined in your Application Server, and can be managed through Enterprise Manager, WLST or through MBeans. (You may need to read the documentation [http://docs.oracle.com/cd/E28280_01/web.1111/b31974/deployment_topics.htm#CHDJGBDD] here on how to configure connections at runtime if you're not familiar with this.) The EL expression (e.g. ${connectionManager.connection}) is then defined in the configuration by editing the bc4j.xcfg file (there is a hyperlink directly to this file on the configuration editing screen in the Application Module editor). You simply replace the hardcoded JDBCName value with the expression.
    So your cfg file would end up looking something like this (notice the reference to the ELEnvInfoProvider that I talked about earlier):
    <BC4JConfig version="11.1" xmlns="http://xmlns.oracle.com/bc4j/configuration">
      <AppModuleConfigBag ApplicationName="oracle.demo.model.TargetAppModule">
        <AppModuleConfig DeployPlatform="LOCAL"
                         JDBCName="${connectionManager.connection}"
                         jbo.project="oracle.demo.model.Model"
                         name="TargetAppModuleLocal"
                         ApplicationName="oracle.demo.model.TargetAppModule">
          <AM-Pooling jbo.doconnectionpooling="true"/>
          <Database jbo.locking.mode="optimistic"/>
          <Security AppModuleJndiName="oracle.demo.model.TargetAppModule"/>
          <Custom jbo.envinfoprovider="oracle.jbo.client.ELEnvInfoProvider"/>
        </AppModuleConfig>
      </AppModuleConfigBag>
    </BC4JConfig>
    Still Don't Quite Get It?
    So far you might be thinking, well that's fine, but what difference does it make if the connection is resolved "just in time" rather than up front and changed as required through Enterprise Manager? Well, a trivial example would be where you have a single application deployed to your application server, but for different users you want to connect to different databases. Because the evaluation of the connection is deferred until you first reference the AM, you have a decision point that can take the user identity into account. However, think about it for a second. Under what circumstances does a new AM get instantiated? At the first reference of the AM within the application, yes, but also whenever a Task Flow is entered, if the data control scope for the Task Flow is ISOLATED. So the reality is that on a single screen you can embed multiple Task Flows, all of which point at different database connections concurrently. Hopefully you'll find this feature useful, let me know...
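    If it helps, the managed bean side of this can be as simple as the sketch below. It would be registered in the ViewController project as a session-scoped bean named connectionManager, so that ${connectionManager.connection} resolves to its getConnection() method. The bean name, the data source names and the selection logic are all just illustrative placeholders:
    // Session-scoped managed bean sketch; "connectionManager" is the name used in the EL expression above.
    public class ConnectionManager {

        // Returns the name of the connection the new Application Module instance should use.
        // The decision logic here is a placeholder - in practice it might inspect the current
        // user, their role, or some other piece of session state.
        public String getConnection() {
            boolean useAlternateDatabase = false; // placeholder decision
            return useAlternateDatabase ? "jdbc/AlternateTargetDS" : "jdbc/TargetAppDS";
        }
    }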

    Read the article

  • client problems - misaligned expectations & not following SDLC protocols

    - by louism
    hi guys, im having some serious problems with a client on a project - i could use some advice please the short version i have been working with this client now for almost 6 months without any problems (a classified website project in the range of 500 hours) over the last few days things have drastically deteriorated to the point where ive had to place the project on-hold whilst i work-out what to do (this has pissed the client off even more) to be simplistic, the root cause of the issue is this: the client doesnt read the specs i make for him, i code the feature, he than wants to change things, i tell him its not to the agreed spec and that that change will have to be postponed and possibly charged for, he gets upset and rants saying 'hes paid for the feature' and im not keeping to the agreement (<- misalignment of expectations) i think the root cause of the root cause is my clients failure to take my SDLC protocols seriously. i have a bug tracking system in place which he practically refuses to use (he still emails me bugs), he doesnt seem to care to much for the protocols i use for dealing with scope creep and change control the whole situation came to a head recently where he 'cracked it' (an aussie term for being fed-up). the more terms like 'postponed for post-launch implementation', 'costed feature addition', and 'not to agreed spec' i kept using, the worse it got finally, he began to bully me - basically insisting i shut-up and do the work im being paid for. i wrote a long-winded email explaining how wrong he was on all these different points, and explaining what all the SDLC protocols do to protect the success of the project. than i deleted that email and wrote a new one in the new email, i suggested as a solution i write up a list of grievances we both had. we than review the list and compromise on different points: he gets some things he wants, i get some things i want. sometimes youve got to give ground to get ground his response to this suggestion was flat-out refusal, and a restatement that i should just get on with the work ive been paid to do so there you have the very subjective short version. if you have the time and inclination, the long version may be a little less bias as it has the email communiques between me and my client the long version (with background) the long version works by me showing you the email communiques which lead to the situation coming to a head. so here it is, judge for yourself where the trouble started... 1. client asked me why something was missing from a feature i just uploaded, my response was to show him what was in the spec: it basically said the item he was looking for was never going to be included 2. [clients response...] Memo Louis, We are following your own title fields and keeping a consistent layout. Why the big fuss about not adding "Part". It simply replaces "model" and is consistent with your current title fields. 3. [my response...] hi [client], the 'part' field appeared to me as a redundancy / mistake. 
i requested clarification but never received any in a timely manner (about 2 weeks ago) the specification for this feature also indicated it wasnt going to be included: RE: "Why the big fuss about not adding "Part" " it may not appear so, but it would actually be a lot of work for me to now add a 'Part' field it could take me up to 15-20 minutes to properly explain why its such a big undertaking to do this, but i would prefer to use that time instead to work on completing your v1.1 features as a simplistic explanation - it connects to the change in paradigm from a 'generic classified ad' model to a 'specific attributes for specific categories' model basically, i am saying it is a big fuss, but i understand that it doesnt look that way - after all, it is just one ity-bitty field :) if you require a fuller explanation, please let me know and i will commit the time needed to write that out also, if you recall when we first started on the project, i said that with the effort/time required for features, you would likely not know off the top of your head. you may think something is really complex, but in reality its quite simple, you might think something is easy - but it could actually be a massive trauma to code (which is the case here with the 'Part' field). if you also recalled, i said the best course of action is to just ask, and i would let you know on a case-by-case basis 4. [email from me to client...] hi [client], the online catalogue page is now up live (see my email from a few days ago for information on how it works) note: the window of opportunity for input/revisions on what data the catalogue stores has now closed (as i have put the code up live now) RE: the UI/layout of the online catalogue page you may still do visual/ui tweaks to the page at the moment (this window for input/revisions will close in a couple of days time) 5. [email from client to me...] *(note: i had put up the feature & asked the client to review it, never heard back from them for a few days)* Memo Louis, Here you go again. CLOSED without a word of input from the customer. I don't think so. I will reply tomorrow regarding the content and functionality we require from this feature. 5. [from me to client...] hi [client]: RE: from my understanding, you are saying that the mini-sale yard control would change itself based on the fact someone was viewing for parts & accessories <- is that correct? this change is outside the scope of the v1.1 mini-spec and therefore will need to wait 'til post launch for costing/implementation 6. [email from client to me...] Memo Louis, Following your v1.1 mini-spec and all your time paid in full for the work selected. We need to make the situation clear. There will be no further items held for post-launch. Do not expect us to pay for any further items other than those we have agreed upon. You have undertaken to complete the Parts and accessories feature as follows. Obviously, as part of this process the "mini search" will be effected, and will require "adaption to make sense". 7. [email from me to client...] hi [client], RE: "There will be no further items held for post-launch. Do not expect us to pay for any further items other than those we have agreed upon." a few points to consider: 1) the specification for the 'parts & accessories' feature was as follows: (i.e. [what] "...we have agreed upon.") 2) you have received the 'parts & accessories' feature free of charge (you have paid $0 for it). 
ive spent two days coding that feature as a gesture of good will i would request that you please consider these two facts carefully and sincerely 8. [email from client to me...] Memo Louis, I don't see how you are giving us anything for free. From your original fee proposal you have deleted more than 30 hours of included features. Your title "shelved features". Further you have charged us twice by adding back into the site, at an addition cost, some of those "shelved features" features. See v1.1 mini-spec. Did include in your original fee proposal a change request budget but then charge without discussion items included in v1.1 mini-spec. Included a further Features test plan for a regression test, a fee of 10 hours that would not have been required if the "shelved features" were not left out of the agreed fee proposal. I have made every attempt to satisfy your your uneven business sense by offering you everything your heart desired, in the v1.1 mini-spec, to be left once again with your attitude of "its too hard, lets leave it for post launch". I am no longer accepting anything less than what we have contracted you to do. That is clearly defined in v1.1 mini-spec, and you are paid in advance for delivering those items as an acceptable function. a few notes about the above email... i had to cull features from the original spec because it didnt fit into the budget. i explained this to the client at the start of the project (he wanted more features than he had budget hours to do them all) nothing has been charged for twice, i didnt charge the client for culled features. im charging him to now do those culled features the draft version of the project schedule included a change request budget of 10 hours, but i had to remove that to meet the budget (the client may not have been aware of this to be fair to them) what the client refers to as my attitude of 'too hard/leave it for post-launch', i called a change request protocol and a method for keeping scope creep under control 9. [email from me to client...] hi [client], RE: "...all your grievances..." i had originally written out a long email response; it was fantastic, it had all these great points of how 'you were wrong' and 'i was right', you would of loved it (and by 'loved it', i mean it would of just infuriated you more) so, i decided to deleted it start over, for two reasons: 1) a long email is being disrespectful of your time (youre a busy businessman with things to do) 2) whos wrong or right gets us no closer to fixing the problems we are experiencing what i propose is this... i prepare a bullet point list of your grievances and my grievances (yes, im unhappy too about how things are going - and it has little to do with money) i submit this list to you for you to add to as necessary we then both take a good hard look at this list, and we decide which areas we are willing to give ground on as an example, the list may look something like this: "louis, you keep taking away features you said you would do" [your grievance 2] [your grievance 3] [your grievance ...] "[client], i feel you dont properly read the specs i prepare for you..." [my grievance 2] [my grievance 3] [my grievance ...] if you are willing to give this a try, let me know will it work? who knows. but if it doesnt, we can always go back to arguing some more :) obviously, this will only work if you are willing to give it a genuine try, and you can accept that you may have to 'give some ground to get some ground' what do you think? 10. [email from client to me ...] 
Memo Louis, Instead of wasting your time listing grievances, I would prefer you complete the items in v1.1 mini-spec, to a satisfactory conclusion. We almost had the website ready for launch until you brought the v1.1 mini-spec into the frame. Obviously I expected you could complete the v1.1 mini-spec in a two-week time frame as you indicated and give the site a more profession presentation. Most of the problems have been caused by you not following our instructions, but deciding to do what you feel like at the time. And then arguing with us how the missing information is not necessary. For instance "Parts and Accessories". Why on earth would you leave out the parts heading, when it ties-in with the fields you have already developed. It replaces "model" and is just as important in the context of information that appears in the "Details" panel. We are at a stage where the the v1.1 mini-spec needs to be completed without further time wasting and the site is complete (subject to all features working). We are on standby at this end to do just that. Let me know when you are back, working on the site and we will process and complete each v1.1 mini-spec, item by item, until the job is complete. 11. [last email from me to client...] hi [client], based on this reply, and your demonstrated unwillingness to compromise/give any ground on issues at hand, i have decided to place your project on-hold for the moment i will be considering further options on how to over-come our challenges over the next few days i will contact you by monday 17/may to discuss any new options i have come up with, and if i believe it is appropriate to restart work on your project at that point or not told you it was long... what do you think?

    Read the article

  • JGoodies HashMap

    - by JohnMcClane
    Hi, I'm trying to build a chart program using presentation model. Using JGoodies for data binding was relatively easy for simple types like strings or numbers. But I can't figure out how to use it on a hashmap. I'll try to explain how the chart works and what my problem is: A chart consists of DataSeries, a DataSeries consists of DataPoints. I want to have a data model and to be able to use different views on the same model (e.g. bar chart, pie chart,...). Each of them consists of three classes. For example: DataPointModel: holds the data model (value, label, category) DataPointViewModel: extends JGoodies PresentationModel. wraps around DataPointModel and holds view properties like font and color. DataPoint: abstract class, extends JComponent. Different Views must subclass and implement their own ui. Binding and creating the data model was easy, but i don't know how to bind my data series model. package at.onscreen.chart; import java.beans.PropertyChangeListener; import java.beans.PropertyChangeSupport; import java.beans.PropertyVetoException; import java.util.Collection; import java.util.HashMap; import java.util.Iterator; public class DataSeriesModel { public static String PROPERTY_DATAPOINT = "dataPoint"; public static String PROPERTY_DATAPOINTS = "dataPoints"; public static String PROPERTY_LABEL = "label"; public static String PROPERTY_MAXVALUE = "maxValue"; /** * holds the data points */ private HashMap dataPoints; /** * the label for the data series */ private String label; /** * the maximum data point value */ private Double maxValue; /** * the model supports property change notification */ private PropertyChangeSupport propertyChangeSupport; /** * default constructor */ public DataSeriesModel() { this.maxValue = Double.valueOf(0); this.dataPoints = new HashMap(); this.propertyChangeSupport = new PropertyChangeSupport(this); } /** * constructor * @param label - the series label */ public DataSeriesModel(String label) { this.dataPoints = new HashMap(); this.maxValue = Double.valueOf(0); this.label = label; this.propertyChangeSupport = new PropertyChangeSupport(this); } /** * full constructor * @param label - the series label * @param dataPoints - an array of data points */ public DataSeriesModel(String label, DataPoint[] dataPoints) { this.dataPoints = new HashMap(); this.propertyChangeSupport = new PropertyChangeSupport(this); this.maxValue = Double.valueOf(0); this.label = label; for (int i = 0; i < dataPoints.length; i++) { this.addDataPoint(dataPoints[i]); } } /** * full constructor * @param label - the series label * @param dataPoints - a collection of data points */ public DataSeriesModel(String label, Collection dataPoints) { this.dataPoints = new HashMap(); this.propertyChangeSupport = new PropertyChangeSupport(this); this.maxValue = Double.valueOf(0); this.label = label; for (Iterator it = dataPoints.iterator(); it.hasNext();) { this.addDataPoint(it.next()); } } /** * adds a new data point to the series. if the series contains a data point with same id, it will be replaced by the new one. 
* @param dataPoint - the data point */ public void addDataPoint(DataPoint dataPoint) { String category = dataPoint.getCategory(); DataPoint oldDataPoint = this.getDataPoint(category); this.dataPoints.put(category, dataPoint); this.setMaxValue(Math.max(this.maxValue, dataPoint.getValue())); this.propertyChangeSupport.firePropertyChange(PROPERTY_DATAPOINT, oldDataPoint, dataPoint); } /** * returns the data point with given id or null if not found * @param uid - the id of the data point * @return the data point or null if there is no such point in the table */ public DataPoint getDataPoint(String category) { return this.dataPoints.get(category); } /** * removes the data point with given id from the series, if present * @param category - the data point to remove */ public void removeDataPoint(String category) { DataPoint dataPoint = this.getDataPoint(category); this.dataPoints.remove(category); if (dataPoint != null) { if (dataPoint.getValue() == this.getMaxValue()) { Double maxValue = Double.valueOf(0); for (Iterator it = this.iterator(); it.hasNext();) { DataPoint itDataPoint = it.next(); maxValue = Math.max(itDataPoint.getValue(), maxValue); } this.setMaxValue(maxValue); } } this.propertyChangeSupport.firePropertyChange(PROPERTY_DATAPOINT, dataPoint, null); } /** * removes all data points from the series * @throws PropertyVetoException */ public void removeAll() { this.setMaxValue(Double.valueOf(0)); this.dataPoints.clear(); this.propertyChangeSupport.firePropertyChange(PROPERTY_DATAPOINTS, this.getDataPoints(), null); } /** * returns the maximum of all data point values * @return the maximum of all data points */ public Double getMaxValue() { return this.maxValue; } /** * sets the max value * @param maxValue - the max value */ protected void setMaxValue(Double maxValue) { Double oldMaxValue = this.getMaxValue(); this.maxValue = maxValue; this.propertyChangeSupport.firePropertyChange(PROPERTY_MAXVALUE, oldMaxValue, maxValue); } /** * returns true if there is a data point with given category * @param category - the data point category * @return true if there is a data point with given category, otherwise false */ public boolean contains(String category) { return this.dataPoints.containsKey(category); } /** * returns the label for the series * @return the label for the series */ public String getLabel() { return this.label; } /** * returns an iterator over the data points * @return an iterator over the data points */ public Iterator iterator() { return this.dataPoints.values().iterator(); } /** * returns a collection of the data points. the collection supports removal, but does not support adding of data points. 
* @return a collection of data points */ public Collection getDataPoints() { return this.dataPoints.values(); } /** * returns the number of data points in the series * @return the number of data points */ public int getSize() { return this.dataPoints.size(); } /** * adds a PropertyChangeListener * @param listener - the listener */ public void addPropertyChangeListener(PropertyChangeListener listener) { this.propertyChangeSupport.addPropertyChangeListener(listener); } /** * removes a PropertyChangeListener * @param listener - the listener */ public void removePropertyChangeListener(PropertyChangeListener listener) { this.propertyChangeSupport.removePropertyChangeListener(listener); } } package at.onscreen.chart; import java.beans.PropertyVetoException; import java.util.Collection; import java.util.Iterator; import com.jgoodies.binding.PresentationModel; public class DataSeriesViewModel extends PresentationModel { /** * default constructor */ public DataSeriesViewModel() { super(new DataSeriesModel()); } /** * constructor * @param label - the series label */ public DataSeriesViewModel(String label) { super(new DataSeriesModel(label)); } /** * full constructor * @param label - the series label * @param dataPoints - an array of data points */ public DataSeriesViewModel(String label, DataPoint[] dataPoints) { super(new DataSeriesModel(label, dataPoints)); } /** * full constructor * @param label - the series label * @param dataPoints - a collection of data points */ public DataSeriesViewModel(String label, Collection dataPoints) { super(new DataSeriesModel(label, dataPoints)); } /** * full constructor * @param model - the data series model */ public DataSeriesViewModel(DataSeriesModel model) { super(model); } /** * adds a data point to the series * @param dataPoint - the data point */ public void addDataPoint(DataPoint dataPoint) { this.getBean().addDataPoint(dataPoint); } /** * returns true if there is a data point with given category * @param category - the data point category * @return true if there is a data point with given category, otherwise false */ public boolean contains(String category) { return this.getBean().contains(category); } /** * returns the data point with given id or null if not found * @param uid - the id of the data point * @return the data point or null if there is no such point in the table */ public DataPoint getDataPoint(String category) { return this.getBean().getDataPoint(category); } /** * returns a collection of the data points. the collection supports removal, but does not support adding of data points. 
* @return a collection of data points */ public Collection getDataPoints() { return this.getBean().getDataPoints(); } /** * returns the label for the series * @return the label for the series */ public String getLabel() { return this.getBean().getLabel(); } /** * sets the max value * @param maxValue - the max value */ public Double getMaxValue() { return this.getBean().getMaxValue(); } /** * returns the number of data points in the series * @return the number of data points */ public int getSize() { return this.getBean().getSize(); } /** * returns an iterator over the data points * @return an iterator over the data points */ public Iterator iterator() { return this.getBean().iterator(); } /** * removes all data points from the series * @throws PropertyVetoException */ public void removeAll() { this.getBean().removeAll(); } /** * removes the data point with given id from the series, if present * @param category - the data point to remove */ public void removeDataPoint(String category) { this.getBean().removeDataPoint(category); } } package at.onscreen.chart; import java.beans.PropertyChangeEvent; import java.beans.PropertyChangeListener; import java.beans.PropertyVetoException; import java.util.Collection; import java.util.Iterator; import javax.swing.JComponent; public abstract class DataSeries extends JComponent implements PropertyChangeListener { /** * the model */ private DataSeriesViewModel model; /** * default constructor */ public DataSeries() { this.model = new DataSeriesViewModel(); this.model.addPropertyChangeListener(this); this.createComponents(); } /** * constructor * @param label - the series label */ public DataSeries(String label) { this.model = new DataSeriesViewModel(label); this.model.addPropertyChangeListener(this); this.createComponents(); } /** * full constructor * @param label - the series label * @param dataPoints - an array of data points */ public DataSeries(String label, DataPoint[] dataPoints) { this.model = new DataSeriesViewModel(label, dataPoints); this.model.addPropertyChangeListener(this); this.createComponents(); } /** * full constructor * @param label - the series label * @param dataPoints - a collection of data points */ public DataSeries(String label, Collection dataPoints) { this.model = new DataSeriesViewModel(label, dataPoints); this.model.addPropertyChangeListener(this); this.createComponents(); } /** * full constructor * @param model - the model */ public DataSeries(DataSeriesViewModel model) { this.model = model; this.model.addPropertyChangeListener(this); this.createComponents(); } /** * creates, binds and configures UI components. * data point properties can be created here as components or be painted in paintComponent. 
*/ protected abstract void createComponents(); @Override public void propertyChange(PropertyChangeEvent evt) { this.repaint(); } /** * adds a data point to the series * @param dataPoint - the data point */ public void addDataPoint(DataPoint dataPoint) { this.model.addDataPoint(dataPoint); } /** * returns true if there is a data point with given category * @param category - the data point category * @return true if there is a data point with given category, otherwise false */ public boolean contains(String category) { return this.model.contains(category); } /** * returns the data point with given id or null if not found * @param uid - the id of the data point * @return the data point or null if there is no such point in the table */ public DataPoint getDataPoint(String category) { return this.model.getDataPoint(category); } /** * returns a collection of the data points. the collection supports removal, but does not support adding of data points. * @return a collection of data points */ public Collection getDataPoints() { return this.model.getDataPoints(); } /** * returns the label for the series * @return the label for the series */ public String getLabel() { return this.model.getLabel(); } /** * sets the max value * @param maxValue - the max value */ public Double getMaxValue() { return this.model.getMaxValue(); } /** * returns the number of data points in the series * @return the number of data points */ public int getDataPointCount() { return this.model.getSize(); } /** * returns an iterator over the data points * @return an iterator over the data points */ public Iterator iterator() { return this.model.iterator(); } /** * removes all data points from the series * @throws PropertyVetoException */ public void removeAll() { this.model.removeAll(); } /** * removes the data point with given id from the series, if present * @param category - the data point to remove */ public void removeDataPoint(String category) { this.model.removeDataPoint(category); } /** * returns the data series view model * @return - the data series view model */ public DataSeriesViewModel getViewModel() { return this.model; } /** * returns the data series model * @return - the data series model */ public DataSeriesModel getModel() { return this.model.getBean(); } } package at.onscreen.chart.builder; import java.util.Collection; import net.miginfocom.swing.MigLayout; import at.onscreen.chart.DataPoint; import at.onscreen.chart.DataSeries; import at.onscreen.chart.DataSeriesViewModel; public class BuilderDataSeries extends DataSeries { /** * default constructor */ public BuilderDataSeries() { super(); } /** * constructor * @param label - the series label */ public BuilderDataSeries(String label) { super(label); } /** * full constructor * @param label - the series label * @param dataPoints - an array of data points */ public BuilderDataSeries(String label, DataPoint[] dataPoints) { super(label, dataPoints); } /** * full constructor * @param label - the series label * @param dataPoints - a collection of data points */ public BuilderDataSeries(String label, Collection dataPoints) { super(label, dataPoints); } /** * full constructor * @param model - the model */ public BuilderDataSeries(DataSeriesViewModel model) { super(model); } @Override protected void createComponents() { this.setLayout(new MigLayout()); /* * * I want to add a new BuilderDataPoint for each data point in the model. * I want the BuilderDataPoints to be synchronized with the model. * e.g. 
when a data point is removed from the model, the BuilderDataPoint shall be removed * from the BuilderDataSeries * */ } } package at.onscreen.chart.builder; import javax.swing.JFormattedTextField; import javax.swing.JTextField; import at.onscreen.chart.DataPoint; import at.onscreen.chart.DataPointModel; import at.onscreen.chart.DataPointViewModel; import at.onscreen.chart.ValueFormat; import com.jgoodies.binding.adapter.BasicComponentFactory; import com.jgoodies.binding.beans.BeanAdapter; public class BuilderDataPoint extends DataPoint { /** * default constructor */ public BuilderDataPoint() { super(); } /** * constructor * @param category - the category */ public BuilderDataPoint(String category) { super(category); } /** * constructor * @param value - the value * @param label - the label * @param category - the category */ public BuilderDataPoint(Double value, String label, String category) { super(value, label, category); } /** * full constructor * @param model - the model */ public BuilderDataPoint(DataPointViewModel model) { super(model); } @Override protected void createComponents() { BeanAdapter beanAdapter = new BeanAdapter(this.getModel(), true); ValueFormat format = new ValueFormat(); JFormattedTextField value = BasicComponentFactory.createFormattedTextField(beanAdapter.getValueModel(DataPointModel.PROPERTY_VALUE), format); this.add(value, "w 80, growx, wrap"); JTextField label = BasicComponentFactory.createTextField(beanAdapter.getValueModel(DataPointModel.PROPERTY_LABEL)); this.add(label, "growx, wrap"); JTextField category = BasicComponentFactory.createTextField(beanAdapter.getValueModel(DataPointModel.PROPERTY_CATEGORY)); this.add(category, "growx, wrap"); } } To sum it up: I need to know how to bind a hash map property to JComponent.components property. JGoodies is in my opinion not very well documented, I spent a long time searching through the internet, but I did not find any solution to my problem. Hope you can help me.
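    The closest I have come to a workaround is to keep the child components in sync myself, by listening for the PROPERTY_DATAPOINT events that DataSeriesModel already fires, instead of using a JGoodies binding for the collection. This is only a sketch (untested, and it assumes the model is populated with BuilderDataPoint instances, plus the usual java.beans and java.util imports), so I would still prefer a proper binding solution:
    @Override
    protected void createComponents() {
        this.setLayout(new MigLayout());
        // Add one child component per data point already in the model.
        for (Iterator it = this.iterator(); it.hasNext();) {
            this.add((DataPoint) it.next(), "growx, wrap");
        }
        // Keep the children in sync with subsequent model changes.
        this.getModel().addPropertyChangeListener(new PropertyChangeListener() {
            public void propertyChange(PropertyChangeEvent evt) {
                if (DataSeriesModel.PROPERTY_DATAPOINT.equals(evt.getPropertyName())) {
                    if (evt.getOldValue() != null) {
                        remove((DataPoint) evt.getOldValue());             // replaced or removed data point
                    }
                    if (evt.getNewValue() != null) {
                        add((DataPoint) evt.getNewValue(), "growx, wrap"); // newly added data point
                    }
                    revalidate();
                    repaint();
                }
            }
        });
    }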

    Read the article

  • .NET Winform AJAX Login Services

    - by AdamSane
    I am working on a Windows Form that connects to a ASP.NET membership database and I am trying to use the AJAX Login Service. No matter what I do I keep on getting 404 errors on the Authentication_JSON_AppService.axd call. Web Config Below <?xml version="1.0"?> <!-- Note: As an alternative to hand editing this file you can use the web admin tool to configure settings for your application. Use the Website->Asp.Net Configuration option in Visual Studio. A full list of settings and comments can be found in machine.config.comments usually located in \Windows\Microsoft.Net\Framework\v2.x\Config --> <configuration> <configSections> <sectionGroup name="system.web.extensions" type="System.Web.Configuration.SystemWebExtensionsSectionGroup, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"> <sectionGroup name="scripting" type="System.Web.Configuration.ScriptingSectionGroup, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"> <section name="scriptResourceHandler" type="System.Web.Configuration.ScriptingScriptResourceHandlerSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication"/> <sectionGroup name="webServices" type="System.Web.Configuration.ScriptingWebServicesSectionGroup, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"> <section name="jsonSerialization" type="System.Web.Configuration.ScriptingJsonSerializationSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="Everywhere"/> <section name="profileService" type="System.Web.Configuration.ScriptingProfileServiceSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication"/> <section name="authenticationService" type="System.Web.Configuration.ScriptingAuthenticationServiceSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication"/> <section name="roleService" type="System.Web.Configuration.ScriptingRoleServiceSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication"/> </sectionGroup> </sectionGroup> </sectionGroup> </configSections> <connectionStrings <!-- Removed --> /> <appSettings/> <system.web> <!-- Set compilation debug="true" to insert debugging symbols into the compiled page. Because this affects performance, set this value to true only during development. 
--> <compilation debug="true"> <assemblies> <add assembly="System.Core, Version=3.5.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089"/> <add assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> <add assembly="System.Xml.Linq, Version=3.5.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089"/> <add assembly="System.Data.DataSetExtensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089"/> <add assembly="System.Data.Linq, Version=3.5.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089"/> </assemblies> </compilation> <membership defaultProvider="dbSqlMembershipProvider"> <providers> <add name="dbSqlMembershipProvider" type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" connectionStringName="Fire.Common.Properties.Settings.dbFireConnectionString" enablePasswordRetrieval="false" enablePasswordReset="true" requiresQuestionAndAnswer="true" applicationName="/" requiresUniqueEmail="false" passwordFormat="Hashed" maxInvalidPasswordAttempts="5" minRequiredPasswordLength="7" minRequiredNonalphanumericCharacters="1" passwordAttemptWindow="10" passwordStrengthRegularExpression=""/> </providers> </membership> <roleManager enabled="true" defaultProvider="dbSqlRoleProvider"> <providers> <add connectionStringName="Fire.Common.Properties.Settings.dbFireConnectionString" applicationName="/" name="dbSqlRoleProvider" type="System.Web.Security.SqlRoleProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/> </providers> </roleManager> <authentication mode="Forms"> <forms loginUrl="Login.aspx" cookieless="UseCookies" protection="All" timeout="30" requireSSL="false" slidingExpiration="true" defaultUrl="default.aspx" enableCrossAppRedirects="false"/> </authentication> <authorization> <allow users="*"/> <allow users="?"/> </authorization> <customErrors mode="Off"> </customErrors> <pages> <controls> <add tagPrefix="asp" namespace="System.Web.UI" assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> <add tagPrefix="asp" namespace="System.Web.UI.WebControls" assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> </controls> </pages> <httpHandlers> <remove verb="*" path="*.asmx"/> <add verb="*" path="*_AppService.axd" validate="false" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> <add verb="GET,HEAD" path="ScriptResource.axd" validate="false" type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> <add verb="*" path="*.asmx" validate="false" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> </httpHandlers> <httpModules> <add name="ScriptModule" type="System.Web.Handlers.ScriptModule, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> </httpModules> </system.web> <location path="~/Admin"> <system.web> <authorization> <allow roles="Admin"/> <allow roles="System"/> <deny users="*"/> </authorization> </system.web> </location> <location path="~/Admin/System"> <system.web> <authorization> <allow roles="System"/> <deny users="*"/> </authorization> </system.web> </location> <location path="~/Export"> <system.web> <authorization> <allow 
roles="Export"/> <deny users="*"/> </authorization> </system.web> </location> <location path="~/Field"> <system.web> <authorization> <allow roles="Field"/> <deny users="*"/> </authorization> </system.web> </location> <location path="~/Default.aspx"> <system.web> <authorization> <allow roles="Admin"/> <allow roles="System"/> <allow roles="Export"/> <allow roles="Field"/> <deny users="?"/> </authorization> </system.web> </location> <location path="~/Login.aspx"> <system.web> <authorization> <allow users="*"/> <allow users="?"/> </authorization> </system.web> </location> <location path="~/App_Themes"> <system.web> <authorization> <allow users="*"/> <allow users="?"/> </authorization> </system.web> </location> <location path="~/Includes"> <system.web> <authorization> <allow users="*"/> <allow users="?"/> </authorization> </system.web> </location> <location path="~/WebServices"> <system.web> <authorization> <allow users="*"/> <allow users="?"/> </authorization> </system.web> </location> <location path="~/Authentication_JSON_AppService.axd"> <system.web> <authorization> <allow users="*"/> <allow users="?"/> </authorization> </system.web> </location> <system.codedom> <compilers> <compiler language="c#;cs;csharp" extension=".cs" type="Microsoft.CSharp.CSharpCodeProvider,System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" warningLevel="4"> <providerOption name="CompilerVersion" value="v3.5"/> <providerOption name="WarnAsError" value="false"/> </compiler> <compiler language="vb;vbs;visualbasic;vbscript" extension=".vb" type="Microsoft.VisualBasic.VBCodeProvider, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" warningLevel="4"> <providerOption name="CompilerVersion" value="v3.5"/> <providerOption name="OptionInfer" value="true"/> <providerOption name="WarnAsError" value="false"/> </compiler> </compilers> </system.codedom> <!-- The system.webServer section is required for running ASP.NET AJAX under Internet Information Services 7.0. It is not necessary for previous version of IIS. 
--> <system.webServer> <validation validateIntegratedModeConfiguration="false"/> <modules> <remove name="ScriptModule"/> <add name="ScriptModule" preCondition="managedHandler" type="System.Web.Handlers.ScriptModule, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> </modules> <handlers> <remove name="WebServiceHandlerFactory-Integrated"/> <remove name="ScriptHandlerFactory"/> <remove name="ScriptHandlerFactoryAppServices"/> <remove name="ScriptResource"/> <add name="ScriptHandlerFactory" verb="*" path="*.asmx" preCondition="integratedMode" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> <add name="ScriptHandlerFactoryAppServices" verb="*" path="*_AppService.axd" preCondition="integratedMode" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> <add name="ScriptResource" verb="GET,HEAD" path="ScriptResource.axd" preCondition="integratedMode" type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> </handlers> </system.webServer> <runtime> <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1"> <dependentAssembly> <assemblyIdentity name="System.Web.Extensions" publicKeyToken="31bf3856ad364e35"/> <bindingRedirect oldVersion="1.0.0.0-1.1.0.0" newVersion="3.5.0.0"/> </dependentAssembly> <dependentAssembly> <assemblyIdentity name="System.Web.Extensions.Design" publicKeyToken="31bf3856ad364e35"/> <bindingRedirect oldVersion="1.0.0.0-1.1.0.0" newVersion="3.5.0.0"/> </dependentAssembly> </assemblyBinding> </runtime> </configuration>
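    One thing I am starting to suspect is that I never actually switch the authentication service on. As far as I understand it, in ASP.NET AJAX 3.5 the JSON authentication endpoint is only exposed when it is enabled in a system.web.extensions block inside <configuration>, something like this (not yet tested on my setup):
    <system.web.extensions>
      <scripting>
        <webServices>
          <!-- Exposes Authentication_JSON_AppService.axd for the AJAX login service -->
          <authenticationService enabled="true" requireSSL="false"/>
        </webServices>
      </scripting>
    </system.web.extensions>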

    Read the article

  • ASP.Net MVC 2 DropDownListFor in EditorTemplate

    - by tschreck
    I have a view model that looks like this:
    namespace AutoForm.Models
    {
        public class ProductViewModel
        {
            [UIHint("DropDownList")]
            public String Category { get; set; }

            [ScaffoldColumn(false)]
            public IEnumerable<SelectListItem> CategoryList { get; set; }
            ...
        }
    }
    It has Category and CategoryList properties. The CategoryList is the source data for the Category dropdown UI element. I have an EditorTemplate that looks like this:
    <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<ProductViewModel>" %>
    <%@ Import Namespace="AutoForm.Models"%>
    <%=Html.DropDownListFor(m => m.Category, Model.CategoryList) %>
    NOTE: this EditorTemplate is strongly typed to ProductViewModel. My controller is populating the CategoryList property with data from a database. I cannot get the DropDownListFor template to render a drop-down list with data from CategoryList. I know CategoryList is getting populated with data in the controller because I see the data when I debug and step through the controller. Here's my error message in the browser:
    Server Error in '/' Application. Object reference not set to an instance of an object. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.NullReferenceException: Object reference not set to an instance of an object.
    Source Error:
    Line 2: <%@ Import Namespace="AutoForm.Models"%>
    Line 3:
    Line 4: <%=Html.DropDownListFor(m => m.Category, Model.CategoryList) %>
    Source File: c:\ProjectStore\AutoForm\AutoForm\Views\Shared\EditorTemplates\DropDownList.ascx Line: 4
    Any ideas? Thanks, Tom.
    As a follow-up, I noticed that ViewData.Model is null when I'm stepping through the code in the EditorTemplate. I have the EditorTemplate strongly typed to "ProductViewModel", which is also the type that's passed to the View in the controller. I'm perplexed as to why ViewData.Model is null even though it's getting populated in the controller before getting passed to the view.
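    One workaround I am considering, based on my (possibly wrong) understanding that an editor template only receives the value of the property being edited rather than the whole parent view model, is to stop typing the template to ProductViewModel, stash the list in ViewData, and read it from there. Just a sketch, not verified yet, and "CategoryList" is simply the key I picked:
    // In the controller action, alongside building the view model:
    ViewData["CategoryList"] = model.CategoryList;

    <%-- DropDownList.ascx, now typed to the property value (a string) instead of the parent model --%>
    <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<string>" %>
    <%@ Import Namespace="System.Collections.Generic" %>
    <%@ Import Namespace="System.Web.Mvc" %>
    <%= Html.DropDownList("", (IEnumerable<SelectListItem>)ViewData["CategoryList"]) %>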

    Read the article

  • WCF wsHttpBinding "There was no channel that could accept the message with action"

    - by Steffen Schindler
    I have a webservice in IIS. I'm trying to call a function but i get an errormessage like: There was no channel that could accept the message with action 'http://Datenlotsen.Cyquest/ICyquestService/ValidateSelfAssessment' I'm hosting it in an IIS in the standard website. There I created a virtual directory named "CyQuestwebservice". For the client side config i'm using Soap UI. That Tool generates the client config from the wsdl. my webconfig looks like this, can you help me?: <system.serviceModel> <extensions> <behaviorExtensions> <add name="wsdlExtensions" type="WCFExtras.Wsdl.WsdlExtensionsConfig, WCFExtras, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" /> </behaviorExtensions> </extensions> <services> <service behaviorConfiguration="CyquestWebService.Service1Behavior" name="CyquestWebService.CyquestService"> <endpoint address="" behaviorConfiguration="EndPointBehavior" binding="wsHttpBinding" bindingNamespace="http://Datenlotsen.Cyquest" contract="CyquestWebService.ICyquestService"> <identity> <dns value="localhost" /> </identity> </endpoint> <endpoint address="mex" binding="mexHttpBinding" bindingNamespace="http://Datenlotsen.Cyquest" contract="IMetadataExchange" /> </service> </services> <behaviors> <endpointBehaviors> <behavior name="EndPointBehavior" > <wsdlExtensions location="http://wssdev04.datenlotsen.intern/Cyquestwebservice/CyquestService.svc" singleFile="True"/> </behavior> </endpointBehaviors> <serviceBehaviors> <behavior name="CyquestWebService.Service1Behavior"> <serviceMetadata httpGetEnabled="true" /> <serviceDebug includeExceptionDetailInFaults="true" /> </behavior> </serviceBehaviors> </behaviors> </system.serviceModel> <system.diagnostics> <sources> <source name="System.ServiceModel" switchValue="Information, ActivityTracing" propagateActivity="true"> <listeners> <add name="traceListener" type="System.Diagnostics.XmlWriterTraceListener" initializeData= "c:\log\Traces.svclog" /> </listeners> </source> </sources> </system.diagnostics> </configuration>
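    If I end up writing a .NET test client instead of using SoapUI, the client endpoint I would expect to need looks roughly like the following. The address is taken from the wsdlExtensions location above, and I have not yet confirmed that it matches what SoapUI generated, which is exactly the kind of mismatch I want to rule out:
    <system.serviceModel>
      <client>
        <endpoint name="WSHttpBinding_ICyquestService"
                  address="http://wssdev04.datenlotsen.intern/Cyquestwebservice/CyquestService.svc"
                  binding="wsHttpBinding"
                  contract="CyquestWebService.ICyquestService"/>
      </client>
    </system.serviceModel>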

    Read the article

  • facebook application using iframe on Facebook Developer Toolkit 3.0

    - by adveb
    Hey, I am trying to build a Facebook iframe application using the Facebook Developer Toolkit 3.01 with ASP.NET and C#. I am working from the iframe sample of the toolkit, which can be downloaded here: www.facebooktoolkit.codeplex.com/releases/view/39727 This is my Facebook application, which is the same as the iframe sample: http://apps.facebook.com/alefbet/ This is my code; it has 2 pages, a master page and a default page, and these 2 pages are the same as in the iframe sample. 1) This is the master page: public partial class IFrameMaster : Facebook.Web.CanvasIFrameMasterPage { public IFrameMaster() { RequireLogin = true; } } 2) This is the default.aspx: public partial class Default : System.Web.UI.Page { private const string SCRIPT_BLOCK_NAME = "dynamicScript"; protected void Page_Load(object sender, EventArgs e) { if (IsPostBack) { if (Master.Api.Users.HasAppPermission(Enums.ExtendedPermissions.email)) { SendThankYouEmail(); } Response.Redirect("ThankYou.aspx"); } else { if (Master.Api.Users.HasAppPermission(Enums.ExtendedPermissions.email)) { emailPermissionPanel.Visible = false; } CreateScript(); } } private void SendThankYouEmail() { var subject = "Thank you for telling us your favorite color"; var body = "Thank you for telling us what your favorite color is. We hope you have enjoyed using this application. Encourage your friends to tell us their favorite color as well!"; this.Master.Api.Notifications.SendEmail(this.Master.Api.Session.UserId.ToString(), subject, body, string.Empty); } private void CreateScript() { var saveColorScript = @" function saveColor(color) { document.getElementById('" + colorInput.ClientID + @"').value = color; } function submitForm() { document.getElementById('" + form.ClientID + @"').submit(); } "; if (!ClientScript.IsClientScriptBlockRegistered(SCRIPT_BLOCK_NAME)) { ClientScript.RegisterClientScriptBlock(this.GetType(), SCRIPT_BLOCK_NAME, saveColorScript); } } } My directory structure is: 1) the master page is in the root; 2) default.aspx is in the root/alefbet directory; 3) I also have the xd_receiver.htm inside the root/channel directory. Inside the master page there is the following line: <script type="text/javascript"> FB_RequireFeatures(["XFBML"], function() { FB.Facebook.init("c81f17ee4d4ffc5113c55f8b99fdcab5", "channel/xd_receiver.htm"); }); </script> The problem is that the application doesn't work at apps.facebook.com/alefbet/default.aspx. Why doesn't it work? Please help me and others who have also run into this issue. I tried lots of things; one of them was to display the user id. For that I put a label in default.aspx and wrote lblTest.Text = Master.Api.Users.GetInfo().uid.ToString(); and it doesn't even get to this line. I know this because the label keeps displaying the word "label". Thank you very much.
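    Two things worth checking, both hedged guesses since only the sample code is visible. First, relative URLs resolve against the page address, not the master page's location: default.aspx lives under /alefbet/, so "channel/xd_receiver.htm" is requested as /alefbet/channel/xd_receiver.htm, which doesn't exist; a root-relative path such as "/channel/xd_receiver.htm" avoids that. Second, if the canvas session is never established (for example, wrong API key/secret in web.config, or the canvas callback URL not pointing at the application root), every call through Master.Api fails before GetInfo() is reached. A small diagnostic sketch for the label test, using only members that already appear in the question:
    protected void Page_Load(object sender, EventArgs e)
    {
        // Diagnostic only: confirm a Facebook session exists before calling GetInfo().
        if (Master.Api == null || Master.Api.Session == null)
        {
            lblTest.Text = "No Facebook session - check the API key/secret and canvas settings.";
            return;
        }
        lblTest.Text = Master.Api.Session.UserId.ToString();
    }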

    Read the article

  • JsTree v1.0 - How to manipulate effectively the data from the backend to render the trees and operate correctly?

    - by Jean Paul
    Backend info: PHP 5 / MySQL URL: http://github.com/downloads/vakata/jstree/jstree_pre1.0_fix_1.zip Table structure for table discussions_tree -- CREATE TABLE IF NOT EXISTS `discussions_tree` ( `id` int(11) NOT NULL AUTO_INCREMENT, `parent_id` int(11) NOT NULL DEFAULT '0', `user_id` int(11) NOT NULL DEFAULT '0', `label` varchar(16) DEFAULT NULL, `position` bigint(20) unsigned NOT NULL DEFAULT '0', `left` bigint(20) unsigned NOT NULL DEFAULT '0', `right` bigint(20) unsigned NOT NULL DEFAULT '0', `level` bigint(20) unsigned NOT NULL DEFAULT '0', `type` varchar(255) CHARACTER SET utf8 COLLATE utf8_unicode_ci DEFAULT NULL, `h_label` varchar(16) NOT NULL DEFAULT '', `fulllabel` varchar(255) DEFAULT NULL, UNIQUE KEY `uidx_3` (`id`), KEY `idx_1` (`user_id`), KEY `idx_2` (`parent_id`) ) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=8 ; /*The first element should in my understanding not even be shown*/ INSERT INTO `discussions_tree` (`id`, `parent_id`, `user_id`, `label`, `position`, `left`, `right`, `level`, `type`, `h_label`, `fulllabel`) VALUES (0, 0, 0, 'Contacts', 0, 1, 1, 0, NULL, '', NULL); INSERT INTO `discussions_tree` (`id`, `parent_id`, `user_id`, `label`, `position`, `left`, `right`, `level`, `type`, `h_label`, `fulllabel`) VALUES (1, 0, 0, 'How to Tag', 1, 2, 2, 0, 'drive', '', NULL); Front End : I've simplified the logic, it has 6 trees actually inside of a panel and that works fine $array = array("Discussions"); $id_arr = array("d"); $nid = 0; foreach ($array as $k=> $value) { $nid++; ?> <li id="<?=$value?>" class="label"> <a href='#<?=$value?>'><span> <?=$value?> </span></a> <div class="sub-menu" style="height:auto; min-height:120px; background-color:#E5E5E5" > <div class="menu" id="menu_<?=$id_arr[$k]?>" style="position:relative; margin-left:56%"> <img src="./js/jsTree/create.png" alt="" id="create" title="Create" > <img src="./js/jsTree/rename.png" alt="" id="rename" title="Rename" > <img src="./js/jsTree/remove.png" alt="" id="remove" title="Delete"> <img src="./js/jsTree/cut.png" alt="" id="cut" title="Cut" > <img src="./js/jsTree/copy.png" alt="" id="copy" title="Copy"> <img src="./js/jsTree/paste.png" alt="" id="paste" title="Paste"> </div> <div id="<?=$id_arr[$k]?>" class="jstree_container"></div> </div> </li> <!-- JavaScript neccessary for this tree : <?=$value?> --> <script type="text/javascript" > jQuery(function ($) { $("#<?=$id_arr[$k]?>").jstree({ // List of active plugins used "plugins" : [ "themes", "json_data", "ui", "crrm" , "hotkeys" , "types" , "dnd", "contextmenu"], // "ui" :{ "initially_select" : ["#node_"+ $nid ] } , "crrm": { "move": { "always_copy": "multitree" }, "input_width_limit":128 }, "core":{ "strings":{ "new_node" : "New Tag" }}, "themes": {"theme": "classic"}, "json_data" : { "ajax" : { "url" : "./js/jsTree/server-<?=$id_arr[$k]?>.php", "data" : function (n) { // the result is fed to the AJAX request `data` option return { "operation" : "get_children", "id" : n.attr ? n.attr("id").replace("node_","") : 1, "state" : "", "user_id": <?=$uid?> }; } } } , "types" : { "max_depth" : -1, "max_children" : -1, "types" : { // The default type "default" : { "hover_node":true, "valid_children" : [ "default" ], }, // The `drive` nodes "drive" : { // can have files and folders inside, but NOT other `drive` nodes "valid_children" : [ "default", "folder" ], "hover_node":true, "icon" : { "image" : "./js/jsTree/root.png" }, // those prevent the functions with the same name to be used on `drive` nodes.. 
internally the `before` event is used "start_drag" : false, "move_node" : false, "remove_node" : false } } }, "contextmenu" : { "items" : customMenu , "select_node": true} }) //Hover function binded to jstree .bind("hover_node.jstree", function (e, data) { $('ul li[rel="drive"], ul li[rel="default"], ul li[rel=""]').each(function(i) { $(this).find("a").attr('href', $(this).attr("id")+".php" ); }) }) //Create function binded to jstree .bind("create.jstree", function (e, data) { $.post( "./js/jsTree/server-<?=$id_arr[$k]?>.php", { "operation" : "create_node", "id" : data.rslt.parent.attr("id").replace("node_",""), "position" : data.rslt.position, "label" : data.rslt.name, "href" : data.rslt.obj.attr("href"), "type" : data.rslt.obj.attr("rel"), "user_id": <?=$uid?> }, function (r) { if(r.status) { $(data.rslt.obj).attr("id", "node_" + r.id); } else { $.jstree.rollback(data.rlbk); } } ); }) //Remove operation .bind("remove.jstree", function (e, data) { data.rslt.obj.each(function () { $.ajax({ async : false, type: 'POST', url: "./js/jsTree/server-<?=$id_arr[$k]?>.php", data : { "operation" : "remove_node", "id" : this.id.replace("node_",""), "user_id": <?=$uid?> }, success : function (r) { if(!r.status) { data.inst.refresh(); } } }); }); }) //Rename operation .bind("rename.jstree", function (e, data) { data.rslt.obj.each(function () { $.ajax({ async : true, type: 'POST', url: "./js/jsTree/server-<?=$id_arr[$k]?>.php", data : { "operation" : "rename_node", "id" : this.id.replace("node_",""), "label" : data.rslt.new_name, "user_id": <?=$uid?> }, success : function (r) { if(!r.status) { data.inst.refresh(); } } }); }); }) //Move operation .bind("move_node.jstree", function (e, data) { data.rslt.o.each(function (i) { $.ajax({ async : false, type: 'POST', url: "./js/jsTree/server-<?=$id_arr[$k]?>.php", data : { "operation" : "move_node", "id" : $(this).attr("id").replace("node_",""), "ref" : data.rslt.cr === -1 ? 1 : data.rslt.np.attr("id").replace("node_",""), "position" : data.rslt.cp + i, "label" : data.rslt.name, "copy" : data.rslt.cy ? 
1 : 0, "user_id": <?=$uid?> }, success : function (r) { if(!r.status) { $.jstree.rollback(data.rlbk); } else { $(data.rslt.oc).attr("id", "node_" + r.id); if(data.rslt.cy && $(data.rslt.oc).children("UL").length) { data.inst.refresh(data.inst._get_parent(data.rslt.oc)); } } } }); }); }); // This is for the context menu to bind with operations on the right clicked node function customMenu(node) { // The default set of all items var control; var items = { createItem: { label: "Create", action: function (node) { return {createItem: this.create(node) }; } }, renameItem: { label: "Rename", action: function (node) { return {renameItem: this.rename(node) }; } }, deleteItem: { label: "Delete", action: function (node) { return {deleteItem: this.remove(node) }; }, "separator_after": true }, copyItem: { label: "Copy", action: function (node) { $(node).addClass("copy"); return {copyItem: this.copy(node) }; } }, cutItem: { label: "Cut", action: function (node) { $(node).addClass("cut"); return {cutItem: this.cut(node) }; } }, pasteItem: { label: "Paste", action: function (node) { $(node).addClass("paste"); return {pasteItem: this.paste(node) }; } } }; // We go over all the selected items as the context menu only takes action on the one that is right clicked $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).each(function(index,element) { if ( $(element).attr("id") != $(node).attr("id") ) { // Let's deselect all nodes that are unrelated to the context menu -- selected but are not the one right clicked $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id") ); } }); //if any previous click has the class for copy or cut $("#<?=$id_arr[$k]?>").find("li").each(function(index,element) { if ($(element) != $(node) ) { if( $(element).hasClass("copy") || $(element).hasClass("cut") ) control=1; } else if( $(node).hasClass("cut") || $(node).hasClass("copy")) { control=0; } }); //only remove the class for cut or copy if the current operation is to paste if($(node).hasClass("paste") ) { control=0; // Let's loop through all elements and try to find if the paste operation was done already $("#<?=$id_arr[$k]?>").find("li").each(function(index,element) { if( $(element).hasClass("copy") ) $(this).removeClass("copy"); if ( $(element).hasClass("cut") ) $(this).removeClass("cut"); if ( $(element).hasClass("paste") ) $(this).removeClass("paste"); }); } switch (control) { //Remove the paste item from the context menu case 0: switch ($(node).attr("rel")) { case "drive": delete items.renameItem; delete items.deleteItem; delete items.cutItem; delete items.copyItem; delete items.pasteItem; break; case "default": delete items.pasteItem; break; } break; //Remove the paste item from the context menu only on the node that has either copy or cut added class case 1: if( $(node).hasClass("cut") || $(node).hasClass("copy") ) { switch ($(node).attr("rel")) { case "drive": delete items.renameItem; delete items.deleteItem; delete items.cutItem; delete items.copyItem; delete items.pasteItem; break; case "default": delete items.pasteItem; break; } } else //Re-enable it on the clicked node that does not have the cut or copy class { switch ($(node).attr("rel")) { case "drive": delete items.renameItem; delete items.deleteItem; delete items.cutItem; delete items.copyItem; break; } } break; //initial state don't show the paste option on any node default: switch ($(node).attr("rel")) { case "drive": delete items.renameItem; delete items.deleteItem; delete items.cutItem; delete items.copyItem; delete items.pasteItem; 
break; case "default": delete items.pasteItem; break; } break; } return items; } $("#menu_<?=$id_arr[$k]?> img").hover( function () { $(this).css({'cursor':'pointer','outline':'1px double teal'}) }, function () { $(this).css({'cursor':'none','outline':'1px groove transparent'}) } ); $("#menu_<?=$id_arr[$k]?> img").click(function () { switch(this.id) { //Create only the first element case "create": if ( $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).length ) { $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).each(function(index,element){ switch(index) { case 0: $("#<?=$id_arr[$k]?>").jstree("create", '#'+$(element).attr("id"), null, /*{attr : {href: '#' }}*/null ,null, false); break; default: $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id") ); break; } }); } else { $.facebox('<p class=\'p_inner error bold\'>A selection needs to be made to work with this operation'); setTimeout(function(){ $.facebox.close(); }, 2000); } break; //REMOVE case "remove": if ( $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).length ) { $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).each(function(index,element){ //only execute if the current node is not the first one (drive) if( $(element).attr("id") != $("div.jstree > ul > li").first().attr("id") ) { $("#<?=$id_arr[$k]?>").jstree("remove",'#'+$(element).attr("id")); } else $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id") ); }); } else { $.facebox('<p class=\'p_inner error bold\'>A selection needs to be made to work with this operation'); setTimeout(function(){ $.facebox.close(); }, 2000); } break; //RENAME NODE only one selection case "rename": if ( $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).length ) { $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).each(function(index,element){ if( $(element).attr("id") != $("div.jstree > ul > li").first().attr("id") ) { switch(index) { case 0: $("#<?=$id_arr[$k]?>").jstree("rename", '#'+$(element).attr("id") ); break; default: $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id") ); break; } } else $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id") ); }); } else { $.facebox('<p class=\'p_inner error bold\'>A selection needs to be made to work with this operation'); setTimeout(function(){ $.facebox.close(); }, 2000); } break; //Cut case "cut": if ( $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).length ) { $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).each(function(index,element){ switch(index) { case 0: $("#<?=$id_arr[$k]?>").jstree("cut", '#'+$(element).attr("id")); $.facebox('<p class=\'p_inner teal\'>Operation "Cut" successfully done.<p class=\'p_inner teal bold\'>Where to place it?'); setTimeout(function(){ $.facebox.close(); $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id")); }, 2000); break; default: $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id") ); break; } }); } else { $.facebox('<p class=\'p_inner error bold\'>A selection needs to be made to work with this operation'); setTimeout(function(){ $.facebox.close(); }, 2000); } break; //Copy case "copy": if ( $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).length ) { $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).each(function(index,element){ switch(index) { case 0: $("#<?=$id_arr[$k]?>").jstree("copy", '#'+$(element).attr("id")); $.facebox('<p 
class=\'p_inner teal\'>Operation "Copy": Successfully done.<p class=\'p_inner teal bold\'>Where to place it?'); setTimeout(function(){ $.facebox.close(); $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id") ); }, 2000); break; default: $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id") ); break; } }); } else { $.facebox('<p class=\'p_inner error bold\'>A selection needs to be made to work with this operation'); setTimeout(function(){ $.facebox.close(); }, 2000); } break; case "paste": if ( $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).length ) { $.jstree._reference("#<?=$id_arr[$k]?>").get_selected(false, true).each(function(index,element){ switch(index) { case 0: $("#<?=$id_arr[$k]?>").jstree("paste", '#'+$(element).attr("id")); break; default: $("#<?=$id_arr[$k]?>").jstree("deselect_node", '#'+$(element).attr("id") ); break; } }); } else { $.facebox('<p class=\'p_inner error bold\'>A selection needs to be made to work with this operation'); setTimeout(function(){ $.facebox.close(); }, 2000); } break; } }); <? } ?> server.php $path='../../../..'; require_once "$path/phpfoo/dbif.class"; require_once "$path/global.inc"; // Database config & class $db_config = array( "servername"=> $dbHost, "username" => $dbUser, "password" => $dbPW, "database" => $dbName ); if(extension_loaded("mysqli")) require_once("_inc/class._database_i.php"); else require_once("_inc/class._database.php"); //Tree class require_once("_inc/class.ctree.php"); $dbLink = new dbif(); $dbErr = $dbLink->connect($dbName,$dbUser,$dbPW,$dbHost); $jstree = new json_tree(); if(isset($_GET["reconstruct"])) { $jstree->_reconstruct(); die(); } if(isset($_GET["analyze"])) { echo $jstree->_analyze(); die(); } $table = '`discussions_tree`'; if($_REQUEST["operation"] && strpos($_REQUEST["operation"], "_") !== 0 && method_exists($jstree, $_REQUEST["operation"])) { foreach($_REQUEST as $k => $v) { switch($k) { case 'user_id': //We are passing the user_id from the $_SESSION on each request and trying to pick up the min and max value from the table that matches the 'user_id' $sql = "SELECT max(`right`) , min(`left`) FROM $table WHERE `user_id`=$v"; //If the select does not return any value then just let it be :P if (!list($right, $left)=$dbLink->getRow($sql)) { $sql = $dbLink->dbSubmit("UPDATE $table SET `user_id`=$v WHERE `id` = 1 AND `parent_id` = 0"); $sql = $dbLink->dbSubmit("UPDATE $table SET `user_id`=$v WHERE `parent_id` = 1 AND `label`='How to Tag' "); } else { $sql = $dbLink->dbSubmit("UPDATE $table SET `user_id`=$v, `right`=$right+2 WHERE `id` = 1 AND `parent_id` = 0"); $sql = $dbLink->dbSubmit("UPDATE $table SET `user_id`=$v, `left`=$left+1, `right`=$right+1 WHERE `parent_id` = 1 AND `label`='How to Tag' "); } break; } } header("HTTP/1.0 200 OK"); header('Content-type: application/json; charset=utf-8'); header("Cache-Control: no-cache, must-revalidate"); header("Expires: Mon, 26 Jul 1997 05:00:00 GMT"); header("Pragma: no-cache"); echo $jstree->{$_REQUEST["operation"]}($_REQUEST); die(); } header("HTTP/1.0 404 Not Found"); ?> The problem: DND *(Drag and Drop) works, Delete works, Create works, Rename works, but Copy, Cut and Paste don't work

    Read the article

  • Migrating JSF 1.2 to 2.x resulted in java.lang.NullPointerException at com.sun.faces.config.InitFacesContext.cleanupInitMaps

    - by nudastack
    I'm in process of migrating a JSF 1.2 application to JSF 2.x. My application is using the following APIs: maven-jetty-plugin (version 6.1.10) maven-compiler-plugin (version 2.3.1) myfaces-api & -impl (version 1.2.7) jsf-facelets (version 1.1.14) richfaces (version 3.3.3) javax servlet jstl (version 1.2) javax servlet-api (version 2.5) and some other plugins as well. I updated the versions in faces-config.xml file and removed Facelets 1.1.14 libs and replaced MyFaces 1.2 by JSF 2.x libs, however the application didn't work. It threw the following exception: Could not instantiate listener org.apache.myfaces.webapp.StartupServletContextListener java.lang.ClassNotFoundException: org.apache.myfaces.webapp.StartupServletContextListener at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50) at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:244) at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:230) at org.mortbay.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:375) at org.mortbay.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:337) at org.mortbay.jetty.handler.ContextHandler.loadClass(ContextHandler.java:1035) at org.mortbay.jetty.webapp.WebXmlConfiguration.initListener(WebXmlConfiguration.java:629) at org.mortbay.jetty.webapp.WebXmlConfiguration.initWebXmlElement(WebXmlConfiguration.java:367) at org.mortbay.jetty.plus.webapp.AbstractConfiguration.initWebXmlElement(AbstractConfiguration.java:190) at org.mortbay.jetty.webapp.WebXmlConfiguration.initialize(WebXmlConfiguration.java:289) at org.mortbay.jetty.plus.webapp.AbstractConfiguration.initialize(AbstractConfiguration.java:133) at org.mortbay.jetty.webapp.WebXmlConfiguration.configure(WebXmlConfiguration.java:222) at org.mortbay.jetty.plus.webapp.AbstractConfiguration.configure(AbstractConfiguration.java:113) at org.mortbay.jetty.webapp.WebXmlConfiguration.configureWebApp(WebXmlConfiguration.java:180) at org.mortbay.jetty.plus.webapp.AbstractConfiguration.configureWebApp(AbstractConfiguration.java:96) at org.mortbay.jetty.plus.webapp.Configuration.configureWebApp(Configuration.java:124) at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1217) at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:510) at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448) at org.mortbay.jetty.plugin.Jetty6PluginWebAppContext.doStart(Jetty6PluginWebAppContext.java:110) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152) at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130) at org.mortbay.jetty.Server.doStart(Server.java:222) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.plugin.Jetty6PluginServer.start(Jetty6PluginServer.java:132) at org.mortbay.jetty.plugin.AbstractJettyMojo.startJetty(AbstractJettyMojo.java:371) at org.mortbay.jetty.plugin.AbstractJettyMojo.execute(AbstractJettyMojo.java:307) at 
org.mortbay.jetty.plugin.AbstractJettyRunMojo.execute(AbstractJettyRunMojo.java:203) at org.mortbay.jetty.plugin.Jetty6RunMojo.execute(Jetty6RunMojo.java:184) at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84) at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59) at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183) at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161) at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320) at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156) at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537) at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196) at org.apache.maven.cli.MavenCli.main(MavenCli.java:141) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290) at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230) at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409) at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352) at org.codehaus.classworlds.Launcher.main(Launcher.java:47) 2013-07-02 14:39:15.590::WARN: Unknown realm: default 2013-07-02 14:39:15.651::INFO: No Transaction manager found - if your webapp requires one, please configure one. SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder". SLF4J: Defaulting to no-operation (NOP) logger implementation SLF4J: See StaticLoggerBinder for further details. Tem 02, 2013 2:39:22 PM com.sun.faces.config.ConfigureListener contextInitialized INFO: Initializing Mojarra 2.2.0 ( 20130502-2118 https://svn.java.net/svn/mojarra~svn/tags/2.2.0@11930) for context '' Tem 02, 2013 2:39:22 PM com.sun.faces.config.ConfigureListener contextInitialized WARNING: JSF1059: WARNING! The com.sun.faces.verifyObjects feature is to aid developers not using tools. It shouldn''t be enabled if using an IDE, or if this application is being deployed for production as it will impact application start times. log4j:WARN No appenders could be found for logger (org.ajax4jsf.renderkit.ChameleonRenderKitFactory). log4j:WARN Please initialize the log4j system properly. log4j:WARN See #noconfig for more info. 
Tem 02, 2013 2:39:26 PM com.sun.faces.config.ConfigureListener contextInitialized SEVERE: Critical error during deployment: java.lang.NoSuchMethodError: org.mortbay.jetty.annotations.AnnotationParser.parseAnnotations(Lorg/mortbay/jetty/webapp/WebAppContext;Ljava/lang/Class;Lorg/mortbay/jetty/plus/annotation/RunAsCollection;Lorg/mortbay/jetty/plus/annotation/InjectionCollection;Lorg/mortbay/jetty/plus/annotation/LifeCycleCallbackCollection;)V at com.sun.faces.vendor.Jetty6InjectionProvider.inject(Jetty6InjectionProvider.java:93) at javax.faces.FactoryFinder.getImplGivenPreviousImpl(FactoryFinder.java:695) at javax.faces.FactoryFinder.getImplementationInstance(FactoryFinder.java:572) at javax.faces.FactoryFinder.access$500(FactoryFinder.java:140) at javax.faces.FactoryFinder$FactoryManager.getFactory(FactoryFinder.java:1120) at javax.faces.FactoryFinder.getFactory(FactoryFinder.java:379) at com.sun.faces.config.processor.FactoryConfigProcessor.verifyFactoriesExist(FactoryConfigProcessor.java:328) at com.sun.faces.config.processor.FactoryConfigProcessor.process(FactoryConfigProcessor.java:236) at com.sun.faces.config.ConfigManager.initialize(ConfigManager.java:435) at com.sun.faces.config.ConfigureListener.contextInitialized(ConfigureListener.java:214) at org.mortbay.jetty.handler.ContextHandler.startContext(ContextHandler.java:540) at org.mortbay.jetty.servlet.Context.startContext(Context.java:135) at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1220) at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:510) at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448) at org.mortbay.jetty.plugin.Jetty6PluginWebAppContext.doStart(Jetty6PluginWebAppContext.java:110) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152) at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130) at org.mortbay.jetty.Server.doStart(Server.java:222) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.plugin.Jetty6PluginServer.start(Jetty6PluginServer.java:132) at org.mortbay.jetty.plugin.AbstractJettyMojo.startJetty(AbstractJettyMojo.java:371) at org.mortbay.jetty.plugin.AbstractJettyMojo.execute(AbstractJettyMojo.java:307) at org.mortbay.jetty.plugin.AbstractJettyRunMojo.execute(AbstractJettyRunMojo.java:203) at org.mortbay.jetty.plugin.Jetty6RunMojo.execute(Jetty6RunMojo.java:184) at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84) at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59) at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183) at 
org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161) at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320) at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156) at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537) at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196) at org.apache.maven.cli.MavenCli.main(MavenCli.java:141) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290) at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230) at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409) at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352) at org.codehaus.classworlds.Launcher.main(Launcher.java:47) 2013-07-02 14:39:26.652::WARN: Failed startup of context org.mortbay.jetty.plugin.Jetty6PluginWebAppContext@64dacd55{/,C:\Users\zmn\workspace\tracker-web\src\main\webapp} java.lang.RuntimeException: java.lang.NoSuchMethodError: org.mortbay.jetty.annotations.AnnotationParser.parseAnnotations(Lorg/mortbay/jetty/webapp/WebAppContext;Ljava/lang/Class;Lorg/mortbay/jetty/plus/annotation/RunAsCollection;Lorg/mortbay/jetty/plus/annotation/InjectionCollection;Lorg/mortbay/jetty/plus/annotation/LifeCycleCallbackCollection;)V at com.sun.faces.config.ConfigureListener.contextInitialized(ConfigureListener.java:273) at org.mortbay.jetty.handler.ContextHandler.startContext(ContextHandler.java:540) at org.mortbay.jetty.servlet.Context.startContext(Context.java:135) at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1220) at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:510) at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448) at org.mortbay.jetty.plugin.Jetty6PluginWebAppContext.doStart(Jetty6PluginWebAppContext.java:110) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152) at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130) at org.mortbay.jetty.Server.doStart(Server.java:222) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39) at org.mortbay.jetty.plugin.Jetty6PluginServer.start(Jetty6PluginServer.java:132) at org.mortbay.jetty.plugin.AbstractJettyMojo.startJetty(AbstractJettyMojo.java:371) at org.mortbay.jetty.plugin.AbstractJettyMojo.execute(AbstractJettyMojo.java:307) at org.mortbay.jetty.plugin.AbstractJettyRunMojo.execute(AbstractJettyRunMojo.java:203) at org.mortbay.jetty.plugin.Jetty6RunMojo.execute(Jetty6RunMojo.java:184) at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84) at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59) at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183) at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161) at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320) at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156) at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537) at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196) at org.apache.maven.cli.MavenCli.main(MavenCli.java:141) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290) at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230) at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409) at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352) at org.codehaus.classworlds.Launcher.main(Launcher.java:47) Caused by: java.lang.NoSuchMethodError: org.mortbay.jetty.annotations.AnnotationParser.parseAnnotations(Lorg/mortbay/jetty/webapp/WebAppContext;Ljava/lang/Class;Lorg/mortbay/jetty/plus/annotation/RunAsCollection;Lorg/mortbay/jetty/plus/annotation/InjectionCollection;Lorg/mortbay/jetty/plus/annotation/LifeCycleCallbackCollection;)V at com.sun.faces.vendor.Jetty6InjectionProvider.inject(Jetty6InjectionProvider.java:93) at javax.faces.FactoryFinder.getImplGivenPreviousImpl(FactoryFinder.java:695) at javax.faces.FactoryFinder.getImplementationInstance(FactoryFinder.java:572) at javax.faces.FactoryFinder.access$500(FactoryFinder.java:140) at javax.faces.FactoryFinder$FactoryManager.getFactory(FactoryFinder.java:1120) at javax.faces.FactoryFinder.getFactory(FactoryFinder.java:379) at com.sun.faces.config.processor.FactoryConfigProcessor.verifyFactoriesExist(FactoryConfigProcessor.java:328) at com.sun.faces.config.processor.FactoryConfigProcessor.process(FactoryConfigProcessor.java:236) at com.sun.faces.config.ConfigManager.initialize(ConfigManager.java:435) at com.sun.faces.config.ConfigureListener.contextInitialized(ConfigureListener.java:214) ... 42 more How is this caused and how can I solve it? 
Here is the web.xml content: <context-param> <param-name>log4jConfigLocation</param-name> <param-value>WEB-INF/log4j.properties</param-value> </context-param> <welcome-file-list> <welcome-file>portal/html/index.xhtml</welcome-file> </welcome-file-list> <context-param> <param-name>org.apache.myfaces.ERROR_HANDLING</param-name> <param-value>false</param-value> </context-param> <context-param> <param-name>javax.faces.DEFAULT_SUFFIX</param-name> <param-value>.xhtml</param-value> </context-param> <!-- Special Debug Output for Development --> <context-param> <param-name>facelets.DEVELOPMENT</param-name> <param-value>false</param-value> </context-param> <context-param> <param-name>facelets.REFRESH_PERIOD</param-name> <param-value>2</param-value> </context-param> <context-param> <param-name>com.sun.faces.validateXml</param-name> <param-value>true</param-value> </context-param> <context-param> <param-name>com.sun.faces.verifyObjects</param-name> <param-value>true</param-value> </context-param> <context-param> <param-name>javax.faces.DATETIMECONVERTER_DEFAULT_TIMEZONE_IS_SYSTEM_TIMEZONE</param-name> <param-value>true</param-value> </context-param> <!-- Richfaces --> <context-param> <param-name>javax.faces.STATE_SAVING_METHOD</param-name> <param-value>server</param-value> </context-param> <context-param> <param-name>org.richfaces.SKIN</param-name> <param-value>tracker</param-value> </context-param> <!-- Richfaces end --> <context-param> <param-name>org.ajax4jsf.VIEW_HANDLERS</param-name> <param-value>com.sun.facelets.FaceletViewHandler</param-value> </context-param> <filter> <display-name>RichFaces Filter</display-name> <filter-name>richfaces</filter-name> <filter-class>org.ajax4jsf.Filter</filter-class> <init-param> <param-name>createTempFiles</param-name> <param-value>false</param-value> </init-param> <init-param> <param-name>maxRequestSize</param-name> <param-value>512000</param-value> </init-param> </filter> <filter-mapping> <filter-name>richfaces</filter-name> <servlet-name>Faces Servlet</servlet-name> <dispatcher>REQUEST</dispatcher> <dispatcher>FORWARD</dispatcher> <dispatcher>INCLUDE</dispatcher> </filter-mapping> <listener> <listener-class>org.apache.myfaces.webapp.StartupServletContextListener</listener-class> </listener> <listener> <listener-class>com.omega.src.util.InitializeListener</listener-class> </listener> <servlet> <servlet-name>Faces Servlet</servlet-name> <servlet-class>javax.faces.webapp.FacesServlet</servlet-class> <load-on-startup>1</load-on-startup> </servlet> <servlet-mapping> <servlet-name>Faces Servlet</servlet-name> <url-pattern>*.xhtml</url-pattern> </servlet-mapping> <welcome-file-list> <welcome-file>loginPage.xhtml</welcome-file> </welcome-file-list> <login-config> <auth-method>BASIC</auth-method> </login-config> and dependencies in pom.xml content: <dependencies> <dependency> <groupId>com.sun.faces</groupId> <artifactId>jsf-api</artifactId> <version>2.2.0</version> </dependency> <dependency> <groupId>com.sun.faces</groupId> <artifactId>jsf-impl</artifactId> <version>2.2.0</version> </dependency> <dependency> <groupId>javax.servlet</groupId> <artifactId>jstl</artifactId> <version>1.2</version> </dependency> <dependency> <groupId>javax.servlet</groupId> <artifactId>servlet-api</artifactId> <version>2.5</version> </dependency> <dependency> <groupId>org.richfaces.framework</groupId> <artifactId>richfaces-api</artifactId> <version>3.3.3.Final</version> </dependency> <dependency> <groupId>org.richfaces.framework</groupId> <artifactId>richfaces-impl</artifactId> 
<version>3.3.3.Final</version> </dependency> <dependency> <groupId>org.richfaces.ui</groupId> <artifactId>richfaces-ui</artifactId> <version>3.3.3.Final</version> </dependency> <dependency> <groupId>log4j</groupId> <artifactId>log4j</artifactId> <version>1.2.16</version> </dependency> <dependency> <groupId>org.hibernate</groupId> <artifactId>hibernate-core</artifactId> <version>3.6.8.Final</version> </dependency> <dependency> <groupId>postgresql</groupId> <artifactId>postgresql</artifactId> <version>9.1-901.jdbc4</version> </dependency> <dependency> <groupId>javassist</groupId> <artifactId>javassist</artifactId> <version>3.12.1.GA</version> </dependency> <dependency> <groupId>com.googlecode.gmaps4jsf</groupId> <artifactId>gmaps4jsf-core</artifactId> <version>1.1.4</version> </dependency> <dependency> <groupId>net.sf.jasperreports</groupId> <artifactId>jasperreports</artifactId> <version>4.5.0</version> </dependency> <dependency> <groupId>net.sourceforge.jexcelapi</groupId> <artifactId>jxl</artifactId> <version>2.6.12</version> </dependency> </dependencies>

    Read the article

  • System.ServiceModel.Syndication.SyndicationFeed throws when the RSS document includes a <script> block

    - by Cheeso
    The code: using (XmlReader xmlr = XmlReader.Create(new StringReader(allXml))) { var items = from item in SyndicationFeed.Load(xmlr).Items select item; } The exception: Exception: System.Xml.XmlException: Unexpected node type Element. ReadElementString method can only be called on elements with simple or empty content. Line 11, position 25. at System.Xml.XmlReader.ReadElementString() at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadXml(XmlReader reader, SyndicationFeed result) at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadFeed(XmlReader reader) at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadFrom(XmlReader reader) at System.ServiceModel.Syndication.SyndicationFeed.Load[TSyndicationFeed](XmlReader reader) at System.ServiceModel.Syndication.SyndicationFeed.Load(XmlReader reader) at Ionic.ToolsAndTests.ReadRss.Run() in c:\dev\dotnet\ReadRss.cs:line 90 The XML content: <?xml version="1.0" encoding="utf-8"?> <?xml-stylesheet type="text/xsl" href="https://www.ibm.com/developerworks/mydeveloperworks/blogs/roller-ui/styles/rss.xsl" media="screen"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom" > <channel> <title>Software architecture, software engineering, and Renaissance Jazz</title> <link>https://www.ibm.com/developerworks/mydeveloperworks/blogs/gradybooch</link> <atom:link rel="self" type="application/rss+xml" href="https://www.ibm.com/developerworks/mydeveloperworks/blogs/gradybooch/feed/entries/rss?lang=en" /> <description>Software architecture, software engineering, and Renaissance Jazz</description> <language>en-us</language> <copyright>Copyright <script type='text/javascript'> document.write(blogsDate.date.localize (1273534889181));</script></copyright> <lastBuildDate>Mon, 10 May 2010 19:41:29 -0400</lastBuildDate> As you can see, on line 11, at position 25, there's a script block inside the <copyright> element. Other people have reported similar errors with other XML documents. The way I worked around this was to do a StreamReader.ReadToEnd, then do Regex.Replace on the result of that to yank out the script block, before passing the modified string to XmlReader.Create(). Feels like a hack. Has anyone got a better approach? I don't like this because I have to read in a 125k string into memory. Is it valid rss to include "complex content" like that - a script block inside an element?
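    A possible alternative to running a Regex over the raw string: load the text with LINQ to XML, strip the embedded script elements (the thing Rss20FeedFormatter's ReadElementString chokes on), and give SyndicationFeed a reader over the cleaned tree. It is still one in-memory pass, but it works on nodes instead of text. A minimal sketch, assuming allXml is the feed string from the question:
    using System.Linq;
    using System.ServiceModel.Syndication;
    using System.Xml;
    using System.Xml.Linq;

    static class FeedSanitizer
    {
        public static SyndicationFeed LoadWithoutScripts(string allXml)
        {
            XDocument doc = XDocument.Parse(allXml);
            // Drop <script> elements in any namespace so elements like <copyright>
            // are left with plain text content that ReadElementString can handle.
            doc.Descendants().Where(e => e.Name.LocalName == "script").Remove();
            using (XmlReader reader = doc.CreateReader())
            {
                return SyndicationFeed.Load(reader);
            }
        }
    }
    On the last question: RSS 2.0 describes <copyright> as a plain text element, so an unescaped <script> child is at best dubious markup, which is why the strict element read rejects it.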

    Read the article

  • Unable to run Wix Custom Action in MSI

    - by Grandpappy
    I'm trying to create a custom action for my Wix install, and it's just not working, and I'm unsure why. Here's the bit in the appropriate Wix File: <Binary Id="INSTALLERHELPER" SourceFile=".\Lib\InstallerHelper.dll" /> <CustomAction Id="HelperAction" BinaryKey="INSTALLERHELPER" DllEntry="CustomAction1" Execute="immediate" /> Here's the full class file for my custom action: using Microsoft.Deployment.WindowsInstaller; namespace InstallerHelper { public class CustomActions { [CustomAction] public static ActionResult CustomAction1(Session session) { session.Log("Begin CustomAction1"); return ActionResult.Success; } } } The action is run by a button press in the UI (for now): <Control Id="Next" Type="PushButton" X="248" Y="243" Width="56" Height="17" Default="yes" Text="!(loc.WixUINext)" > <Publish Event="DoAction" Value="HelperAction">1</Publish> </Control> When I run the MSI, I get this error in the log: MSI (c) (08:5C) [10:08:36:978]: Connected to service for CA interface. MSI (c) (08:4C) [10:08:37:030]: Note: 1: 1723 2: SQLHelperAction 3: CustomAction1 4: C:\Users\NATHAN~1.TYL\AppData\Local\Temp\MSI684F.tmp Error 1723. There is a problem with this Windows Installer package. A DLL required for this install to complete could not be run. Contact your support personnel or package vendor. Action SQLHelperAction, entry: CustomAction1, library: C:\Users\NATHAN~1.TYL\AppData\Local\Temp\MSI684F.tmp MSI (c) (08:4C) [10:08:38:501]: Product: SessionWorks :: Judge Edition -- Error 1723. There is a problem with this Windows Installer package. A DLL required for this install to complete could not be run. Contact your support personnel or package vendor. Action SQLHelperAction, entry: CustomAction1, library: C:\Users\NATHAN~1.TYL\AppData\Local\Temp\MSI684F.tmp Action ended 10:08:38: SQLHelperAction. Return value 3. DEBUG: Error 2896: Executing action SQLHelperAction failed. The installer has encountered an unexpected error installing this package. This may indicate a problem with this package. The error code is 2896. The arguments are: SQLHelperAction, , Neither of the two error codes or messages it gives me is enough to tell me what's wrong. Or perhaps I'm just not understanding what they're saying is wrong. At first I thought it might be because I was using Wix 3.5, so just to be sure I tried using Wix 3.0, but I get the same error. Any ideas on what I'm doing wrong?

    Read the article
