Search Results

Search found 14214 results on 569 pages for 'enterprise manager'.


  • CodePlex Daily Summary for Thursday, September 06, 2012

    Popular Releases

    • menu4web: menu4web 0.4.1 - javascript menu for web sites: This release is for those who believe that global variables are evil. menu4web has been wrapped into the m4w singleton object. Added a "Vertical Tabs" example which illustrates object notation.
    • WinRT XAML Toolkit: WinRT XAML Toolkit - 1.2.1: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For the compiled version use NuGet; you can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit. Features: AsyncUI extensions, controls and control extensions, converters, debugging helpers, imaging, IO helpers, VisualTree helpers, samples. Recent changes. NOTE: namespace changes. DebugConsol...
    • iPDC - Free Phasor Data Concentrator: iPDC-v1.3.1: iPDC suite version 1.3.1, modifications and bugs fixed (from v1.3.0). New user manual for iPDC-v1.3.1 available on the website. Bug resolved: PMU Simulator TCP connection error and hung connection for the client (PDC). The PMU Simulator (server) can now communicate with more than one PDC (client) over TCP and UDP in parallel. The PMU Simulator now sends the exact data frames specified in the data rate by the user. The PMU Simulator data rate has been verified by iPDC database entries and PMU Connection Tes...
    • Microsoft SQL Server Product Samples: Database: AdventureWorks OData Feed: The AdventureWorks OData service exposes resources based on specific SQL views. The SQL views are a limited subset of the AdventureWorks database that results in several consuming scenarios: CompanySales, Documents, ManufacturingInstructions, ProductCatalog, TerritorySalesDrilldown, WorkOrderRouting. How to install the sample: you can consume the AdventureWorks OData feed from http://services.odata.org/AdventureWorksV3/AdventureWorks.svc. You can also consume the AdventureWorks OData fe...
    • Desktop Google Reader: 1.4.6: Sorting feeds alphabetically is now optional (see preferences window).
    • DotNetNuke® Community Edition CMS: 06.02.03: Major highlights: fixed issue where mailto: links were not working when sending bulk email; fixed issue where users did not see friendship relationships (problem is in 6.2, which does not show in the Versions Affected list above); fixed the issue with cascade deletes in comments in CoreMessaging_Notification; fixed UI issue when using a date field as a required profile property during user registration; fixed error when running the product in debug mode; fixed visibility issue when...
    • Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.65: Fixed null-reference error in the build task constructor.
    • Active Forums for DotNetNuke CMS: Active Forums 5.0.0 RC: RC release of Active Forums 5.0.
    • Droid Explorer: Droid Explorer 0.8.8.7 Beta: Known bug in the display icon for apk's, will fix with next release. Added fallback icon if unable to get the image/icon from the Cloud Service. Removed some stale plugins that were either outdated or incomplete. Added handler for *.ab files for restoring backups. Added plugin to create device backups; backups are stored in %USERPROFILE%\Android Backups\%DEVICE_ID%\. Added custom folder icon for the Android backups directory. Better error handling for installing an apk. Bug fixes for the Runn...
    • BI System Monitor: v2.1: Data Audits report and supporting SQL and SSIS package. Environment Overview report enhancements, improving the appearance and adding data audit finding indicators. Note: SQL 2012 version coming soon.
    • Hidden Capture (HC): Hidden Capture 1.1: Hidden Capture 1.1 by Mohsen E.Dawatgar http://Hidden-Capture.blogfa.com
    • Ext Spec: Ext Spec 0.2.1: Refined examples and improved distribution options.
    • The Visual Guide for Building Team Foundation Server 2012 Environments: Version 1: --
    • Nearforums - ASP.NET MVC forum engine: Nearforums v8.5: Version 8.5 of Nearforums, the ASP.NET MVC Forum Engine. New features include: built-in search engine using Lucene.NET; flood control improvements; notifications improvements: sync option and mail body. View Roadmap for more details. webdeploy package sha1 checksum: 961aff884a9187b6e8a86d68913cdd31f8deaf83
    • WiX Toolset: WiX Toolset v3.6: WiX Toolset v3.6 introduces the Burn bootstrapper/chaining engine and support for Visual Studio 2012 and .NET Framework 4.5. Other minor functionality: WixDependencyExtension supports dependency checking among MSI packages; WixFirewallExtension supports more features of Windows Firewall; WixTagExtension supports Software Id Tagging; WixUtilExtension now supports recursive directory deletion; Melt simplifies pure-WiX patching by extracting .msi package content and updating .w...
    • Iveely Search Engine: Iveely Search Engine (0.2.0): (release notes in Chinese, garbled in this copy)
    • GmailDefaultMaker: GmailDefaultMaker 3.0.0.2: Add QQ Mail. Bugfix.
    • Smart Data Access layer: Smart Data Access Layer Ver 3: In this version, support for executing inline queries is added. Check the Documentation section for details.
    • DotNetNuke® Form and List: 06.00.04: DotNetNuke Form and List 06.00.04. Don't forget to back up your installation before upgrading. Changes in 06.00.04: Fix: SQL scripts for 6.003 missed object qualifiers within stored procedures; Fix: added missing resource "cmdCancel.Text" in form.ascx.resx. Changes in 06.00.03: Fix: MakeThumbnail was broken if the application pool was configured to .NET 4; Change: data is now stored in nvarchar(max) instead of ntext. Changes in 06.00.02: the scripts are now compatible with SQL Azure, tested in a ne...
    • Coevery - Free CRM: Coevery 1.0.0.24: Add a sample database, and installation instructions.

    New Projects

    • Any-Service: AnyService is a .NET 4.0 Windows service shell. It hosts any Windows application in non-GUI mode to run as a service.
    • BabyCloudDrives - the multi cloud drive desktop's application: WPF (description garbled in this copy)
    • BLACK ORANGE: Download the HPAD TEXT EDITOR and use it wisely.
    • CodePlex New Release Checker: CodePlex New Release Checker is a small library that makes it easy to add "New Version Available!" functionality to your CodePlex project.
    • Collect: (description garbled in this copy)
    • CSVManager: (CSV-handling tool; description in Chinese, garbled in this copy)
    • Exam Project: My exam project. Computer vision, C and OpenCV.
    • FTP: Hey guys, thanks for checking out my ftp!
    • Haushaltsbuch: 1
    • ModMaker.Lua: ModMaker.Lua is an open source .NET library that parses and executes Lua code.
    • MyJabbr: MyJabbr
    • netduinoscope: Design shield and software to use netduino as an oscilloscope.
    • NetSurveillance Web Application: Net Surveillance Web Application
    • NiconicoApiHelper: (description in Japanese, garbled in this copy)
    • OStega: A simple library for encrypting text into a bmp or png image.
    • OURORM: orm
    • TFS Cloud Deployment Toolkit: The TFS Cloud Deployment Toolkit is a set of tools that integrate with TFS 2010 to help manage configuration and deployment to various remote environments.
    • The Visual Guide for Building Team Foundation Server 2012 Environments: A step-by-step guide for building Team Foundation Server 2012 environments that include SharePoint Server 2010, SQL Server 2012, Windows Server 2012 and more!
    • WinRT LineChart: An attempt at creating a usable LineChart for everyone to use in his/her own Windows 8 apps.


  • krb5-multidev, libk5crypto3, libk5crypto3:i386 package dependency

    - by TDalton
    Using Ubuntu 12.04 on an Asus U43F. I can no longer update, install or remove packages via the Software Center because of package dependency errors. sudo apt-get install -f reads the following:

      Reading package lists... Done
      Building dependency tree
      Reading state information... Done
      Correcting dependencies... Done
      The following packages were automatically installed and are no longer required:
        libunity6 libqapt-runtime libboost-program-options1.46.1 akonadi-backend-mysql
        libqapt1 shared-desktop-ontologies libntrack0 ntrack-module-libnl-0 libntrack-qt4-1
      Use 'apt-get autoremove' to remove them.
      The following extra packages will be installed:
        krb5-multidev libk5crypto3:i386 libkrb5-dev
      Suggested packages:
        krb5-doc krb5-doc:i386 krb5-user:i386
      The following packages will be upgraded:
        krb5-multidev libk5crypto3:i386 libkrb5-dev
      3 upgraded, 0 newly installed, 0 to remove and 325 not upgraded.
      11 not fully installed or removed.
      Need to get 0 B/213 kB of archives.
      After this operation, 0 B of additional disk space will be used.
      Do you want to continue [Y/n]? y
      dpkg: error processing libk5crypto3 (--configure):
       libk5crypto3:amd64 1.10+dfsg~beta1-2ubuntu0.3 cannot be configured because
       libk5crypto3:i386 is in a different version (1.10+dfsg~beta1-2ubuntu0.1)
      dpkg: error processing libk5crypto3:i386 (--configure):
       libk5crypto3:i386 1.10+dfsg~beta1-2ubuntu0.1 cannot be configured because
       libk5crypto3:amd64 is in a different version (1.10+dfsg~beta1-2ubuntu0.3)
      dpkg: dependency problems prevent configuration of libkrb5-3:
       libkrb5-3 depends on libk5crypto3 (>= 1.9+dfsg~beta1); however:
        Package libk5crypto3 is not configured yet.
      No apport report written because MaxReports is reached already
      dpkg: error processing libkrb5-3 (--configure):
       dependency problems - leaving unconfigured
      dpkg: dependency problems prevent configuration of libgssapi-krb5-2:
       libgssapi-krb5-2 depends on libk5crypto3 (>= 1.8+dfsg); however:
        Package libk5crypto3 is not configured yet.
       libgssapi-krb5-2 depends on libkrb5-3 (= 1.10+dfsg~beta1-2ubuntu0.3); however:
        Package libkrb5-3 is not configured yet.
      dpkg: error processing libgssapi-krb5-2 (--configure):
       dependency problems - leaving unconfigured
      dpkg: dependency problems prevent configuration of libgssrpc4:
       libgssrpc4 depends on libgssapi-krb5-2 (>= 1.10+dfsg~); however:
        Package libgssapi-krb5-2 is not configured yet.
      dpkg: error processing libgssrpc4 (--configure):
       dependency problems - leaving unconfigured
      No apport report written because MaxReports is reached already
      No apport report written because MaxReports is reached already
      No apport report written because MaxReports is reached already
      dpkg: dependency problems prevent configuration of libkadm5srv-mit8:
       libkadm5srv-mit8 depends on libgssapi-krb5-2 (>= 1.6.dfsg.2); however:
        Package libgssapi-krb5-2 is not configured yet.
       libkadm5srv-mit8 depends on libgssrpc4 (>= 1.6.dfsg.2); however:
        Package libgssrpc4 is not configured yet.
       libkadm5srv-mit8 depends on libk5crypto3 (>= 1.6.dfsg.2); however:
        Package libk5crypto3 is not configured yet.
       libkadm5srv-mit8 depends on libkrb5-3 (>= 1.9+dfsg~beta1); however:
        Package libkrb5-3 is not configured yet.
      dpkg: error processing libkadm5srv-mit8 (--configure):
       dependency problems - leaving unconfigured
      dpkg: dependency problems prevent configuration of libkadm5clnt-mit8:
       libkadm5clnt-mit8 depends on libgssapi-krb5-2 (>= 1.10+dfsg~); however:
        Package libgssapi-krb5-2 is not configured yet.
       libkadm5clnt-mit8 depends on libgssrpc4 (>= 1.6.dfsg.2); however:
        Package libgssrpc4 is not configured yet.
       libkadm5clnt-mit8 depends on libk5crypto3 (>= 1.6.dfsg.2); however:
        Package libk5crypto3 is not configured yet.
       libkadm5clnt-mit8 depends on libkrb5-3 (>= 1.8+dfsg); however:
        Package libkrb5-3 is not configured yet.
      No apport report written because MaxReports is reached already
      dpkg: error processing libkadm5clnt-mit8 (--configure):
       dependency problems - leaving unconfigured
      No apport report written because MaxReports is reached already
      dpkg: dependency problems prevent configuration of krb5-multidev:
       krb5-multidev depends on libkrb5-3 (= 1.10+dfsg~beta1-2ubuntu0.2); however:
        Version of libkrb5-3 on system is 1.10+dfsg~beta1-2ubuntu0.3.
       krb5-multidev depends on libk5crypto3 (= 1.10+dfsg~beta1-2ubuntu0.2); however:
        Version of libk5crypto3 on system is 1.10+dfsg~beta1-2ubuntu0.3.
       krb5-multidev depends on libgssapi-krb5-2 (= 1.10+dfsg~beta1-2ubuntu0.2); however:
        Version of libgssapi-krb5-2 on system is 1.10+dfsg~beta1-2ubuntu0.3.
       krb5-multidev depends on libgssrpc4 (= 1.10+dfsg~beta1-2ubuntu0.2); however:
        Version of libgssrpc4 on system is 1.10+dfsg~beta1-2ubuntu0.3.
       krb5-multidev depends on libkadm5srv-mit8 (= 1.10+dfsg~beta1-2ubuntu0.2); however:
        Version of libkadm5srv-mit8 on system is 1.10+dfsg~beta1-2ubuntu0.3.
       krb5-multidev depends on libkadm5clnt-mit8 (= 1.10+dfsg~beta1-2ubuntu0.2); however:
        Version of libkadm5clnt-mit8 on system is 1.10+dfsg~beta1-2ubuntu0.3.
      dpkg: error processing krb5-multidev (--configure):
       dependency problems - leaving unconfigured
      No apport report written because MaxReports is reached already
      dpkg: dependency problems prevent configuration of libkrb5-dev:
       libkrb5-dev depends on krb5-multidev (= 1.10+dfsg~beta1-2ubuntu0.2); however:
        Package krb5-multidev is not configured yet.
      dpkg: error processing libkrb5-dev (--configure):
       dependency problems - leaving unconfigured
      No apport report written because MaxReports is reached already
      dpkg: dependency problems prevent configuration of libkrb5-3:i386:
       libkrb5-3:i386 depends on libk5crypto3 (>= 1.9+dfsg~beta1); however:
        Package libk5crypto3:i386 is not configured yet.
      dpkg: error processing libkrb5-3:i386 (--configure):
       dependency problems - leaving unconfigured
      dpkg: dependency problems prevent configuration of libgssapi-krb5-2:i386:
       libgssapi-krb5-2:i386 depends on libk5crypto3 (>= 1.8+dfsg); however:
        Package libk5crypto3:i386 is not configured yet.
       libgssapi-krb5-2:i386 depends on libkrb5-3 (= 1.10+dfsg~beta1-2ubuntu0.3); however:
        Package libkrb5-3:i386 is not configured yet.
      dpkg: error processing libgssapi-krb5-2:i386 (--configure):
       dependency problems - leaving unconfigured
      Errors were encountered while processing:
       libk5crypto3 libk5crypto3:i386 libkrb5-3 libgssapi-krb5-2 libgssrpc4
       libkadm5srv-mit8 libkadm5clnt-mit8 krb5-multidev libkrb5-dev
       libkrb5-3:i386 libgssapi-krb5-2:i386
      E: Sub-process /usr/bin/dpkg returned an error code (1)

    I have tried to fix the broken dependencies via the Synaptic package manager, but it returns an error:

      E: libk5crypto3: libk5crypto3:amd64 1.10+dfsg~beta1-2ubuntu0.3 cannot be configured because libk5crypto3
      E: libkrb5-3: dependency problems - leaving unconfigured
      E: libgssapi-krb5-2: dependency problems - leaving unconfigured
      E: libgssrpc4: dependency problems - leaving unconfigured
      E: libkadm5srv-mit8: dependency problems - leaving unconfigured
      E: libkadm5clnt-mit8: dependency problems - leaving unconfigured
      E: krb5-multidev: dependency problems - leaving unconfigured
      E: libkrb5-dev: dependency problems - leaving unconfigured

    I haven't gotten help from ubuntuforums.org on this issue. Please help, obi-wan
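
    The output above shows a multiarch version skew: libk5crypto3:amd64 is at 1.10+dfsg~beta1-2ubuntu0.3 while libk5crypto3:i386 is still at 1.10+dfsg~beta1-2ubuntu0.1, and dpkg refuses to configure either half until both architectures carry the same version. A minimal sketch of the usual remedy, assuming the 0.3 build is available for both architectures in your archive (verify with apt-cache policy libk5crypto3:i386 before running):

      # Bring the i386 copy up to the same version as the amd64 copy
      sudo apt-get install libk5crypto3:i386=1.10+dfsg~beta1-2ubuntu0.3

      # Then let dpkg finish configuring the half-installed packages
      sudo dpkg --configure -a
      sudo apt-get install -f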


  • CodePlex Daily Summary for Monday, August 11, 2014

    Popular Releases

    • Space Engineers Server Manager: SESM V1.15: Updated Quartz library; corrected a bug in the new mod management; added a warning if you have backup enabled on a server but no static map configured.
    • Aspose for Apache POI: Missing Features of Apache POI SS - v 1.2: Release contains the missing features in the Apache POI SS SDK in comparison with Aspose.Cells. What's new? The following examples: Create Pivot Charts, Detect Merged Cells, Sort Data, Printing Workbooks. Feedback and suggestions: many more examples are available at Aspose Docs; raise your queries and suggest more examples via Aspose Forums or via this social coding site.
    • AngularGo (SPA Project Template): AngularGo.VS2013.vsix: First release.
    • Touchmote: Touchmote 1.0 beta 13: Changes: less GPU usage; works together with other Xbox 360 controls; bug fixes.
    • Public Key Infrastructure PowerShell module: PowerShell PKI Module v3.0: Important: I would like to hear more about what you think about the project. I appreciate that you like it (2000 downloads over the past 6 months), but maybe you have something to say? What do you dislike in the module? Maybe you would love to see some new functionality? Tell us what you think! Installation guide: use the default installation path to install this module for the current user only. To install this module for all users, enable the "Install for all users" check-box in the installation UI ...
    • Modern UI for WPF: Modern UI 1.0.6: The ModernUI assembly, including a demo app demonstrating the various features of Modern UI for WPF. BREAKING CHANGE: LinkGroup.GroupName renamed to GroupKey. NEW FEATURES: improved rendering on high-DPI screens, including support for per-monitor DPI awareness available in Windows 8.1 (see also Per-monitor DPI awareness); new ModernProgressRing control with 8 built-in styles; new LinkCommands.NavigateLink routed command; new Visual Studio project templates 'Modern UI WPF App' and 'Modern UI W...
    • ClosedXML - The easy way to OpenXML: ClosedXML 0.74.0: Multiple thread-safety improvements, including AdjustToContents, XLHelper, XLColor_Static, IntergerExtensions.ToStringLookup. An exception is now thrown when saving a workbook with no sheets, instead of creating a corrupt workbook. Fix for hyperlinks with non-ASCII characters. Added basic workbook protection. Fix for error thrown when a spreadsheet contained comments and images. Fix to Trim function. Fix for invalid operation exception thrown when the formula functions MAX, MIN, and AVG referenc...
    • SEToolbox: SEToolbox 01.042.019 Release 1: Added RadioAntenna broadcast name to ship name detail. Added two additional columns for asteroid material generation for Asteroid Fields. Added Mass and Block number columns to main display. Added ellipsis to some columns on main display to reduce name confusion. Added correct SE version number in file when saving. Re-added reattaching Motor when drag/dropping or importing ships (KeenSH have added RotorEntityId back in after removing it months ago). Added option to export and r...
    • jQuery List DragSort: jQuery List DragSort 0.5.2: Fixed scrollContainer, removing deprecated use of $.browser, so it should now work with the latest version of jQuery. Added the ability to return false in dragEnd to revert the sort order. Project changes: added NuGet package for dragsort (https://www.nuget.org/packages/dragsort); converted repository from SVN to Mercurial.
    • Braintree Client Library: Braintree 2.32.0: Allow credit card verification options to be passed outside of the nonce for PaymentMethod.create. Allow billingaddress parameters and billingaddress_id to be passed outside of the nonce for PaymentMethod.create. Add Subscriptions to paypal accounts. Add PaymentMethod.update. Add failonduplicatepaymentmethod option to PaymentMethod.create. Add support for dispute webhooks.
    • The Mario Kart 8 App: V1.0.2.1: First Codeplex release. WINDOWS INSTALLER ONLY.
    • Aspose Java for Docx4j: Aspose.Words vs Docx4j - v 1.0: Release contains the code comparison for features in the Docx4j SDK and Aspose.Words. What's new? The following examples: Accessing Document Properties, Add Bookmarks, Convert to Formats, Delete Bookmarks, Working with Comments. Feedback and suggestions: many more examples are available at Aspose Docs; raise your queries and suggest more examples via Aspose Forums or via this social coding site.
    • File System Security PowerShell Module: NTFSSecurity 2.4.1: Add-Access and Remove-Access now take multiple accounts.
    • YourSqlDba: YourSqlDba 5.2.1: This version improves the alert message that comes a while after you install the script. First it says to get it from YourSqlDba.CodePlex.com. If you don't want to update now, just re-run the script from your installed version. To see the actual running version, just execute install.PrintVersionInfo. You can go to source code / history and click on changeset 72957 to see changes in the script.
    • Manipulator: Manipulator: manipulator
    • XNB filetype plugin for Paint.NET: Paint.NET XNB plugin v0.4.0.0: CHANGELOG: reverted old incomplete changes; updated library for compatibility with Paint.NET 4; updated project to .NET 4.5; updated version to 0.4.0.0. INSTALLATION INSTRUCTIONS: extract the ZIP file to your Paint.NET\FileTypes folder.
    • EdiFabric: Release 4.1: Changed MessageContext.
    • Wix# (WixSharp) - managed interface for WiX: Release 1.0.0.0: Release 1.0.0.0. Custom UI, custom MSI dialog, custom CLR dialog, external UI.
    • Math.NET Numerics: Math.NET Numerics v3.2.0: Linear Algebra: Vector.Map2 (map2 in F#), storage-optimized. Linear Algebra: fix RemoveColumn/Row early index bound check (was not strict enough). Statistics: Entropy ~Jeff Mastry. Interpolation: use Array.BinarySearch instead of local implementation ~Candy Chiu. Resources: fix a corrupted exception message string. Portable build: support .NET 4.0 as well by using profile 328 instead of 344. .NET 3.5: F# extensions now support .NET 3.5 as well; NuGet package now contains pro...
    • babelua: 1.6.5.1: V1.6.5.1 - 2014.8.7. New feature: code formatting. Stability improvement: fixed a bug that popped up the error "System.Net.WebResponse EndGetResponse".

    New Projects

    • DouDou: a little project.
    • Dynamic MVC: Dynamically generate views from your model objects for a data-centric MVC application.
    • EasyDb - Simple Data Access: EasyDb is a simple library for data access that allows you to write less code.
    • ExpressToAbroad: just go!!!!!
    • Full Silverlight Web Video/Voice Conferencing: The goal of this project is to provide complete open-source (voice/video chatting client/server) modules using Silverlight.
    • Gaia: Gaia is an app for the Windows platform. Gaia is like Siri, Google Now, or Betty, but Gaia uses only text commands.
    • pxctest: pxctest
    • STACS: Career Management System for MIT by Team "STACS".
    • StrongWorld: StrongWorld.WebSuite
    • Xevas Tools: Xevas is a professional coders group of 'Nimbuzz'. We make all tools for worldwide users of nimbuzz at no cost.
    • (the remaining new-project names and descriptions in this summary are garbled in this copy)


  • CodePlex Daily Summary for Thursday, September 20, 2012

    Popular Releases

    • SiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.2020.421): New features: disable a specific part of the SiteMap to keep the data without displaying it in the CRM application - it simply comments out that XML part of the sitemap (thanks to rboyers for this feature request); right-click an item and click "Disable" to disable it; disabled items are greyed and a "- disabled" suffix is added; right-click an item and click "Enable" to enable it; refresh the list of web resources in the web resources picker.
    • WPF Animated GIF: WPF Animated GIF 1.2.1: Bug fixes. 1275: fixed rendering issues when DisposalMethod = 2 or 3.
    • AJAX Control Toolkit: September 2012 Release: AJAX Control Toolkit Release Notes - September 2012 release, version 60919. AJAX Control Toolkit .NET 4.5 - AJAX Control Toolkit for .NET 4.5 and sample site (recommended). AJAX Control Toolkit .NET 4 - AJAX Control Toolkit for .NET 4 and sample site (recommended). AJAX Control Toolkit .NET 3.5 - AJAX Control Toolkit for .NET 3.5 and sample site (recommended). Notes: the current version of the AJAX Control Toolkit is not compatible with ...
    • Lib.Web.Mvc & Yet another developer blog: Lib.Web.Mvc 6.1.0: Lib.Web.Mvc is a library which contains helper classes for ASP.NET MVC such as a strongly typed jqGrid helper, an XSL transformation HtmlHelper/ActionResult, a FileResult with range request support, custom attributes and more. Release contains: Lib.Web.Mvc.dll with XML documentation file; standalone documentation in chm file and change log; library source code. A sample application for the strongly typed jqGrid helper is available here. A sample application for the XSL transformation HtmlHelper/ActionRe...
    • Sense/Net CMS - Enterprise Content Management: SenseNet 6.1.2 Community Edition: Main new features: our current release brings a lot of bugfixes, including the resolution of js/css editing cache issues, xlsx file handling from Office, expense claim demo workspace fixes and much more. Besides fixes, 6.1.2 introduces workflow start options and other minor features like a reusable Reject client button for approval scenarios and resource editor enhancements. We have also fixed an issue with our install package to bring you a flawless installation...
    • WinRT XAML Toolkit: WinRT XAML Toolkit - 1.2.3: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For the compiled version use NuGet; you can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit. Features: AsyncUI extensions, controls and control extensions, converters, debugging helpers, imaging, IO helpers, VisualTree helpers, samples. Recent changes. NOTE: namespace changes. DebugConsol...
    • Python Tools for Visual Studio: 1.5 RC: PTVS 1.5 RC available! We're pleased to announce the release of Python Tools for Visual Studio 1.5 RC. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, edit/IntelliSense/debug/profile, Cloud, HPC, and IPython support. The primary new feature for the 1.5 release is Django, including Azure support! The http://www.djangoproject.com is a pop...
    • Launchbar: Launchbar 4.0.0: First public release.
    • AssaultCube Reloaded: 2.5.4: Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we try to package for those OSes; try to compile it, and if that fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compile your own for Linux (source included). Changelog: new logo; improved airstrike; reset nukes...
    • Extended WPF Toolkit: Extended WPF Toolkit - 1.7.0: Want an easier way to install the Extended WPF Toolkit? The Extended WPF Toolkit is available on NuGet. What's new in the 1.7.0 release? New controls: Zoombox, Pie. New features / bug fixes: PropertyGrid.ShowTitle property added to allow showing/hiding the PropertyGrid title; modifications to the PropertyGrid.EditorDefinitions collection will now automatically be applied to the PropertyGrid; modifications to the PropertyGrid.PropertyDefinitions collection will now be reflected automaticaly...
    • JayData - The cross-platform HTML5 data-management library for JavaScript: JayData 1.2: JayData is a unified data access library for JavaScript to CRUD + query data from different sources like OData, MongoDB, WebSQL, SqLite, Facebook or YQL. The library can be integrated with Knockout.js or Sencha Touch 2 and can be used on Node.js as well. See it in action in this 6-minute video. Sencha Touch 2 example app using JayData: Netflix browser. What's new in JayData 1.2: for details check the release notes. JayData core: all async operations now support promises. JayDa...
    • ????????API for .Net SDK: SDK for .Net Release 4 (2012-09-17): (release notes in Chinese, garbled in this copy) That's all.
    • VidCoder: 1.4.0 Beta: First Beta release! Catches up to HandBrake nightlies with SVN 4937. Added PGS (Blu-ray) subtitle support. Additional framerates available: 30, 50, 59.94, 60. Additional sample rates available: 8, 11.025, 12 and 16 kHz. Additional higher bitrates available for audio. Same as Source Constant Framerate available. Added Apple TV 3 preset. Added new Bob deinterlacing option. Introduced process isolation for encodes: now if HandBrake crashes, VidCoder will keep running and continue pro...
    • DNN Metro7 style Skin package: Metro7 style Skin for DotNetNuke 06.02.01: Stabilization release; fixed these issues: links not working on FF, Chrome and Safari. Modified packaging with its own manifest file for install and source packages. Moved the user image on the Login to the left side. Moved h2 font-size to 24px. Note: this release comes without a source package as we are still working on a solution; whoever needs the Visual Studio source files, please go to source and download them from there. Known: 16 CSS issues related to skin.css; all others are DNN default o...
    • Visual Studio Icon Patcher: Version 1.5.1: This fixes a bug in the 1.5 release where it would crash when no language packs were installed for VS2010.
    • VFPX: Desktop Alerts 1.0.2: This update for Desktop Alerts contains changes to the behavior for setting custom sounds for alerts. I have removed ALERTWAV.TXT from the project, and also removed DA_DEFAULTSOUND from the VFPALERT.H file. The AlertManager class and Alert class both have a "default" cSound of ADDBS(JUSTPATH(_VFP.ServerName))+"alert.wav" --- so, as long as you distribute a sound file with the file name "alert.wav" along with the EXE, that file will be used. You can set your own sound file globally by setti...
    • MCEBuddy 2.x: MCEBuddy 2.2.15: Changelog for 2.2.15 (32-bit and 64-bit): 1. Added support for %originalfilepath% to get the source file full path (used for custom commands only). 2. Added support for better parsing of Media Portal XML files to extract ShowName and Episode Name and download additional details from TVDB (like Season No, Episode No, etc). 3. Added support for TVDB seriesID in metadata. 4. Added support for eMail non-blocking UI test.
    • EmmaClient - Liveresults for Orienteering: EmmaClient 2012-09-13: Minor release with a small fix for producing OS2012 results (and status of runners in the forest).
    • Multiple Image choice custom field type: MultipleImageUpload V1.0: This is a custom field type which allows users to choose an image as a choice field. This custom field type is for SharePoint 2010; install the WSP through PowerShell or the Stsadm tool and enjoy the functionality...
    • MDS Administration: Version 1.1.3: Fixed Rename issue.

    New Projects

    • 3dxia: bug3dxia
    • Bitbucket Issue Tracker: A simple issue-tracking Windows client for your projects hosted on bitbucket.org.
    • C++ thread-safe logging: Visual Studio C++ log library project: add to your project for thread-safe logging capabilities.
    • Caddies GeoNote: The work started from making a vision for a neighbourhood communication platform, and ended up in creating version 1.0 of a mobile application - GeoNotes.
    • CodePlexGitHookForAzure: Test
    • Commerce Server Pipeline Log Analyzer: This tool reads and analyzes pipeline logs under one selected folder. It applies to Microsoft Commerce Server 2002, 2007, 2009 and 2009 R2 pipeline logs.
    • Contrib.Mod.ResetPassword: Send reset link as a shape.
    • Contrib.Taxonomies.ViewExtension: Orchard module that adds a filter box to the taxonomies selector.
    • EasierRdp: This is a remote desktop session management tool which provides an easy way to maintain multiple users' and servers' connections.
    • Economic news grabber: WCF service to get news from RSS, news sites, etc. WPF client for presenting this data to end users.
    • Eticaret Sitesi: eee tic
    • Facebook Graph API SDK Helper Class Library: Facebook C# Graph API SDK helper class. Under development.
    • fxch01v14: hello
    • Karned 2: Karned is a computerized fishing logbook. This software lets you record your catches for analysis, or simply as a keepsake... (translated from French)
    • lixotrash: SandBox and POCs collections, not interesting here.
    • LoggerLib: The project is a "Tracing Library" developed in a Borland C++ environment.
    • LyncTalker: A simple tray application which will speak incoming Lync instant messages.
    • MicroFrameWork: MicroFrameWork
    • Nuzzle: 2.6.5 Dofus Emulator
    • PDF Merge: PDF Merge is a simple user-friendly application that allows you to merge multiple PDF documents, including scanned/imported documents and images, into one PDF.
    • Pipeline: A library of several lightweight pipeline implementations ("pipes and filters" pattern).
    • Prime Calculator: PrimeCalculator factorizes a number or a math expression into its prime factors, or if prime displays its prime type [Unit, Prime, Additive, Pure].
    • Racing: not ready yet
    • Runtime DataSet/DataTable viewer: This component allows you to inspect the contents of any DataSet or DataTable at runtime without breaking into the debugger again and again.
    • Service billing: Student group work for the College of West Anglia UCWA.
    • Snake!: A Snake game written in C#.
    • SoccerBot: this is just a test project
    • SQL Server Trace File Import Utility: Command-line utility to import trace files into a data warehouse type structure. Currently it only handles Login events.
    • testscenairo7onv14: hello
    • ToQueryString: Serialize any object in C# to a query string with the .ToQueryString() extension method. Supports primitives, strings, arrays and collections.
    • tyajz: tyajz project
    • Windows Azure Table Storage: you can find all the details in my blog: hhaggan.wordpress.com, and if you do have any questions or inquiries feel free to contact me at hhaggan@hotmail.com
    • wtcms: wtcms


  • DBA Best Practices - A Blog Series: Episode 1 - Backups

    - by Argenis
    This blog post is part of the DBA Best Practices series, in which various topics of concern for daily database operations are discussed. Your feedback and comments are very much welcome, so please drop by the comments section and be sure to leave your thoughts on the subject.

    Morning Coffee

    When I was a DBA, the first thing I did when I sat down at my desk at work was check that all backups had completed successfully. It really was more of a ritual, since I had a dual system in place to check for backup completion: 1) the scheduled agent jobs to back up the databases were set to alert the NOC on failure, and 2) I had a script run from a central server every so often to check for any backup failures. Why the redundancy, you might ask. Well, for one, I was once bitten by the fact that database mail doesn't work 100% of the time. Potential causes for failure include issues on the SMTP box that relays your server email, firewall problems, DNS issues, etc. And so, to be sure that my backups completed fine, I needed to rely on a mechanism other than having the servers do the talking - I needed to interrogate the servers and ask each one if an issue had occurred. This is why I had a script run every so often. Some of you might have monitoring tools in place like Microsoft System Center Operations Manager (SCOM) or similar 3rd-party products that would track all these things for you. But at that moment, we had no option but to write our own PowerShell scripts to do it. Now, it goes without saying that if you don't have backups in place, you might as well find another career. Your most sacred job as a DBA is to protect the data from a disaster, and only properly safeguarded backups can offer you peace of mind here.

    "But, we have a cluster...we don't need backups"

    Sadly, I've heard this line more than I would have liked to. You need to understand that a cluster is built on shared storage, and that is precisely your single point of failure. A cluster will protect you from an issue at the Operating System level, and also from an outage of any SQL-related service or dependent devices. But it will most definitely NOT protect you against corruption, nor will it protect you against somebody deleting data from a table - accidentally or otherwise.

    Backup, fine. How often do I take a backup?

    The answer to this is something you will hear frequently when working with databases: it depends. What does it depend on? For one, you need to understand how much data your business is willing to lose. This is what's called the Recovery Point Objective, or RPO. If you don't know how much data your business is willing to lose, you need to have an honest and realistic conversation about data loss expectations with your customers, internal or external. From my experience, their first answer to the question "how much data loss can you withstand?" will be "zero". In that case, you will need to explain how zero data loss is very difficult and very costly to achieve, even in today's computing environments. Do you want to go ahead and take full backups of all your databases every hour, or even every day? Probably not, because of the impact that taking a full backup can have on a system. That's what differential and transaction log backups are for. Have I answered the question of how often to take a backup? No, and I did that on purpose. You also need to think about how much time you have to recover from any event that requires you to restore your databases. This is what's called the Recovery Time Objective, or RTO. Again, if you ask your customer how long an outage they can withstand, at first you will get a completely unrealistic number - and that will be your starting point for discussing a solution that is cost-effective. The point that I'm trying to get across is that you need to have a plan. This plan needs to be practiced and tested. Like a football playbook, you need to rehearse the moves you'll perform when the time comes. How often is up to you, and the objective is that you feel better about yourself and the steps you need to follow when an emergency strikes.

    A backup is nothing more than an untested restore

    Backups are files. Files are prone to corruption. Put those two together and realize how you feel about those backups sitting on that network drive. When was the last time you restored any of those? Restoring your backups on another box - which, by the way, doesn't have to match the specs of your production server - will give you two things: 1) peace of mind, because now you know that your backups are good, and 2) a place to offload your consistency checks with DBCC CHECKDB or any of the other DBCC commands like CHECKTABLE or CHECKCATALOG. This is a great strategy for VLDBs that cannot withstand the additional load created by the consistency checks. If you choose to offload your consistency checks to another server, though, be sure to run DBCC CHECKDB WITH PHYSICAL_ONLY on the production server, and if you're using SQL Server 2008 R2 SP1 CU4 and above, be sure to enable trace flags 2562 and/or 2549, which will speed up the PHYSICAL_ONLY checks further - you can read more about this enhancement here.

    Back to the "How Often" question for a second. If you have the disk, and the network latency, and the system resources to do so, why not back up the transaction log often? As in, every 5 minutes, or even less than that? There's not much downside to doing it, as you will have to clear the log with a backup sooner rather than later, lest you risk running out of space on your tlog, or even your drive. The one drawback to this approach is that you will have more files to deal with at restore time, and processing each file will add a bit of extra time to the entire process. But it might be worth that time knowing that you minimized the amount of data lost. Again, test your plan to make sure that it matches your particular needs.

    Where to back up to? Network share? Locally? SAN volume?

    This is another topic where everybody has a favorite choice. So, I'll stick to mentioning what I like to do and what I consider to be the best practice in this regard. I like to back up to a SAN volume, i.e., a drive that actually lives in the SAN and can be easily attached to another server in a pinch, saving you valuable time - you wouldn't need to restore files over the network (slow) or pull drives out of a dead server (been there, done that, it's also slow!). The key is to have a copy of those backup files made quickly, and, if at all possible, to a remote target in a different datacenter - or even the cloud. There are plenty of solutions out there that can help you put such a solution together. That right there is the first step towards a practical Disaster Recovery plan. But there's much more to DR, and that's material for a different blog post in this series.
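
    As a companion to the "script run from a central server" idea above, here is a minimal sketch of such a check. The author used PowerShell; this version is in Python with pyodbc, and the server list, connection options, and 24-hour threshold are illustrative assumptions, not part of the original post. It asks each server's msdb catalog for databases whose last full backup is missing or older than a day:

      # Hypothetical central backup check - a sketch, not the author's original script.
      import pyodbc

      SERVERS = ["sql01", "sql02"]  # assumption: put your instance names here

      QUERY = """
      SELECT d.name, MAX(b.backup_finish_date) AS last_full
      FROM sys.databases d
      LEFT JOIN msdb.dbo.backupset b
             ON b.database_name = d.name AND b.type = 'D'  -- 'D' = full database backup
      WHERE d.name <> 'tempdb'
      GROUP BY d.name
      HAVING MAX(b.backup_finish_date) IS NULL
          OR MAX(b.backup_finish_date) < DATEADD(HOUR, -24, GETDATE());
      """

      for server in SERVERS:
          # Interrogate each server directly instead of trusting its alerts.
          conn = pyodbc.connect(
              "DRIVER={ODBC Driver 17 for SQL Server};SERVER=" + server +
              ";Trusted_Connection=yes;", timeout=10)
          for name, last_full in conn.execute(QUERY):
              print(server + "/" + name + ": last full backup " + str(last_full or "NEVER"))
          conn.close()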


  • Delivering the Integrated Portal Experience!

    - by Michael Snow
    Guest post by Richard Maldonado, Principal Product Manager, Oracle WebCenter Portal

    Organizations are still struggling to standardize on a user interaction platform which can meet the needs of all their target audiences. This has not only resulted in inefficient and inconsistent experiences for their users, but it also creates inefficiencies (productivity and costs) for the departments that manage the applications and information systems. Portals have historically been the unifying platform that provides IT with a common interface which can securely surface the most relevant interactions for a given user and/or group of users. However, organizations have found that the technologies available have either not provided the flexibility necessary to address all of their use cases, or they rely too much on IT resources to manage, maintain, and evolve.

    Empowering the Business Groups

    The core issue that IT departments face with delivering portal experiences is having enough resources to respond to and address the influx of requirements which come in from the business. Commonly, when a business group wants a new portal site established for their group, they submit a request to the IT department, which then assigns the work to an administrator and/or developer. Unfortunately, this approach is not scalable; it can be a time-consuming activity which requires significant interaction between the business owner and the IT resource. A modern user interaction platform should empower the business groups by providing them tools which they can use to build and manage the portal experiences without the need for IT's involvement. And because business groups rarely have technical resources (developers) on staff, the tools must be easy enough that virtually any business user could use them. In addition, the tools must be powerful enough to allow them to build the experience that they need: creating a whole new portal, adding and managing pages and page hierarchy, managing user/group access, adding and modifying components within a page, and so on. This balance between ease-of-use and flexibility is key to the successful adoption of tools which will ultimately reduce the burden on IT, respond to the needs of the business, and deliver high-value experiences for the users.

    Ready or Not, Here They Come: Smartphones and Tablets

    Recently, several studies have highlighted that smartphone and tablet-style devices have overtaken PCs in both sales and usage. This shift is further driving organizations to re-evaluate how they're delivering data, information, and applications to their users. Users expect to get the same level of access and interaction, but in ways which are optimized for the capabilities of the device that they are using.

    Expect More

    With the ever-growing number of new IT projects and flat or shrinking budgets, organizations are looking for comprehensive solutions which can deliver integrated web experiences that are tailored for the users and optimized for mobile devices. Piecing together a number of point solutions is no longer an option. A modern portal technology should not only address the traditional needs of integrating and surfacing back-end applications and information, but it should also enable the business through easy-to-use tools and accelerate the delivery of mobile-optimized experiences.

    WebCenter in Action Series: Qualcomm Provides a Seamless Experience for Customers with Oracle WebCenter, featuring Qualcomm & Keste


  • Cancelling Route Navigation in AngularJS Controllers

    - by dwahlin
    If you’re new to AngularJS check out my AngularJS in 60-ish Minutes video tutorial or download the free eBook. Also check out The AngularJS Magazine for up-to-date information on using AngularJS to build Single Page Applications (SPAs).

    Routing provides a nice way to associate views with controllers in AngularJS using a minimal amount of code. While a user is normally able to navigate directly to a specific route, there may be times when a user triggers a route change before they’ve finalized an important action such as saving data. In these types of situations you may want to cancel the route navigation and ask the user if they’d like to finish what they were doing so that their data isn’t lost. In this post I’ll talk about a technique that can be used to accomplish this type of routing task.

    The $locationChangeStart Event

    When route navigation occurs in an AngularJS application a few events are raised. One is named $locationChangeStart and the other is named $routeChangeStart (there are other events as well). At the current time (version 1.2) $routeChangeStart doesn’t provide a way to cancel route navigation; however, the $locationChangeStart event can be used to cancel navigation. If you dig into the AngularJS core script you’ll find the following code that shows how the $locationChangeStart event is raised as the $browser object’s onUrlChange() function is invoked:

      $browser.onUrlChange(function (newUrl) {
          if ($location.absUrl() != newUrl) {
              if ($rootScope.$broadcast('$locationChangeStart', newUrl,
                      $location.absUrl()).defaultPrevented) {
                  $browser.url($location.absUrl());
                  return;
              }
              $rootScope.$evalAsync(function () {
                  var oldUrl = $location.absUrl();
                  $location.$$parse(newUrl);
                  afterLocationChange(oldUrl);
              });
              if (!$rootScope.$$phase) $rootScope.$digest();
          }
      });

    The key part of the code is the call to $broadcast. This call broadcasts the $locationChangeStart event to all child scopes so that they can be notified before a location change is made. To handle the $locationChangeStart event you can use the $rootScope.$on() function. For this example I’ve added a call to $on() into a function that is called immediately after the controller is invoked:

      function init() {
          //initialize data here..

          //Make sure they're warned if they made a change but didn't save it
          //Call to $on returns a "deregistration" function that can be called to
          //remove the listener (see routeChange() for an example of using it)
          onRouteChangeOff = $rootScope.$on('$locationChangeStart', routeChange);
      }

    This code listens for the $locationChangeStart event and calls routeChange() when it occurs. The value returned from calling $on() is a “deregistration” function that can be called to detach from the event. In this case the deregistration function is named onRouteChangeOff (it’s accessible throughout the controller). You’ll see how the onRouteChangeOff function is used in just a moment.
    Cancelling Route Navigation

    The routeChange() callback triggered by the $locationChangeStart event displays a modal dialog to prompt the user. Here’s the code for routeChange():

      function routeChange(event, newUrl) {
          //Navigate to newUrl if the form isn't dirty
          if (!$scope.editForm.$dirty) return;

          var modalOptions = {
              closeButtonText: 'Cancel',
              actionButtonText: 'Ignore Changes',
              headerText: 'Unsaved Changes',
              bodyText: 'You have unsaved changes. Leave the page?'
          };

          modalService.showModal({}, modalOptions).then(function (result) {
              if (result === 'ok') {
                  onRouteChangeOff(); //Stop listening for location changes
                  $location.path(newUrl); //Go to the page they're interested in
              }
          });

          //prevent navigation by default since we'll handle it
          //once the user selects a dialog option
          event.preventDefault();
          return;
      }

    Looking at the parameters of routeChange() you can see that it accepts an event object and the new route that the user is trying to navigate to. The event object is used to prevent navigation since we need to prompt the user before leaving the current view. Notice the call to event.preventDefault() at the end of the function. The modal dialog is shown by calling modalService.showModal() (see my previous post for more information about the custom modalService that acts as a wrapper around Angular UI Bootstrap’s $modal service). If the user selects “Ignore Changes” then their changes will be discarded and the application will navigate to the route they intended to go to originally. This is done by first detaching from the $locationChangeStart event by calling onRouteChangeOff() (recall that this is the function returned from the call to $on()) so that we don’t get stuck in a never-ending cycle where the dialog continues to display when they click the “Ignore Changes” button. A call is then made to $location.path(newUrl) to handle navigating to the target view. If the user cancels the operation they’ll stay on the current view.

    Conclusion

    The key to canceling routes is understanding how to work with the $locationChangeStart event and cancelling it so that route navigation doesn’t occur. I’m hoping that in the future the same type of task can be done using the $routeChangeStart event, but for now this code gets the job done. You can see this code in action in the Customer Manager application available on GitHub (specifically the customerEdit view). Learn more about the application here.


  • Windows Azure – Write, Run or Use Software

    - by BuckWoody
    Windows Azure is a platform that has you covered, whether you need to write software, run software that is already written, or install and use “canned” software, whether you or someone else wrote it. Like any platform, it’s a set of tools you can use where it makes sense to solve a problem. The primary location for Windows Azure information is http://windowsazure.com. You can find everything there, from the development kits for writing software to pricing, licensing and tutorials on all of that. I have a few links here for learning to use Windows Azure – although it’s best if you focus not on the tools, but on what you want to solve. I’ve broken it down here into various sections, so you can quickly locate the things you want to know. I’ll include resources here from Microsoft and elsewhere – I use these same resources in the Architectural Design Sessions (ADS) I do with my clients worldwide.

    Write Software

    Also called “Platform as a Service” (PaaS), Windows Azure has lots of components you can use together or separately that allow you to write software in .NET or various Open Source languages to work completely online, or in partnership with code you have on-premises, or both – even if you’re using other cloud providers. Keep in mind that all of the features you see here can be used together, or independently. For instance, you might only use a Web Site, or use Storage, but you can also use both together. You can access all of these components through standard REST API calls, or using our Software Development Kits' APIs, which are a lot easier. In any case, you simply use Visual Studio, Eclipse, Cloud9 IDE, or even a text editor to write your code from a Mac, PC or Linux machine. Components you can use:

    • Azure Web Sites: Windows Azure Web Sites allow you to quickly write and deploy websites, without setting up a Virtual Machine, installing a web server or configuring complex settings. They work alone, with other Windows Azure Web Sites, or with other parts of Windows Azure.

    • Web and Worker Roles: Windows Azure Web Roles give you a full stateless computing instance with Internet Information Services (IIS) installed and configured. Windows Azure Worker Roles give you a full stateless computing instance without IIS installed, often used in a "services" mode. Scale-out is achieved either manually or programmatically under your control.

    • Storage: Windows Azure Storage types include Blobs to store raw binary data, Tables for key/value pair data (like NoSQL data structures), Queues that allow interaction between stateless roles, and a relational SQL Server database. (A minimal queue sketch follows at the end of this post.)

    • Other Services: Windows Azure has many other services, such as a security mechanism, a Cache (memcacheD compliant), a Service Bus, a Traffic Manager and more. Once again, these features can be used with a Windows Azure project, or alone based on your needs.

    • Various Languages: Windows Azure supports the .NET stack of languages, as well as many Open-Source languages like Java, Python, PHP, Ruby, NodeJS, C++ and more.

    Use Software

    Also called “Software as a Service” (SaaS), this often means consumer or business-level software like Hotmail or Office 365. In other words, you simply log on, use the software, and log off – there’s nothing to install, and little to even configure. For the Information Technology professional, however, it’s not quite the same. We want software that provides services, but in a platform. That means we want things like Hadoop or other software we don’t want to have to install and configure. Components you can use:

    • Kits: Various software “kits” or packages are supported with just a few clicks, such as Umbraco, Wordpress, and others.

    • Windows Azure Media Services: Windows Azure Media Services is a suite of services that allows you to upload media for encoding, processing and even streaming – or even one or more of those functions. We can add DRM and even commercials to your media if you like. Windows Azure Media Services is used to stream everything from large events all the way down to small training videos.

    • High Performance Computing and “Big Data”: Windows Azure allows you to scale to huge workloads using a few clicks to deploy Hadoop Clusters or High Performance Computing (HPC) nodes, accepting HPC Jobs, Pig and Hive Jobs, and even interfacing with Microsoft Excel.

    • Windows Azure Marketplace: Windows Azure Marketplace offers data and programs you can quickly implement and use – some free, some for-fee.

    Run Software

    Also known as “Infrastructure as a Service” (IaaS), this offering allows you to build or simply choose a Virtual Machine to run server-based software. Components you can use:

    • Persistent Virtual Machines: You can choose to install Windows Server, Windows Server with Active Directory, with SQL Server, or even SharePoint from a pre-configured gallery. You can configure your own server images with standard Hyper-V technology and load them yourselves – and even bring them back when you’re done. As a new offering, we also allow you to select various distributions of Linux – a first for Microsoft.

    • Windows Azure Connect: You can connect your on-premises networks to Windows Azure instances.

    • Storage: Windows Azure Storage can be used as a remote backup, a hybrid storage location and more, using software or even hardware appliances.

    Decision Matrix

    With all of these options, you can use Windows Azure to solve just about any computing problem. It’s often hard to know when to use something on-premises, when to use the cloud, and what kind of service to use. I’ve used a decision matrix in the last couple of years to take a particular problem and choose the proper technology to solve it. It’s all about options – there is no “silver bullet”, whether that’s Windows Azure or any other set of functions. I take the problem, decide which particular component I want to own and control – and choose the column that has that box darkened. For instance, if I have to control the wiring for a solution (a requirement in some military and government installations), that means the “Networking” component needs to be dark, and so I select the “On Premises” column for that particular solution. If I just need the solution provided and I want no control at all, I can look at “Software as a Service” solutions.

    Security, Pricing, and Other Info

    • Security: Security is one of the first questions you should ask in any distributed computing environment. We have certification info, coding guidelines and more, even a general “Request for Information” (RFI) response already created for you.

    • Pricing: Are there licenses? How much does this cost? Is there a way to estimate the costs in this new environment?

    • New Features: Many new features were added to Windows Azure – a good roundup of those changes can be found here.

    • Support: Software support on Virtual Machines, general support.

    Read the article

  • DBA Best Practices - A Blog Series: Episode 1 - Backups

    - by Argenis
      This blog post is part of the DBA Best Practices series, in which various topics of concern for daily database operations are discussed. Your feedback and comments are very much welcome, so please drop by the comments section and be sure to leave your thoughts on the subject.

Morning Coffee
When I was a DBA, the first thing I did when I sat down at my desk at work was check that all backups had completed successfully. It really was more of a ritual, since I had a dual system in place to check for backup completion: 1) the scheduled Agent jobs that backed up the databases were set to alert the NOC on failure, and 2) I had a script run from a central server every so often to check for any backup failures. Why the redundancy, you might ask. Well, for one, I was once bitten by the fact that Database Mail doesn't work 100% of the time. Potential causes for failure include issues on the SMTP box that relays your server email, firewall problems, DNS issues, etc. And so, to be sure that my backups completed fine, I needed to rely on a mechanism other than having the servers do the talking - I needed to interrogate the servers and ask each one if an issue had occurred. This is why I had a script run every so often. Some of you might have monitoring tools in place, like Microsoft System Center Operations Manager (SCOM) or similar 3rd party products, that would track all these things for you. But at that moment, we had no recourse but to write our own PowerShell scripts to do it. Now, it goes without saying that if you don't have backups in place, you might as well find another career. Your most sacred job as a DBA is to protect the data from a disaster, and only properly safeguarded backups can offer you peace of mind here.

"But, we have a cluster... we don't need backups"
Sadly, I've heard this line more often than I would have liked. You need to understand that a cluster is built on shared storage, and that is precisely your single point of failure. A cluster will protect you from an issue at the Operating System level, and also during an outage of any SQL-related service or dependent device. But it will most definitely NOT protect you against corruption, nor will it protect you against somebody deleting data from a table - accidentally or otherwise.

Backup, fine. How often do I take a backup?
The answer to this is something you will hear frequently when working with databases: it depends. What does it depend on? For one, you need to understand how much data your business is willing to lose. This is what's called the Recovery Point Objective, or RPO. If you don't know how much data your business is willing to lose, you need to have an honest and realistic conversation about data loss expectations with your customers, internal or external. From my experience, their first answer to the question "how much data loss can you withstand?" will be "zero". In that case, you will need to explain how zero data loss is very difficult and very costly to achieve, even in today's computing environments. Do you want to go ahead and take full backups of all your databases every hour, or even every day? Probably not, because of the impact that taking a full backup can have on a system. That's what differential and transaction log backups are for. Have I answered the question of how often to take a backup? No, and I did that on purpose. You also need to think about how much time you have to recover from any event that requires you to restore your databases. This is what's called the Recovery Time Objective, or RTO.
Again, if you go ask your customer how long of an outage they can withstand, at first you will get a completely unrealistic number - and that will be your starting point for discussing a solution that is cost effective. The point that I'm trying to get across is that you need to have a plan. This plan needs to be practiced, and tested. Like a football playbook, you need to rehearse the moves you'll perform when the time comes. How often is up to you, and the objective is that you feel better about yourself and the steps you need to follow when emergency strikes.

A backup is nothing more than an untested restore
Backups are files. Files are prone to corruption. Put those two together and realize how you feel about those backups sitting on that network drive. When was the last time you restored any of those? Restoring your backups on another box - which, by the way, doesn't have to match the specs of your production server - will give you two things: 1) peace of mind, because now you know that your backups are good, and 2) a place to offload your consistency checks with DBCC CHECKDB or any of the other DBCC commands like CHECKTABLE or CHECKCATALOG. This is a great strategy for VLDBs that cannot withstand the additional load created by the consistency checks. If you choose to offload your consistency checks to another server, though, be sure to run DBCC CHECKDB WITH PHYSICAL_ONLY on the production server, and if you're using SQL Server 2008 R2 SP1 CU4 and above, be sure to enable trace flags 2562 and/or 2549, which will speed up the PHYSICAL_ONLY checks further - you can read more about this enhancement here.

Back to the "How Often" question for a second. If you have the disk, the network latency, and the system resources to do so, why not back up the transaction log often? As in, every 5 minutes, or even less than that? There's not much downside to doing it, as you will have to clear the log with a backup sooner or later, lest you risk running out of space on your tlog, or even your drive. The one drawback to this approach is that you will have more files to deal with at restore time, and processing each file will add a bit of extra time to the entire process. But it might be worth that time, knowing that you minimized the amount of data lost. Again, test your plan to make sure that it matches your particular needs.

Where to back up to? Network share? Locally? SAN volume?
This is another topic where everybody has a favorite choice. So I'll stick to mentioning what I like to do, and what I consider to be the best practice in this regard. I like to back up to a SAN volume, i.e., a drive that actually lives in the SAN and can be easily attached to another server in a pinch, saving you valuable time - you wouldn't need to restore files over the network (slow) or pull drives out of a dead server (been there, done that, it’s also slow!). The key is to have a copy of those backup files made quickly, and, if at all possible, to a remote target in a different datacenter - or even the cloud. There are plenty of solutions out there that can help you put such a solution together. That right there is the first step towards a practical Disaster Recovery plan. But there's much more to DR, and that's material for a different blog post in this series.
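
The log-backup cadence and the offloaded consistency checks described above are easy to sketch in T-SQL. What follows is a minimal illustration, not a production script; the database name (SalesDB), backup paths and logical file names are hypothetical, so adjust them to your environment:

    -- Production server: clear the log frequently (scheduled, e.g., every 5 minutes via an Agent job).
    BACKUP LOG SalesDB
        TO DISK = N'R:\Backups\SalesDB_log.trn'
        WITH COMPRESSION, CHECKSUM;

    -- Production server: keep only the lightweight physical check here.
    -- Trace flags 2549/2562 (SQL Server 2008 R2 SP1 CU4 and above) speed up the PHYSICAL_ONLY scan.
    DBCC TRACEON (2549, 2562, -1);
    DBCC CHECKDB (SalesDB) WITH PHYSICAL_ONLY;

    -- Scratch server: prove the backup actually restores, then run the full check there.
    RESTORE DATABASE SalesDB_Verify
        FROM DISK = N'R:\Backups\SalesDB_full.bak'
        WITH MOVE N'SalesDB'     TO N'D:\Data\SalesDB_Verify.mdf',
             MOVE N'SalesDB_log' TO N'D:\Data\SalesDB_Verify_log.ldf',
             CHECKSUM, STATS = 10;
    DBCC CHECKDB (SalesDB_Verify) WITH NO_INFOMSGS, ALL_ERRORMSGS;

A restore that completes plus a clean CHECKDB on the scratch box is the "tested restore" the heading above asks for; the production server only ever pays for the PHYSICAL_ONLY pass.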

    Read the article

  • Advice for a distracted, unhappy, recently graduated programmer? [closed]

    - by Re-Invent
    I graduated 4 months ago. I had offers from a few good places to work at. At the same time, I wanted to stick to building a small software business of my own; I still have some ideas with good potential, and some half-done projects frozen in my GitHub. But due to social pressures I chose a job. The pay is great, but I am only half-passionate about it: a small team of smart folks building a useful product, working on contracts across the world. I've started finding it extremely boring. Boring to the extent that I skip 2-3 days a week altogether, not doing work. Neither do I spend that time progressing any of my own projects. Yes, I feel stupid at the way I'm wasting time, but I don't understand exactly why it is happening. It's as if all the excitement has been drained. What can I do about it?

Long version:

School - I was in third standard. Only students from 6th grade up had access to the computer labs. I once peeked into the lab through the little door opening. No hard disks; MS-DOS on 5 1/4 inch floppies. I asked a senior student to play some sound in BASIC. He used PLAY to compose a tune. Boy! I was so excited, I was jumping from within. Back home, I asked my brother to teach me some programming. We bought a book, "MODERN All About GW-BASIC for Schools & Colleges". The book had everything, right from printing, to taking input, file I/O, game programming, machine-level support, etc. In 6th standard I wrote my first game - a wheel of fortune, which rotated the wheel by manipulating the 16-color palette's definition. We got internet soon, and I got hooked on the QuickBASIC programming community. I made some more games, "007 in Danger" and "Car Crush 2", for submission to the allbasiccode archives. I was extremely excited about all this. My interests then swayed toward "hacking" (computer security). I taught myself some Perl, found it annoying, then learnt PHP and a bit of SQL. I also taught myself Visual Basic one winter and wrote a Pacman clone with DirectX. By the time I was in 10th standard, I had created some evil tools using Visual Basic, PHP and MySQL, and eventually landed an unpaid side-job at a government facility, building evil tools for them. It was a dream come true for crackers of that time, and I was still very excited. Things changed soon: the last two years of school were not so great, as I was balancing prep for college, work at the govt. facility, and studies for school at the same time.

College - College was the opposite of everything I had wished it to be. I had imagined it as a place where I'd spend my 4 years building something awesome. It was rather an epitome of rote learning, attendance, rules, busy schedules, a ban on personal laptops, hardly any hackers surrounding you, and shit like that. We had to take permission even to introduce some cultural/creative activities into our annual schedule. The labs wouldn't open on weekends because the lab employees had to have their leave. Yes, a horrible place for someone like me. I still managed to pull off a project with a friend over 2 months. We showed it to people high in the academic hierarchy. They were immensely impressed, and we proposed allowing personal computers for students. They made up half-assed reasons and didn't agree. We felt frustrated. And so on. I still managed to teach myself new languages, do new projects of my own, do an internship at the same govt. facility, start a small business for some time, give a talk at a conference I'm passionate about, win game-dev and hacking contests at the most respected colleges, solve a good deal of programming contest problems, etc.

At the same time, I was not content with all these restrictions, the great emphasis on rote learning, and the sheer wastage of time due to college. I never felt I was overdoing it, but now I feel I burnt myself out. During my last days at college, I did an internship at a bigco. While I spent my time building prototypes for a certain LBS, one of the other interns around me, a good friend even, was just idling his time away. I thought that maybe in a few weeks he would put some serious effort into the work assigned to him, but all he did was find creative ways to skip work, hide his face from the manager, engage people in talk if they tried to question his progress, etc. I tried a few times to get him on track, but it seems all he wanted was to "not work hard at all and still reap the fruits". I don't know how others take such people, but I find their vicinity very, very poisonous to one's own motivation and productivity. On top of that, where I come from, HR doesn't give much value to what you have done over the past 4 years. So towards the end of our internship, we were all offered work at the bigco, but the slacker, even after not writing more than 200 lines of code, was made a much better offer. I felt enraged instantly - "Is this how the corporate world treats someone who has done fruitful, if not extraordinary, work for them for the past 6 months?". Yes, I did try to negotiate and debate. The bigcos seem blind due to departmentalization of responsibilities and many layers of management. I decided not to stay in touch with any characters of that depressing play.

Probably the busy time I had at college - ignoring friends, ignoring fun, and squeezing every bit of free time for myself - is also responsible. Probably this is what has drained all my willingness to work for anyone. I find my day job boring; at the same time I wish to keep it for financial reasons. I feel a bit burnt out and unsatisfied, and at the same time feel an urge to quit working for someone else and start finishing my frozen side-projects (which may be profitable). Though I haven't got much in savings to support myself with food, office and internet bills, etc. I still have my day job, but I don't find it very interesting; even though the pay is higher than the slacker's, I don't find money to be a great motivator here. I keep comparing myself to my past self. I wonder how to get rid of this and reboot myself back to the way I was in my school days - excited about it all, tinkering, building, learning new things daily, and NOT BORED?

    Read the article

  • Wireless networks are not detected at start up in Ubuntu 12.04

    - by Kanhaiya Mishra
    I recently (three or four days ago) installed Ubuntu 12.04 via the Windows installer, i.e. wubi.exe. After the installation completed, wireless and Ethernet were both working well. But after a restart, wireless networks didn't show up, even though both networking and wireless were enabled in the network manager. It did sometimes show the available networks after boot, but very rarely. So I went through various posts regarding wireless issues in Ubuntu 12.04 and tried many things, but ended up with nothing satisfactory. I have a Broadcom BCM4313 wireless LAN controller and the brcmsmac driver. Relying on some suggestions, I then tried to install the bcm-wl driver, but couldn't install it due to an error in the jockey.log file. Then I tried a fresh installation of the same driver, but still couldn't resolve the startup issues with wireless. Then again I reinstalled Ubuntu inside Windows using the wubi installer. This time the same problem occurred after boot, but this time I successfully installed the wl driver before disturbing Ubuntu's file-system files. Again, the same issue. This time I noticed some new things: if I insert the Ethernet/LAN cable before startup, then wireless networks are available, and of course LAN (wired) networks also work. But if I don't plug in the cable before startup and then plug it in after startup, it detects neither the Ethernet network nor wireless. So I hadn't noticed before that LAN, along with wifi, also doesn't work after startup. But if I suspend the session, let it sleep, and log in again, it works. I have tried this every time, and WLAN then works perfectly. But I am still unable to resolve the startup problem. Each time I boot, I first have to suspend the machine once; only then are networks available. It irritates me every time I reboot/boot my laptop. So please help me out of this problem. Any ideas/help regarding this issue would be highly appreciated. Some of the commands I ran gave the following results:

# lspci
00:00.0 Host bridge: Intel Corporation Core Processor DRAM Controller (rev 12)
00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 12)
00:16.0 Communication controller: Intel Corporation 5 Series/3400 Series Chipset HECI Controller (rev 06)
00:1a.0 USB controller: Intel Corporation 5 Series/3400 Series Chipset USB2 Enhanced Host Controller (rev 06)
00:1b.0 Audio device: Intel Corporation 5 Series/3400 Series Chipset High Definition Audio (rev 06)
00:1c.0 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 1 (rev 06)
00:1c.1 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 2 (rev 06)
00:1c.5 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 6 (rev 06)
00:1d.0 USB controller: Intel Corporation 5 Series/3400 Series Chipset USB2 Enhanced Host Controller (rev 06)
00:1e.0 PCI bridge: Intel Corporation 82801 Mobile PCI Bridge (rev a6)
00:1f.0 ISA bridge: Intel Corporation Mobile 5 Series Chipset LPC Interface Controller (rev 06)
00:1f.2 SATA controller: Intel Corporation 5 Series/3400 Series Chipset 6 port SATA AHCI Controller (rev 06)
00:1f.3 SMBus: Intel Corporation 5 Series/3400 Series Chipset SMBus Controller (rev 06)
00:1f.6 Signal processing controller: Intel Corporation 5 Series/3400 Series Chipset Thermal Subsystem (rev 06)
03:00.0 Network controller: Broadcom Corporation BCM4313 802.11b/g/n Wireless LAN Controller (rev 01)
04:00.0 Ethernet controller: Atheros Communications Inc. AR8152 v1.1 Fast Ethernet (rev c1)
ff:00.0 Host bridge: Intel Corporation Core Processor QuickPath Architecture Generic Non-core Registers (rev 02)
ff:00.1 Host bridge: Intel Corporation Core Processor QuickPath Architecture System Address Decoder (rev 02)
ff:02.0 Host bridge: Intel Corporation Core Processor QPI Link 0 (rev 02)
ff:02.1 Host bridge: Intel Corporation Core Processor QPI Physical 0 (rev 02)
ff:02.2 Host bridge: Intel Corporation Core Processor Reserved (rev 02)
ff:02.3 Host bridge: Intel Corporation Core Processor Reserved (rev 02)

# sudo lshw -C network
  *-network
       description: Wireless interface
       product: BCM4313 802.11b/g/n Wireless LAN Controller
       vendor: Broadcom Corporation
       physical id: 0
       bus info: pci@0000:03:00.0
       logical name: eth1
       version: 01
       serial: 70:f1:a1:49:b6:ab
       width: 64 bits
       clock: 33MHz
       capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless
       configuration: broadcast=yes driver=wl0 driverversion=5.100.82.38 ip=192.168.1.7 latency=0 multicast=yes wireless=IEEE 802.11
       resources: irq:17 memory:f0500000-f0503fff
  *-network
       description: Ethernet interface
       product: AR8152 v1.1 Fast Ethernet
       vendor: Atheros Communications Inc.
       physical id: 0
       bus info: pci@0000:04:00.0
       logical name: eth0
       version: c1
       serial: b8:ac:6f:6b:f7:4a
       capacity: 100Mbit/s
       width: 64 bits
       clock: 33MHz
       capabilities: pm msi pciexpress vpd bus_master cap_list ethernet physical tp 10bt 10bt-fd 100bt 100bt-fd autonegotiation
       configuration: autonegotiation=on broadcast=yes driver=atl1c driverversion=1.0.1.0-NAPI firmware=N/A latency=0 link=no multicast=yes port=twisted pair
       resources: irq:44 memory:f0400000-f043ffff ioport:2000(size=128)

# lsmod | grep wl
wl                   2568210  0
lib80211               14381  2 lib80211_crypt_tkip,wl

# sudo iwlist eth1 scanning
eth1      Scan completed :
          Cell 01 - Address: 30:46:9A:85:DA:9A
                    ESSID:"BH DASHIR 2"
                    Mode:Managed
                    Frequency:2.462 GHz (Channel 11)
                    Quality:4/5  Signal level:-60 dBm  Noise level:-98 dBm
                    IE: IEEE 802.11i/WPA2 Version 1
                        Group Cipher : CCMP
                        Pairwise Ciphers (1) : CCMP
                        Authentication Suites (1) : PSK
                    IE: Unknown: DD7F0050F204104A00011010440001021041000100103B000103104700109AFE7D908F8E2D381860668BA2E8D8771021000D4E4554474541522C20496E632E10230009574752363134763130102400095747523631347631301042000538333235381054000800060050F204000110110009574752363134763130100800020084
                    Encryption key:on
                    Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 18 Mb/s
                              24 Mb/s; 36 Mb/s; 54 Mb/s; 6 Mb/s; 9 Mb/s
                              12 Mb/s; 48 Mb/s
          Cell 02 - Address: C0:3F:0E:EB:45:14
                    ESSID:"BH DASHIR 3"
                    Mode:Managed
                    Frequency:2.462 GHz (Channel 11)
                    Quality:2/5  Signal level:-71 dBm  Noise level:-98 dBm
                    IE: IEEE 802.11i/WPA2 Version 1
                        Group Cipher : CCMP
                        Pairwise Ciphers (1) : CCMP
                        Authentication Suites (1) : PSK
                    IE: Unknown: DD7F0050F204104A00011010440001021041000100103B00010310470010F3C9BBE499D140540F530E7EBEDE2F671021000D4E4554474541522C20496E632E10230009574752363134763130102400095747523631347631301042000538333235381054000800060050F204000110110009574752363134763130100800020084
                    Encryption key:on
                    Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 18 Mb/s
                              24 Mb/s; 36 Mb/s; 54 Mb/s; 6 Mb/s; 9 Mb/s
                              12 Mb/s; 48 Mb/s
          Cell 03 - Address: A0:21:B7:A8:2F:C0
                    ESSID:"BH DASHIR 4"
                    Mode:Managed
                    Frequency:2.422 GHz (Channel 3)
                    Quality:1/5  Signal level:-86 dBm  Noise level:-98 dBm
                    IE: IEEE 802.11i/WPA2 Version 1
                        Group Cipher : CCMP
                        Pairwise Ciphers (1) : CCMP
                        Authentication Suites (1) : PSK
                    IE: Unknown: DD8B0050F204104A0001101044000102103B0001031047001000000000000010000000A021B7A82FC01021000D4E6574676561722C20496E632E10230009574E523130303076321024000456324831104200046E6F6E651054000800060050F20400011011001B574E5231303030763228576972656C6573732041502D322E344729100800020086103C000103
                    Encryption key:on
                    Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 6 Mb/s
                              9 Mb/s; 12 Mb/s; 18 Mb/s; 24 Mb/s; 36 Mb/s
                              48 Mb/s; 54 Mb/s

    Read the article

  • Benefits of Behavior Driven Development

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/07/26/benefits-of-behavior-driven-development.aspx

Continuing my previous article on BDD, I wanted to point out some benefits of BDD, and since BDD is an extension of Test Driven Development (TDD), you get those as well. I'll add another article on some possible downsides of this approach. There are many articles about the benefits of TDD, and they apply to BDD. I've pointed out some here and copied some of the main points from each article, but there are many more, including the book The Art of Unit Testing by Roy Osherove.

http://geekswithblogs.net/leesblog/archive/2008/04/30/the-benefits-of-test-driven-development.aspx (Lee Brandt):
Stability
Accountability
Design Ability
Separated Concerns
Progress Indicator

http://tddftw.com/benefits-of-tdd/:
Help maintainers understand the intention behind the code
Bring validation and proper data handling concerns to the forefront
Writing the tests first is fun
Better APIs come from writing testable code
TDD will make you a better developer

http://www.slideshare.net/dhelper/benefit-from-unit-testing-in-the-real-world (from Typemock). Take a look at the slides, especially the extra time required for TDD (slide 10) and the next one on the bugs avoided using TDD (slide 11):
Fewer bugs (slide 11)
About testing and development (slide 13)
Increase confidence in code (slide 14)
Fearlessly change your code (slide 14)
Document requirements (slide 14) - also see http://visualstudiomagazine.com/articles/2013/06/01/roc-rocks.aspx
Discover usability issues early (slide 14)

All these points and articles are great, and there are many more. The following are my additions to the benefits of BDD, from using it in real projects for my company.

July 2013 on MSDN - Behavior-Driven Design with SpecFlow
Scott Allen did a very informative TDD and MVC module, but to me he is doing BDD: Compile and Execute Requirements in Microsoft .NET ~ Video from TechEd 2012

Communication
I was working through a complicated task where the decision tree kept growing. After writing out the Given, When, Then of the scenario, I was able to tell QA what I had worked through for their initial test cases. They were able to add to it from there. It is also useful to use this language with other developers, managers, or clients to help make informed decisions on whether it meets the requirements or whether it can be simplified to save time (money).

Thinking through solutions before starting to code
This was the biggest benefit to me. I like to jump into coding to figure out the problem. Many times I don't understand my path well enough and have to do some parts over. A past supervisor told me several times during reviews that I need to get better at seeing "the forest for the trees". When I sit down and write out the behavior that I need to implement, I force myself to think things out further and catch scenarios before they get to QA. A co-worker who is new to BDD (we've been using it on our new project for the last 6 months) said "It really clarifies things". It took him a while to understand it all, but now he's seeing the value of this approach (yes, there are some downsides, but that is a different issue).

Developers' confidence
This is huge for me. With tests in place, my confidence grows that I won't break code that I'm not directly changing. In the past, I've worked on projects without tests, and we would frequently find regression bugs (or worse, the users would find them). That isn't fun. We don't catch all problems with the tests, but when QA catches one, I can write a test to make sure it doesn't happen again. It's also good for releasing code, and for telling your manager that it's good to go. As time goes on and the code gets older, how confident are you that checking in code won't break something somewhere else?

Merging code - pre-release confidence
If you're merging code a lot, it's nice to have the tests to help ensure you didn't merge incorrectly.

Interrupted work
I had a task that I started and planned out, then was interrupted for a month because of different priorities. When I started it up again and un-shelved my changes, I had the BDD specs, and they helped me remember what I had figured out and what was left to do. It would have been much more difficult without the specs and tests.

Testing and verifying complicated scenarios
Sometimes in the UI there are scenarios that get tricky, because there are a lot of steps involved (click here to open the dialog, enter the information, make sure it's valid, when I click cancel it should do {x}, when I click ok it should close and do {y}, then do this, etc.). With BDD I can avoid some of the mouse clicking: define the scenarios and have them re-run quickly, without using a mouse. UI testing is still needed, but this helps a bunch. The same can be true for tricky server logic.

Documentation of assumptions and specifications
The BDD spec tests (Jasmine or SpecFlow or another tool) also work as documentation and show what the original developer was trying to accomplish. It's not a separate Word document, so developers will keep it up to date instead of letting it become obsolete. What happens if you leave the project (consulting, new job, etc.) with no specs, or at least good comments in the code? Sometimes I think of a new scenario, so I add a failing spec and continue in the same stream of thought (so I don't forget it because it was on a piece of paper or in a notepad). Then later I can come back, handle it, and have it documented.

Jasmine tests and JavaScript -> help deal with the untyped system
I like JavaScript, but I also dislike working with JavaScript. I miss C# telling me at build time if a property doesn't actually exist. I like the idea of TypeScript and hope to use it more in the future. I also use KnockoutJs, which has observables that need to be called with a trailing (), since the observable is a function. It's hard to remember when to use () or not, and the Jasmine specs/tests help ensure the correct usage.

This should give you an idea of the benefits that I see in using the BDD approach. I'm sure there are more. It takes a lot of practice, investment and experimentation to figure out how to approach this and to get comfortable with it. I agree with Scott Allen in the video I linked above: "Remember that TDD can take some practice. So if you're not doing test-driven design right now? You can start and practice and get better. And you'll reach a point where you'll never want to get back."

    Read the article

  • Oracle HRMS API – Update Employee Assignment

    - by PRajkumar
    To update Supervisor, Manager Flag, Bargaining Unit, Labour Union Member Flag, GRE, Time Card, Work Schedule, Normal Hours, Frequency, Time Normal Finish, Time Normal Start, Default Code Combination, Set of Books Id:
API -- hr_assignment_api.update_emp_asg

To update Grade, Location, Job, Payroll, Organization, Employee Category, People Group:
API -- hr_assignment_api.update_emp_asg_criteria

Example --

DECLARE
   -- Local Variables
   lc_dt_ud_mode        VARCHAR2(100) := NULL;
   ln_assignment_id     NUMBER        := 33561;
   ln_supervisor_id     NUMBER        := 2;
   ln_object_number     NUMBER        := 1;
   ln_people_group_id   NUMBER        := 1;

   -- Out Variables for Find Date Track Mode API
   lb_correction             BOOLEAN;
   lb_update                 BOOLEAN;
   lb_update_override        BOOLEAN;
   lb_update_change_insert   BOOLEAN;

   -- Out Variables for Update Employee Assignment API
   ln_soft_coding_keyflex_id   HR_SOFT_CODING_KEYFLEX.SOFT_CODING_KEYFLEX_ID%TYPE;
   lc_concatenated_segments    VARCHAR2(2000);
   ln_comment_id               PER_ALL_ASSIGNMENTS_F.COMMENT_ID%TYPE;
   lb_no_managers_warning      BOOLEAN;

   -- Out Variables for Update Employee Assignment Criteria API
   ln_special_ceiling_step_id      PER_ALL_ASSIGNMENTS_F.SPECIAL_CEILING_STEP_ID%TYPE;
   lc_group_name                   VARCHAR2(30);
   ld_effective_start_date         PER_ALL_ASSIGNMENTS_F.EFFECTIVE_START_DATE%TYPE;
   ld_effective_end_date           PER_ALL_ASSIGNMENTS_F.EFFECTIVE_END_DATE%TYPE;
   lb_org_now_no_manager_warning   BOOLEAN;
   lb_other_manager_warning        BOOLEAN;
   lb_spp_delete_warning           BOOLEAN;
   lc_entries_changed_warning      VARCHAR2(30);
   lb_tax_district_changed_warn    BOOLEAN;
BEGIN
   -- Find Date Track Mode
   dt_api.find_dt_upd_modes
   ( p_effective_date         => TO_DATE('12-JUN-2011'),
     p_base_table_name        => 'PER_ALL_ASSIGNMENTS_F',
     p_base_key_column        => 'ASSIGNMENT_ID',
     p_base_key_value         => ln_assignment_id,
     -- Output data elements
     p_correction             => lb_correction,
     p_update                 => lb_update,
     p_update_override        => lb_update_override,
     p_update_change_insert   => lb_update_change_insert
   );

   IF ( lb_update_override = TRUE OR lb_update_change_insert = TRUE ) THEN
      lc_dt_ud_mode := 'UPDATE_OVERRIDE';
   END IF;

   IF ( lb_correction = TRUE ) THEN
      lc_dt_ud_mode := 'CORRECTION';
   END IF;

   IF ( lb_update = TRUE ) THEN
      lc_dt_ud_mode := 'UPDATE';
   END IF;

   -- Update Employee Assignment
   hr_assignment_api.update_emp_asg
   ( -- Input data elements
     p_effective_date             => TO_DATE('12-JUN-2011'),
     p_datetrack_update_mode      => lc_dt_ud_mode,
     p_assignment_id              => ln_assignment_id,
     p_supervisor_id              => NULL,
     p_change_reason              => NULL,
     p_manager_flag               => 'N',
     p_bargaining_unit_code       => NULL,
     p_labour_union_member_flag   => NULL,
     p_segment1                   => 204,
     p_segment3                   => 'N',
     p_normal_hours               => 10,
     p_frequency                  => 'W',
     -- Output data elements
     p_object_version_number      => ln_object_number,
     p_soft_coding_keyflex_id     => ln_soft_coding_keyflex_id,
     p_concatenated_segments      => lc_concatenated_segments,
     p_comment_id                 => ln_comment_id,
     p_effective_start_date       => ld_effective_start_date,
     p_effective_end_date         => ld_effective_end_date,
     p_no_managers_warning        => lb_no_managers_warning,
     p_other_manager_warning      => lb_other_manager_warning
   );

   -- Find Date Track Mode for Second API
   dt_api.find_dt_upd_modes
   ( p_effective_date         => TO_DATE('12-JUN-2011'),
     p_base_table_name        => 'PER_ALL_ASSIGNMENTS_F',
     p_base_key_column        => 'ASSIGNMENT_ID',
     p_base_key_value         => ln_assignment_id,
     -- Output data elements
     p_correction             => lb_correction,
     p_update                 => lb_update,
     p_update_override        => lb_update_override,
     p_update_change_insert   => lb_update_change_insert
   );

   IF ( lb_update_override = TRUE OR lb_update_change_insert = TRUE ) THEN
      lc_dt_ud_mode := 'UPDATE_OVERRIDE';
   END IF;

   IF ( lb_correction = TRUE ) THEN
      lc_dt_ud_mode := 'CORRECTION';
   END IF;

   IF ( lb_update = TRUE ) THEN
      lc_dt_ud_mode := 'UPDATE';
   END IF;

   -- Update Employee Assignment Criteria
   hr_assignment_api.update_emp_asg_criteria
   ( -- Input data elements
     p_effective_date                 => TO_DATE('12-JUN-2011'),
     p_datetrack_update_mode          => lc_dt_ud_mode,
     p_assignment_id                  => ln_assignment_id,
     p_location_id                    => 204,
     p_grade_id                       => 29,
     p_job_id                         => 16,
     p_payroll_id                     => 52,
     p_organization_id                => 239,
     p_employment_category            => 'FR',
     -- Output data elements
     p_people_group_id                => ln_people_group_id,
     p_object_version_number          => ln_object_number,
     p_special_ceiling_step_id        => ln_special_ceiling_step_id,
     p_group_name                     => lc_group_name,
     p_effective_start_date           => ld_effective_start_date,
     p_effective_end_date             => ld_effective_end_date,
     p_org_now_no_manager_warning     => lb_org_now_no_manager_warning,
     p_other_manager_warning          => lb_other_manager_warning,
     p_spp_delete_warning             => lb_spp_delete_warning,
     p_entries_changed_warning        => lc_entries_changed_warning,
     p_tax_district_changed_warning   => lb_tax_district_changed_warn
   );

   COMMIT;
EXCEPTION
   WHEN OTHERS THEN
      ROLLBACK;
      dbms_output.put_line(SQLERRM);
END;
/
SHOW ERR;
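
Once the block completes, the new values can be spot-checked with a quick date-tracked query. This is a minimal sketch, reusing the assignment_id (33561) and effective date from the example above:

    SELECT asg.assignment_id,
           asg.object_version_number,
           asg.normal_hours,
           asg.frequency,
           asg.effective_start_date,
           asg.effective_end_date
      FROM per_all_assignments_f asg
     WHERE asg.assignment_id = 33561
       -- pick the date-tracked row in effect on the update date
       AND TO_DATE('12-JUN-2011') BETWEEN asg.effective_start_date
                                      AND asg.effective_end_date;

The object_version_number returned by the query should match the one the APIs handed back, which is why the example passes ln_object_number through both calls.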

    Read the article

  • At times, you need to hire a professional.

    - by Phil Factor
    After months of increasingly demanding toil, the development team I belonged to was told that the project was to be canned and the whole team would be fired. I'd been brought into the team as an expert in the data implications of a business re-engineering of a major financial institution. Nowadays you'd call me a data architect, I suppose. I'd spent a happy year being paid consultancy fees solving a succession of interesting problems, up until the point when the company lost its nerve and closed the entire initiative. The IT industry was in one of its characteristic mood-swings downwards. After the announcement, we met in the canteen. A few developers had scented the smell of death around the project already and had been applying unsuccessfully for jobs. There was a sense of doom in the mass of dishevelled and bleary-eyed developers. After giving vent to anger and despair, talk turned to getting new employment. It was then that I perked up.

I'm not an obvious choice to give advice on getting, or passing, IT interviews. I reckon I've failed most of the job interviews I've ever attended. I once even failed an interview for a job I'd already been doing perfectly well for a year. The jobs I've got have mostly been from personal recommendation. Paradoxically, though, from years as a manager trying to recruit good staff, I know a lot about what IT managers are looking for. I gave an impassioned speech outlining the important factors in getting to an interview.

The most important thing, certainly in my time at work, is the quality of the résumé or CV. I can't even guess at the huge number of CVs (résumés) I've read through, scanning for candidates worth interviewing. Many IT developers find it impossible to describe their career succinctly on two sides of paper. They leave chunks of their life out (were they in prison?), get immersed in detail, put in irrelevancies, describe what was going on at work rather than what they themselves did, exaggerate their importance, criticize their previous employers, aren't aware of the aspects of a role that matter to a potential employer, suffer from shyness and modesty, and lack any sort of organized perspective on their work. There are many ways of failing to write a decent CV. Many developers suffer from the delusion that their worth can be recognized purely from the code that they write, and shy away from anything that seems like self-aggrandizement. No. A résumé must make a good impression, which means presenting the facts about yourself in a clear and positive way. You can't do it yourself. Why not have your résumé professionally written? A good professional CV writer will know the qualities being looked for in a CV and interrogate you to winkle them out. Their job is to make order and sense out of a confused career, to summarize in one page a mass of detail that presents to any recruiter the information that's wanted, and to stand back and describe an accurate summary of your skills and work experience dispassionately, without rancor, pity or modesty. You are no more capable of producing an objective documentation of your career than you are of taking your own appendix out.

My next recommendation was more controversial. This is to have a professional image overhaul, or makeover, followed by a professionally taken photo portrait. I discovered this by accident. It is normal for IT professionals to face impossible deadlines and long working hours by looking more and more like something that had recently blocked a sink. Whilst working in IT, and in a state of personal dishevelment, I'd been offered a role in a high-powered amateur production of an old ex-Broadway show, purely for my singing voice. I was supposed to be the presentable star. When the production team saw me, the air was thick with tension and despair. I was dragged, kicking and protesting, through a succession of desperate grooming, scrubbing, dressing and dieting. I emerged feeling like "That jewelled mass of millinery, That oiled and curled Assyrian bull, Smelling of musk and of insolence." (Tennyson, Maud; A Monodrama (1855), Section VI, stanza 6). I was then photographed by a professional stage photographer. When the photographs were delivered, I was amazed. It wasn't me, but it looked somehow respectable, confident, trustworthy. A while later, when the show had ended, I took the photos and used them for work. They went with the CV to job applications. It did the trick better than I could ever have imagined.

My views went down big with the developers. Old rivalries were put immediately to one side. We voted, with a show of hands, to devote our energies for the entire notice period to getting employable. We had a team sourcing the CV writer, a team organising the makeovers and photographer, and a third team arranging mock interviews. A fourth team determined the best websites and agencies for recruitment, with the help of friends in the trade. Because there were around thirty developers, we were in a good negotiating position. Of the three CV writers we found who lived locally, one proved exceptional. She was an ex-journalist with an eye for detail and years of experience in manipulating language. We tried her skills out on a developer who seemed a hopeless case, and he was called to interview within a week. I was surprised, too, at how many companies were experts at image makeovers. Within the month, we all looked like those weird slick people in the 'Office-tagged' stock photographs who stare keenly and interestedly at PowerPoint slides in sleek chromium-plated high-rise offices. The portraits we used still adorn the entries of many of my ex-colleagues on LinkedIn. After a month's worth of mock interviews and technical Q&A, our stutters, hesitations, evasions and periphrastic circumlocutions were all gone.

There is little more to relate. With the résumés or CVs, mugshots, and schooling in how to pass interviews, we'd all got new and better-paid jobs well before our month's notice had ended. Whilst normally an IT team under the axe is a sad and depressed place to belong to, this wonderful group of people had proved the power of organized group action in turning the experience to advantage. It left us feeling slightly guilty that we were somehow cheating, but I guess we were merely leveling the playing field.

    Read the article

  • I'm a contract developer and I think I'm about to get screwed [closed]

    - by kagaku
    I do contract development on the side. You could say that I'm a contract developer? Considering I've only ever had one client, I'd say that's not exactly the truth - more like I took a side job and needed some extra cash. It started out as a "rebuild our website and we'll pay you $10k" type of project. Once that was complete (a bit over schedule, but certainly not over budget), the company hired me on as a "long term support" contractor. The contract runs from March of this year, expiring on December 31st of this year - 10 months - over which a set payment is to be paid on the 30th of each month. I've been fulfilling my end of the contract on all points: doing server maintenance, application and database changes, doing huge rush changes, and pretty much just going above and beyond. Currently I'm in the middle of development of an iPhone mobile application (PhoneGap based) which is nearing completion (probably 3-4 weeks from submission).

It has not been all peaches and flowers, though. Each and every month when my paycheck comes due, there always seems to be an issue of some sort. These issues did not occur during the initial project, only during the support contract. The actual contract states that my check should be mailed out on the 30th of the month. I have received my check on time approximately once (on time being within about 2-3 days of the 30th). I've received my paycheck as late as the 15th of the next month - over two weeks late. I've put up with it because I need the paycheck. There have been promises and promises of "we'll send it out on time next time! I promise" - only to receive it just as late the next month. When I ask about payment, they give me a vibe like "why are you only worried about money?" - unfortunately, I don't have the luxury of not worrying about money. The last straw was my August payment, which should have been mailed on August 30th. I received it on September 12th. The reason for the delay? "USPS is delaying it man! we sent it out on the 1st!" is the reason I got. When I finally got the check in the mail, the postage on the envelope was marked September 10th - the date it was run through the postage machine. I've been outright lied to, at this point.

I carried on working, because again - I need a paycheck. I orchestrated the move of our application to a new server, developed a bunch of new changes and continued work on the iPhone app. All told, I probably went over my hourly allotment (I'm paid for 40 hours a month; I probably put in at least 50). On Saturday the 1st, I gave the main contact at the company (a company of 3, by the way - this is not some big corporation) a ring and filled him in on the status of my work for the past two weeks. Unusually, I hadn't heard from him since the middle of September. His response was "oh... well, that is nice and uh.. good job. well, we've been talking within the company about things and we've certainly got some decisions ahead of us..." - not verbatim, but you get the idea (I hope?). What I got out of this conversation was that the site is not doing very well (which it's not) and they're considering pulling the plug. Crap, this contract is going to end early - there goes Christmas! Fine, that's alright, no problem. I'll get paid for the last month's work and call it a day.

Unfortunately, I still haven't gotten last month's check, and I'm getting dicked around now. "Oh.. we had problems transferring funds, we'll try and mail it out tomorrow" and "I left a VM with the finance guy, but I can't get ahold of him". So I'm getting the feeling I'm not getting paid for all the work I put in for September. This is obviously a breach of contract, and I am pissed. Thinking irrationally, I considered changing all their passwords and holding their stuff hostage. Before I thought it through (by the way, I am NOT going to do this; I realized it would probably get me in trouble), I went and tried some passwords for our various accounts. Google Apps? Oh, I'm no longer administrator here. GoDaddy? Whoops, invalid password. Disqus? Nope, invalid password here too. Google AdSense / Analytics? Invalid password. Dedicated server account manager? Invalid password. Now, I have the server's root password - I just built the box last week and hadn't had a chance to send the guy the root password. I wasn't in a rush; I manage the server and they never touch it. Now all of a sudden all the passwords except this one are changed; the writing is on the wall - I am out.

Here's the conundrum. I have the root password; they do not. If I give them this password, all the leverage I have is gone, out the door and out of my hands. In the middle of this argument over why I am not getting paid, the guy sends me an email saying "oh by the way, what's the root username and password to the server?". Considering he knows absolutely nothing, I gave him an "admin" account which really has almost no rights. I still have exclusive access to the server; I just don't know where to go from here.

I can hold their data hostage, but I'm almost positive this is the wrong thing to do. I'd consider it blackmail, regardless of whether or not I have gotten paid yet.

I can "break" something on the server and give them the whole "well, if you were paying me I could fix it!" spiel. This works from a "well, he's not holding their stuff hostage" point of view, but what stops them from hiring someone else to just fix the issue at hand? For all I know the guy's nephew is a "l33t hax0r" and can figure it out for free.

I can give in, document as much as I can, and take him to small claims court. This is breach of contract, and I'm not getting paid. I have a case, right?

Does anyone have any experience with this? What can I do? What are my options? I'm broke; I can't afford a lawyer, and I can barely afford not getting this paycheck. My wife doesn't work (I work two jobs so she doesn't have to - we have a 1-year-old) and is already looking at getting a part-time job to cover the bills. Long term we'll be fine, but this has pissed me off beyond belief! Help me out, I'm about to get screwed.

    Read the article

  • Who could ask for more with LESS CSS? (Part 2 of 3 – Setup)

    - by ToStringTheory
    Welcome to part two in my series covering the LESS CSS language. In the first post, I covered the two major CSS precompiled languages - LESS and SASS - to a small extent, iterating over some of the features that you can expect to find in them. In this post, I will go a little further in depth into the setup and execution of using the LESS framework.

Introduction
It really doesn't take too much to get LESS working in your project. The basic workflow is: include the necessary translator in your project, define bundles for the LESS files, add the necessary code to your _Layout.cshtml file, and finally add all your necessary styles to the LESS files! Let's get started…

New Project
Just like all great experiments in Visual Studio, start with File > New Project, and create a new MVC 4 Web Application.

The Base Package
After you have the new project spun up, use the NuGet Package Manager to install the Bundle Transformer: LESS package. This will take care of installing the main translator that we will be using for LESS code (dotless, which is another NuGet package), as well as the core framework for the Bundle Transformer library. The installation will come up with some instructions in a readme file on how to modify your web.config to handle all your *.less requests through the Bundle Transformer, which passes the translating on to dotless.

Where To Put These LESS Files?!
This step isn't really a requirement; however, I find that I don't like how ASP.NET MVC just has a Content directory where they store CSS, content images, CSS images… In my project, I went ahead and created a new directory just for styles - LESS files, CSS files, and images that are only referenced in LESS or CSS. Ignore the MVC directory, as this was my testbed for another project I was working on at the same time. As you can see here, I have:
A top-level directory for images which contains only images used in a page
A top-level directory for scripts
A top-level directory for styles
A few directories for plugins I am using (Colrizr, JQueryUI, Farbtastic)
Multiple *.less files for different functions (I'll go over these in a minute)
I find that this layout offers the best separation of content types.

Bring Out Your Bundles!
The next thing that we need to do is add in the necessary code for the bundling of these LESS files. Go ahead and open your BundleConfig.cs file, usually located in the /App_Start/ folder of the project. As you will see in a minute, instead of using the method Microsoft does in the base MVC 4 project, I change things up a bit.

Define Constants
The first thing I do is define constants for each of the virtual paths that will be used in the bundler. The main reason is that I hate magic strings in my programs, so the fact that you first define a virtual path in the BundleConfig file, and then use that path in the _Layout.cshtml file, really irked me.

Add Bundles to the BundleCollection
Next, I define the bundles for my styles in my AddStyleBundles method. That is all it takes to get all of my styles in play with LESS. The CssTransformer and NullOrderer types come from the Bundle Transformer package we grabbed earlier. If we didn't use that package, we would have to write our own function (not too hard, but why do it if it's been done). I use the site.less file as my main hub for LESS - I will cover that more in the next section.

Add Bundles To _Layout.cshtml File
With the constants in the BundleConfig file, instead of having to use the same magic string I defined for the bundle virtual path, I can reference the constant. Notice that besides the RenderSection magic strings (something I am working on in another side project), all of the bundles are now based on const strings. If I need to change the virtual path, I only have to do it in one place. Nifty!

Get Started!
We are now ready to roll! As I said in the previous section, I use the site.less file as a central hub for my styles. As seen here, I have a reset.css file, which is a simple CSS reset. Next, I have created a file for managing all my color variables - colors.less. Here, you can see some of the standards I started to use, in this case for color variables: I define all color variables with the @col prefix, and currently I am going for verbose variable names. The next file imported is my font.less file, which defines the typeface information for the site. Simple enough - a couple of imports for fonts from Google, and then variable declarations for use throughout LESS. I also set up the heading sizes, margins, etc. You can also see my current standardization for font declaration strings - @font. Next, I pull in a mixins.less file that I grabbed from the Twitter Bootstrap library, which gives some useful parameterized mixins for things such as border-radius, gradient, box-shadow, etc… The common.less file just contains items that I will be defining that can be used across all my LESS files - kind of like my own mixins or font-helpers. Finally, I have my layout.less file, which contains all of my definitions for general site layout - width, main/sidebar widths, footer layout, etc. That's it! For the rest of my one-off definitions/corrections, I am currently putting them into the site.less file beneath my original imports.

Note
Probably my favorite side effect of using the LESS handler/translator while bundling is that it also does a CSS checkup when rendering… See, when your web.config is set to debug, bundling will output the URL to the direct LESS file, not the bundle, and the HTTP handler intercepts the call, compiles the LESS, and returns the result. If there is an error in your LESS code, the CSS file can be returned empty, or may have the error output as a comment on the first couple of lines. If you have the web.config set to not debug, then if there is an error in your code, you will end up with the usual ASP.NET exception page (unless you catch the exception, of course), with information regarding the failure of the conversion, such as a brace mismatch, undefined variable, etc… I find it nifty.

Conclusion
This is really just the beginning. LESS is very powerful and exciting! My next post will show an actual working example of why LESS is so powerful, with its functions and variables… At least I hope it will! As for now, if you have any questions, comments, or suggestions on my current practice, I would love to hear them! Feel free to drop a comment or shoot me an email using the contact page. In the meantime, I plan on posting the final post in this series tomorrow or the day after, with my side project, as well as a whole base ASP.NET MVC4 templated project with LESS added to it so that you can check out the layout I have in this post. Until next time…

    Read the article

  • Understanding ORACLE: PMON Release Lock

    - by Liu Maclean
    Anyone familiar with Oracle internals knows that PMON is the background process responsible for cleaning up dead Oracle processes: it recovers the dead process, releases the enqueue locks it held, and cleans up its latches. What is rarely shown is the step-by-step detail of how PMON cleans up a dead process and releases its locks, so let us trace it. Below, Maclean demonstrates the relevant Oracle behavior (Oracle => MicroOracle):

SQL> select * from v$version;

BANNER
--------------------------------------------------------------------------------
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
PL/SQL Release 11.2.0.3.0 - Production
CORE    11.2.0.3.0      Production
TNS for Linux: Version 11.2.0.3.0 - Production
NLSRTL Version 11.2.0.3.0 - Production

SQL> select * from global_name;

GLOBAL_NAME
--------------------------------------------------------------------------------
www.oracledatabase12g.com

SQL> select pid,program from v$process;

       PID PROGRAM
---------- ------------------------------------------------
         1 PSEUDO
         2 [email protected] (PMON)
         3 [email protected] (PSP0)
         4 [email protected] (VKTM)
         5 [email protected] (GEN0)
         6 [email protected] (DIAG)
         7 [email protected] (DBRM)
         8 [email protected] (PING)
         9 [email protected] (ACMS)
        10 [email protected] (DIA0)
        11 [email protected] (LMON)
        12 [email protected] (LMD0)
        13 [email protected] (LMS0)
        14 [email protected] (RMS0)
        15 [email protected] (LMHB)
        16 [email protected] (MMAN)
        17 [email protected] (DBW0)
        18 [email protected] (LGWR)
        19 [email protected] (CKPT)
        20 [email protected] (SMON)
        21 [email protected] (RECO)
        22 [email protected] (RBAL)
        23 [email protected] (ASMB)
        24 [email protected] (MMON)
        25 [email protected] (MMNL)
        26 [email protected] (MARK)
        27 [email protected] (D000)
        28 [email protected] (SMCO)
        29 [email protected] (S000)
        30 [email protected] (LCK0)
        31 [email protected] (RSMN)
        32 [email protected] (TNS V1-V3)
        33 [email protected] (W000)
        34 [email protected] (TNS V1-V3)
        35 [email protected] (TNS V1-V3)
        37 [email protected] (ARC0)
        38 [email protected] (ARC1)
        40 [email protected] (ARC2)
        41 [email protected] (ARC3)
        43 [email protected] (GTX0)
        44 [email protected] (RCBG)
        46 [email protected] (QMNC)
        47 [email protected] (TNS V1-V3)
        48 [email protected] (TNS V1-V3)
        49 [email protected] (Q000)
        50 [email protected] (Q001)
        51 [email protected] (GCR0)

SQL> drop table maclean;
Table dropped.

SQL> create table maclean(t1 int);
Table created.

SQL> insert into maclean values(1);
1 row created.

SQL> commit;
Commit complete.

For later reference, note these background processes in the listing above: PID=2 PMON, PID=11 LMON, PID=18 LGWR, PID=20 SMON, PID=12 LMD.

Next, set up an "enq: TX - row lock contention" scenario between two sessions; we will then KILL the blocking process and watch PMON recover the dead process and release its TX lock.

PROCESS A:

SQL> select addr,spid,pid from v$process where addr = ( select paddr from v$session where sid=(select distinct sid from v$mystat));

ADDR             SPID                            PID
---------------- ------------------------ ----------
00000000BD516B80 17880                            46

SQL> select distinct sid from v$mystat;

       SID
----------
        22

SQL> update maclean set t1=t1+1;
1 row updated.
PROCESS B:

SQL> select addr,spid,pid from v$process where addr = ( select paddr from v$session where sid=(select distinct sid from v$mystat));

ADDR             SPID                            PID
---------------- ------------------------ ----------
00000000BD515AD0 17908                            45

SQL> update maclean set t1=t1+1;
HANG..............

Process B now hangs on "enq: TX - row lock contention". From a third session (Process C), check the locks Process A holds, set the 10500 event (SMON trace), and enable KST tracing so we can follow PMON's work:

SQL> set linesize 200 pagesize 1400
SQL> select * from v$lock where sid=22;

ADDR             KADDR                   SID TY        ID1        ID2      LMODE    REQUEST      CTIME      BLOCK
---------------- ---------------- ---------- -- ---------- ---------- ---------- ---------- ---------- ----------
00000000BDCD7618 00000000BDCD7670         22 AE        100          0          4          0         48          2
00007F63268A9E28 00007F63268A9E88         22 TM      77902          0          3          0         32          2
00000000B9BB4950 00000000B9BB49C8         22 TX     458765        892          6          0         32          1

Process A holds three enqueue locks: AE, TM and TX.

SQL> alter system switch logfile;
System altered.
SQL> alter system checkpoint;
System altered.
SQL> alter system flush buffer_cache;
System altered.
SQL> alter system set "_trace_events"='10000-10999:255:2,20,33';
System altered.
SQL> ! kill -9 17880

Having killed Process A, Process B's update completes. Now dump the errorstack and KST trace for PMON and for Process B:

SQL> oradebug setorapid 2;
Oracle pid: 2, Unix process pid: 17533, image: [email protected] (PMON)
SQL> oradebug dump errorstack 4;
Statement processed.
SQL> oradebug tracefile_name
/s01/orabase/diag/rdbms/vprod/VPROD1/trace/VPROD1_pmon_17533.trc
SQL> oradebug setorapid 45;
Oracle pid: 45, Unix process pid: 17908, image: [email protected] (TNS V1-V3)
SQL> oradebug dump errorstack 4;
Statement processed.
SQL> oradebug tracefile_name
/s01/orabase/diag/rdbms/vprod/VPROD1/trace/VPROD1_ora_17908.trc

In PMON's trace we find the following
KST TRACE:

2012-05-18 10:37:34.557225 :8001ECE8:db_trace:ktur.c@5692:ktugru(): [10444:2:1] next rollback uba: 0x00000000.0000.00
2012-05-18 10:37:34.557382 :8001ECE9:db_trace:ksl2.c@16009:ksl_update_post_stats(): [10005:2:1] KSL POST SENT postee=18 num=4 loc='ksa2.h LINE:285 ID:ksasnd' id1=0 id2=0 name=   type=0
2012-05-18 10:37:34.557514 :8001ECEA:db_trace:ksq.c@8540:ksqrcli(): [10704:2:1] ksqrcl: release TX-0007000d-0000037c mode=X
2012-05-18 10:37:34.558819 :8001ECF0:db_trace:ksl2.c@16009:ksl_update_post_stats(): [10005:2:1] KSL POST SENT postee=45 num=5 loc='kji.h LINE:3418 ID:kjata: wake up enqueue owner' id1=0 id2=0 name=   type=0
2012-05-18 10:37:34.559047 :8001ECF8:db_trace:ksl2.c@16009:ksl_update_post_stats(): [10005:2:1] KSL POST SENT postee=12 num=6 loc='kjm.h LINE:1224 ID:kjmpost: post lmd' id1=0 id2=0 name=   type=0
2012-05-18 10:37:34.559271 :8001ECFC:db_trace:ksq.c@8826:ksqrcli(): [10704:2:1] ksqrcl: SUCCESS
2012-05-18 10:37:34.559291 :8001ECFD:db_trace:ktu.c@8652:ktudnx(): [10813:2:1] ktudnx: dec cnt xid:7.13.892 nax:0 nbx:0
2012-05-18 10:37:34.559301 :8001ECFE:db_trace:ktur.c@3198:ktuabt(): [10444:2:1] ABORT TRANSACTION - xid: 0x0007.00d.0000037c
2012-05-18 10:37:34.559327 :8001ECFF:db_trace:ksq.c@8540:ksqrcli(): [10704:2:1] ksqrcl: release TM-0001304e-00000000 mode=SX
2012-05-18 10:37:34.559365 :8001ED00:db_trace:ksq.c@8826:ksqrcli(): [10704:2:1] ksqrcl: SUCCESS
2012-05-18 10:37:34.559908 :8001ED01:db_trace:ksq.c@8540:ksqrcli(): [10704:2:1] ksqrcl: release AE-00000064-00000000 mode=S
2012-05-18 10:37:34.559982 :8001ED02:db_trace:ksq.c@8826:ksqrcli(): [10704:2:1] ksqrcl: SUCCESS
2012-05-18 10:37:34.560217 :8001ED03:db_trace:ksfd.c@15379:ksfdfods(): [10298:2:1] ksfdfods:fob=0xbab87b48 aiopend=0
2012-05-18 10:37:34.560336 :GSIPC:kjcs.c@4876:kjcsombdi(): GSIPC:SOD: 0xbc79e0c8 action 3 state 0 chunk (nil) regq 0xbc79e108 batq 0xbc79e118
2012-05-18 10:37:34.560357 :GSIPC:kjcs.c@5293:kjcsombdi(): GSIPC:SOD: exit cleanup for 0xbc79e0c8 rc: 1, loc: 0x303
2012-05-18 10:37:34.560375 :8001ED04:db_trace:kss.c@1414:kssdch(): [10809:2:1] kssdch(0xbd516b80 = process, 3) 1 0 exit
2012-05-18 10:37:34.560939 :8001ED06:db_trace:kmm.c@10578:kmmlrl(): [10257:2:1] KMMLRL: Entering: flg(0x0) rflg(0x4)
2012-05-18 10:37:34.561091 :8001ED07:db_trace:kmm.c@10472:kmmlrl_process_events(): [10257:2:1] KMMLRL: Events: succ(3) wait(0) fail(0)
2012-05-18 10:37:34.561100 :8001ED08:db_trace:kmm.c@11279:kmmlrl(): [10257:2:1] KMMLRL: Reg/update: flg(0x0) rflg(0x4)
2012-05-18 10:37:34.563325 :8001ED0B:db_trace:kmm.c@12511:kmmlrl(): [10257:2:1] KMMLRL: Update: ret(0)
2012-05-18 10:37:34.563335 :8001ED0C:db_trace:kmm.c@12768:kmmlrl(): [10257:2:1] KMMLRL: Exiting: flg(0x0) rflg(0x4)
2012-05-18 10:37:34.563354 :8001ED0D:db_trace:ksl2.c@2598:kslwtbctx(): [10005:2:1] KSL WAIT BEG [pmon timer] 300/0x12c 0/0x0 0/0x0 wait_id=78 seq_num=79 snap_id=1

Reading the trace: PMON releases the dead Process A's TX lock:

ksqrcl: release TX-0007000d-0000037c mode=X

It then posts Process B, waking the waiter so it can acquire the TX lock:

KSL POST SENT postee=45 num=5 loc='kji.h LINE:3418 ID:kjata: wake up enqueue owner' id1=0 id2=0 name=   type=0

And in Process B's own trace we can see the post arriving from PMON:

ksl2.c@14563:ksliwat(): [10005:45:151] KSL POST RCVD poster=2 num=5 loc='kji.h LINE:3418 ID:kjata: wake up enqueue owner' id1=0 id2=0 name=   type=0 fac#=3 posted=0x3 may_be_posted=1
kslwtbctx(): [10005:45:151] KSL WAIT BEG [latch: ges resource hash list] 3162668560/0xbc827e10 91/0x5b 0/0x0 wait_id=14 seq_num=15 snap_id=1
kslwtectx(): [10005:45:151] KSL WAIT END [latch: ges resource hash list]
3162668560/0xbc827e10 91/0x5b 0/0x0 wait_id=14 seq_num=15 snap_id=1

In RAC, PMON also posts LMD (the lock manager daemon), since GES resources need cleaning up as well:

2012-05-18 10:37:34.559047 :8001ECF8:db_trace:ksl2.c@16009:ksl_update_post_stats(): [10005:2:1] KSL POST SENT postee=12 num=6 loc='kjm.h LINE:1224 ID:kjmpost: post lmd' id1=0 id2=0 name=   type=0

The ksqrcl release of the TX lock completes successfully:

ksq.c@8826:ksqrcli(): [10704:2:1] ksqrcl: SUCCESS

PMON then aborts Process A's transaction:

2012-05-18 10:37:34.559291 :8001ECFD:db_trace:ktu.c@8652:ktudnx(): [10813:2:1] ktudnx: dec cnt xid:7.13.892 nax:0 nbx:0
2012-05-18 10:37:34.559301 :8001ECFE:db_trace:ktur.c@3198:ktuabt(): [10444:2:1] ABORT TRANSACTION - xid: 0x0007.00d.0000037c

It releases Process A's TM lock on the MACLEAN table:

ksq.c@8540:ksqrcli(): [10704:2:1] ksqrcl: release TM-0001304e-00000000 mode=SX
ksq.c@8826:ksqrcli(): [10704:2:1] ksqrcl: SUCCESS

It releases Process A's AE (prevents dropping an edition in use) lock:

ksq.c@8540:ksqrcli(): [10704:2:1] ksqrcl: release AE-00000064-00000000 mode=S
ksq.c@8826:ksqrcli(): [10704:2:1] ksqrcl: SUCCESS

It cleans up Process A itself:

kjcs.c@4876:kjcsombdi(): GSIPC:SOD: 0xbc79e0c8 action 3 state 0 chunk (nil) regq 0xbc79e108 batq 0xbc79e118
GSIPC:kjcs.c@5293:kjcsombdi(): GSIPC:SOD: exit cleanup for 0xbc79e0c8 rc: 1, loc: 0x303
kss.c@1414:kssdch(): [10809:2:1] kssdch(0xbd516b80 = process, 3) 1 0 exit

Here 0xbd516b80 is Process A's paddr; the kssdch call deletes the children of the process state object (SO, "KSS: delete children of state obj").

PMON then runs kmmlrl() to update instance goodness for the session drop delta:

kmmlrl(): [10257:2:1] KMMLRL: Entering: flg(0x0) rflg(0x4)
kmmlrl_process_events(): [10257:2:1] KMMLRL: Events: succ(3) wait(0) fail(0)
kmmlrl(): [10257:2:1] KMMLRL: Reg/update: flg(0x0) rflg(0x4)
kmmlrl(): [10257:2:1] KMMLRL: Update: ret(0)
kmmlrl(): [10257:2:1] KMMLRL: Exiting: flg(0x0) rflg(0x4)

With the cleanup done, PMON goes back to its 3-second "pmon timer" wait:

kslwtbctx(): [10005:2:1] KSL WAIT BEG [pmon timer] 300/0x12c 0/0x0 0/0x0 wait_id=78 seq_num=79 snap_id=1

    Read the article

  • SQL Server 2005: Improving performance for thousands of Insert requests. logout-login time = 120ms.

    - by Rad
    Can somebody shed some light on how SQL Server 2005 deals with the many requests issued by a client using ADO.NET 2.0? Below is the shortened output of a SQL Trace. I can see that connection pooling is working (I believe there is only one connection being pooled). What is not clear to me is why we have so many sp_reset_connection calls, i.e. a series of Audit Login, SQL:BatchStarting, RPC:Starting and Audit Logout for each iteration of the for loop below. I can see that there is constant switching between the tempdb and master databases, which leads me to conclude that we lose the context each time the next connection is fetched from the pool based on the ConnectionString argument.

I can see that every 15ms I get 100-200 login/logout events per second (reported at the same time by Profiler), and after another 15ms, again a series of 100-200 login/logouts per second. I need clarification on how this might affect much more complex insert queries in a production environment.

I use Enterprise Library 2006; the code is compiled with VS 2005, and it is a console application that parses a flat file with tens of thousands of rows, grouping parent-child rows. It runs on an application server and executes 2 stored procedures on a remote SQL Server 2005: one inserts a parent record and retrieves the Identity value, which is then used to call the second stored procedure 1, 2 or multiple times (sometimes several thousand), inserting child records. The child table has close to 10 million records with 5-10 indexes, some of them being covering non-clustered indexes. There is a pretty complex insert trigger that copies each inserted detail record to an archive table. All in all, I only get 7 inserts per second, which means it can take 2-4 hours for 50 thousand records.

When I run Profiler on the test server (which is almost equivalent to the production server), I can see that there is about 120ms between the Audit Logout and Audit Login trace entries, which barely gives me a chance to insert about 8 records. So my question is whether there is some way to improve the inserting of records, since the company loads 100 thousand records daily, does daily planning, and has an SLA to fulfill client requests that arrive as flat-file orders; some big files of 10 thousand rows have to be processed (imported) quickly. 4 hours to import 60 thousand records should be reduced to 30 minutes.

I was thinking of using the DataAdapter's BatchSize to send multiple stored procedure calls, SQL bulk inserts to batch multiple inserts from a DataReader or DataTable, or the SSIS fast load. But I don't know how to properly analyze re-indexing and statistics population, and maybe that has to take some time to finish. What is worse is that the company uses the biggest table for reporting and other online processing, so the indexes cannot be dropped. I manage transactions manually by setting a field to a value and doing a transactional update, changing that value to a new value that other applications use to detect committed rows.

Please advise how to approach this problem. For now I am trying to use staging tables with minimal logging, in a separate database and with no indexes, and I will try to do batched (massive) parent-child inserts. I believe the production DB has the simple recovery model, but it could be full recovery. If the DB user being used by my .NET console application has the bulkadmin role, does that mean its bulk inserts are minimally logged? I understand that when a table has a clustered and many non-clustered indexes, inserts are still logged for each row. Connection pooling is working, but with many login/logouts. Why?
for (int i = 1; i <= 10000; i++)
{
    using (SqlConnection conn = new SqlConnection("server=(local);database=master;integrated security=sspi;"))
    {
        conn.Open();
        using (SqlCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText = "use tempdb";
            cmd.ExecuteNonQuery();
        }
    }
}

SQL Server Profiler trace:

Audit Login                             master 2010-01-13 23:18:45.337  1 - Nonpooled
SQL:BatchStarting  use tempdb           master 2010-01-13 23:18:45.337
RPC:Starting       exec sp_reset_conn   tempdb 2010-01-13 23:18:45.337
Audit Logout                            tempdb 2010-01-13 23:18:45.337  2 - Pooled
Audit Login        -- network protocol  master 2010-01-13 23:18:45.383  2 - Pooled
SQL:BatchStarting  use tempdb           master 2010-01-13 23:18:45.383
RPC:Starting       exec sp_reset_conn   tempdb 2010-01-13 23:18:45.383
Audit Logout                            tempdb 2010-01-13 23:18:45.383  2 - Pooled
Audit Login        -- network protocol  master 2010-01-13 23:18:45.383  2 - Pooled
SQL:BatchStarting  use tempdb           master 2010-01-13 23:18:45.383
RPC:Starting       exec sp_reset_conn   tempdb 2010-01-13 23:18:45.383
Audit Logout                            tempdb 2010-01-13 23:18:45.383  2 - Pooled
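Sketching the staging-table/bulk direction the question mentions: SqlBulkCopy (available since .NET 2.0) streams all child rows to the server in one operation over a single open connection, instead of one RPC and one connection reset per row. The database, table and column names below are hypothetical stand-ins, not the poster's schema:

using System.Data;
using System.Data.SqlClient;

class BulkLoadSketch
{
    static void LoadDetails(DataTable detailRows)
    {
        // detailRows: filled from the parsed flat file; its schema must
        // match the (minimally logged, index-free) staging table.
        using (SqlConnection conn = new SqlConnection(
            "server=(local);database=Staging;integrated security=sspi;"))
        {
            conn.Open(); // one connection for the whole load: no login/logout churn
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.OrderDetailStaging";
                bulk.BatchSize = 5000;    // rows per internal batch
                bulk.BulkCopyTimeout = 0; // no timeout for large loads
                bulk.WriteToServer(detailRows);
            }
        }
    }
}

From the staging table, a single set-based INSERT...SELECT into the real parent/child tables can then replace thousands of per-row stored procedure calls.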

    Read the article

  • Scrum Master Stephen Forte Teaches Agile Development, Silverlight and BI at GIDS 2010

    - by rajesh ahuja
    Great Indian Developer Summit 2010 – Gold Standard for India's Software Developer Ecosystem

Bangalore, March 25, 2010: Certified Scrum Master Stephen Forte, the author of several books on application and database development including Programming SQL Server 2008, is coming this summer to India's biggest summit for the developer ecosystem - the Great Indian Developer Summit. At the summit, Stephen will conduct a workshop guaranteed to give attendees a jump start on taking the Certified Scrum Master exam. Scrum is one of the most popular Agile project management and development methods, and it is starting to be adopted at major corporations and on very large projects. After an introduction to the basics of Scrum (project planning and estimation, the Scrum Master, the team, the product owner, burn down, and of course the daily Scrum), Stephen will show many real-world applications of the methodology drawn from his own experience as a Scrum Master. Negotiating with the business, estimation and team dynamics are all discussed, as well as how to use Scrum in small organizations, large enterprise environments and consulting environments. Stephen will also discuss using Scrum with virtual teams and in an off-shoring environment. He will then take a look at the tools used for Agile development, including planning poker, unit testing, and much more.

On 20th April at the GIDS.NET Conference, Stephen will also conduct a series of sessions on Microsoft computing technologies. He will teach how to build data-driven, n-tier Rich Internet Applications (RIA) with Silverlight 4.0. Line-of-business (LOB) applications in Silverlight 4.0 are easy to build once you tap the power of WCF RIA Services, the Silverlight Toolkit, and elevated out-of-browser support. Stephen's demo-centric session will walk you through an example of building a LOB application with Silverlight 4.0. See how Silverlight and WCF RIA Services support domain logic, services, data binding, validation, server-based paging, authentication, authorization and much more. Silverlight 4.0 means business.

Silverlight runs C# and Visual Basic code, so it seems natural that a business application might share some code between the Silverlight client and its ASP.NET Web server. You may want to run some code client-side for interactivity, but re-run that code on the server for security or reliability. This is possible, and there are several techniques you can use to accomplish the goal. In Stephen's second talk, learn about the various techniques and their pros and cons. Some techniques work better in C#, others in VB. Still others are simpler with a little extra tooling or code generation. Any serious Silverlight business application will almost certainly face this issue, and this session gets you going fast.

In the third talk, Stephen will explain how to properly architect and deploy a BI application using a mix of some exciting new tools and some old familiar ones. He will start with a traditional relational, transaction-centric database (OLTP) and explore ways to build a data warehouse (OLAP), looking at the star and snowflake schemas. Next he will look at the process of extraction, transformation, and loading (ETL) of your OLTP data into the data warehouse. Different techniques for ETL will be described and the various tradeoffs discussed. Then he will look at using the warehouse for reporting, drill down, and data analysis in Microsoft Excel's PowerPivot 2010.
The session will round off by showing how to properly build a cube and a data analysis application on top of that cube, concluding with a look at some tools that help with the data visualization process.

Every year, GIDS is a game changer for several thousand IT professionals, providing them with a competitive edge over their peers, enlightening them with bleeding-edge information most useful in their daily jobs, helping them network with world-class experts and visionaries, and providing them with a much-needed thrust in their careers. Attend the Great Indian Developer Summit to gain the information, education and solutions you seek, from post-conference workshops and breakout sessions by expert instructors to keynotes by industry heavyweights, enhanced networking opportunities, and more.

About Great Indian Developer Summit
Great Indian Developer Summit is the gold standard for India's software developer ecosystem for gaining exposure to and evaluating new projects, tools, services, platforms, languages, software and standards. Packed with premium knowledge, action plans and advice from been-there-done-that veterans, creators, and visionaries, the 2010 edition of Great Indian Developer Summit features focused sessions, case studies, workshops and power panels that will transform you into a force to reckon with. Featuring 3 co-located conferences (GIDS.NET, GIDS.Web, GIDS.Java) and an exclusive day of in-depth tutorials, GIDS.Workshops, from 20 April to 24 April at the IISc campus in Bangalore. At GIDS you'll participate in hundreds of sessions encompassing the full range of Microsoft computing, Java, Agile, RIA, Rich Web, open source/standards, languages, frameworks and platforms; practical tutorials that deep-dive into technical skills and best practices; inspirational keynote presentations; an Expo Hall featuring dozens of the latest projects and product activities; engaging networking events; and interaction with the best and brightest speakers from around the world.

For further information on GIDS 2010, please visit the summit on the web: http://www.developersummit.com/

A Saltmarch Media Press Release
E: [email protected]
Ph: +91 80 4005 1000

    Read the article

  • Ajax, Lizard Brain Web Design, JSF, Struts, JavaScript, Mobile Web, Flash, jQuery, GWT, Harmony at I

    - by Kim Won
    Great Indian Developer Summit 2010 – India's Biggest Polyglot Conference and Workshops for IT Software Professionals

Bangalore, April 9, 2010: The GIDS.Web Conference and Workshops has announced the complete program of over 30 sessions on how browser and rich web technologies such as AJAX, DHTML, Mashups, Web 2.0, Enterprise 2.0 technologies, and Rich UI technologies are making money and gaining market share for some of the leading businesses in the world. The GIDS.Web track at the Great Indian Developer Summit takes place on 21 and 23 April 2010 at the Indian Institute of Science in Bangalore. As one of the longest-running independent developer conferences in India, GIDS.Web at the Great Indian Developer Summit 2010 is uniquely positioned to provide a blend of practical, pragmatic and immediately applicable knowledge and a glimpse of the future of technology. During 21 and 23 April 2010, GIDS.Web offers a multi-track conference, workshops, an expo show floor, and networking opportunities.

The first keynote at GIDS.Web is led by the leading Java EE and Ajax developer, speaker, and author Marty Hall. The best of India's Java and RIA programmers have learnt the subject from Marty's seminal books Core Servlets and JavaServer Pages (first and second editions), More Servlets and JavaServer Pages, and Core Web Programming (first and second editions) from Prentice Hall and Sun Microsystems Press. Marty's keynote address is a comparison of approaches to building rich Internet applications with Ajax. Marty says Ajax development is difficult, and there are several fundamentally different strategies for building Ajaxified Web applications. The keynote address will survey the three most important of these approaches: using an Ajax-enabled JavaScript library such as jQuery, Prototype, Scriptaculous, Dojo, or Ext/JS; using a Web framework such as JSF 2.0 or Struts 2 that has integrated Ajax support; and using the Google Web Toolkit (GWT) to build "pure Java" Ajax applications. The talk will compare and contrast these three approaches, discussing the types of applications that fit best with each option. Over the course of the summit Marty will conduct several more sessions on "Choosing an Ajax/JavaScript Toolkit: A Comparison of the Most Popular JavaScript Libraries", "Pure Java Ajax: An Overview of GWT 2.0", "Integrated Ajax Support in JSF 2.0" and "Ajax Support in the Prototype JavaScript Library".

The second keynote, by the head of Adobe's Flash initiative in India, Ramesh Srinivasaraghavan, explores the state of the art in web application development and identifies trends that could transform the way we create and use web applications. The talk explains how the Adobe Flash Platform has fuelled this revolution with an integrated set of technologies for delivering the most compelling applications, content and video to the widest possible audience. The Director of Forum Nokia will explain how cloud computing coupled with mobile applications enables consumers to access powerful services and improved user experiences never before thought possible. IEEE's 2010 President-Elect Sorel Reisman's afternoon address outlines steps to improve the IT profession in India.
Featured talks at GIDS.Web also include:

Web 2.0 Checklist - Deconstructing Modern Websites, Scott Davis
Choosing an Ajax/JavaScript Toolkit: Comparison of Popular JavaScript Libraries, Marty Hall
Lizard Brain Web Design, Scott Davis
Effective Design Processes and Resources for Mobile Web Development, Arabella David
NoSQL: The Shift to a Non-relational World, Nosh Petigara
Open Source Web Debugging Tools, Matthew McCullough
Building Line of Business Applications with Silverlight 4.0, Stephen Forte
Hadoop - Divide and Conquer, Matthew McCullough
Adobe Flash Catalyst for Agile Interaction Design, Harish Sivaramakrishnan
Using jQuery and AJAX to Build Front-ends for ASP.NET and ASP.NET MVC, Pandurang Nayak
First Steps to IT Heaven Through the Cloud. Part II: .WEB, Simone Brunozzi
Building Rich Internet Applications with SL RIA Web Services, Pandurang Nayak
Enriching Cloud Applications with Adobe Flash Platform, Ramesh Srinivasaraghavan
Payments for the Web.future, Khurram Khan and Praveen Alavilli
Longevity of Scalable Systems, Nishad Kamat
Transform yourself into a Mobile App Developer Using Web Run Time, Balagopal K S
Developing Multi Screen Applications on Adobe Flash Platform, Hemanth Sharma
Why Harmony and For Whom?, Himanshu Goyal
IIS Hosting Solution for ASP.net and PHP Web Sites, Nahas Mohammed
Building Pluggable Web applications using Django, Lakshman Prasad
Workshop: The 180-min AJAX and JSON Spike Class, Scott Davis
Workshop: Essence of Functional Programming, Venkat Subramaniam
Workshop: Agile Development, Tools, and Teams and Scrum Certification, Stephen Forte
Workshop: PHP + Adobe Flex = Killer RIA, Shyamprasad P
Workshop: Cloud Computing Boot Camp on the Google App Engine, Matthew McCullough
Workshop: Building Data Centric Applications using Adobe Flex and Java, Prashant Singh
Workshop: Building Your First Amazon App, Simone Brunozzi
Workshop: Windows Azure Deep Dive, Ramaprasanna Chellamuthu
Workshop: Monetizing your Apps with PayPal X Payments Platform, Khurram Khan, Praveen Alavilli
Workshop: User Experience Evaluation Model Walkthrough, Sanna Häiväläinen

Sponsors of Great Indian Developer Summit 2010 include: Platinum sponsors Microsoft, Oracle, Forum Nokia and Adobe; Gold sponsors Intel and SAP; Silver sponsors Quest Software, PayPal, Telerik and AMT.

About Great Indian Developer Summit
Great Indian Developer Summit is the gold standard for India's software developer ecosystem for gaining exposure to and evaluating new projects, tools, services, platforms, languages, software and standards. Packed with premium knowledge, action plans and advice from been-there-done-that veterans, creators, and visionaries, the 2010 edition of Great Indian Developer Summit features focused sessions, case studies, workshops and power panels that will transform you into a force to reckon with. Featuring 3 co-located conferences (GIDS.NET, GIDS.Web, GIDS.Java) and an exclusive day of in-depth tutorials, GIDS.Workshops, from 20 April to 24 April at the IISc campus in Bangalore. At GIDS you'll participate in hundreds of sessions encompassing the full range of Microsoft computing, Java, Agile, RIA, Rich Web, open source/standards, languages, frameworks and platforms; practical tutorials that deep-dive into technical skills and best practices; inspirational keynote presentations; an Expo Hall featuring dozens of the latest projects and product activities; engaging networking events; and interaction with the best and brightest speakers from around the world.
For further information on GIDS 2010, please visit the summit on the web: http://www.developersummit.com/

A Saltmarch Media Press Release
E: [email protected]
Ph: +91 80 4005 1000

    Read the article

  • Remote EJB lookup issue with WebSphere 6.1

    - by marc dauncey
    I've seen this question asked before, but I've tried the various solutions proposed, to no avail. Essentially, I have two EJB enterprise applications that need to communicate with one another. The first is a web application; the second is a search server. They are located on different development servers, not in the same node, cell, or JVM, although they are on the same physical box. I'm doing the JNDI lookup via IIOP, and the URL I am using is as follows:

iiop://searchserver:2819

In my hosts file, I've set searchserver to 127.0.0.1. The ports for my search server are bound to this hostname too. However, when the web app (which uses Spring, btw) attempts to look up the search EJB, it fails with the following error. This is driving me nuts; surely this kind of comms between the servers should be fairly simple to get working. I've checked the ports and they are correct. I note that the exception says the initial context is H00723Node03Cell/nodes/H00723Node03/servers/server1, name: ejb/com/hmv/dataaccess/ejb/hmvsearch/HMVSearchHome. This is the web app's server, NOT the search server. Is this correct? How can I get Spring to use the right context?

[08/06/10 17:14:28:655 BST] 00000028 SystemErr R org.springframework.remoting.RemoteLookupFailureException: Failed to locate remote EJB [ejb/com/hmv/dataaccess/ejb/hmvsearch/HMVSearchHome]; nested exception is javax.naming.NameNotFoundException: Context: H00723Node03Cell/nodes/H00723Node03/servers/server1, name: ejb/com/hmv/dataaccess/ejb/hmvsearch/HMVSearchHome: First component in name hmvsearch/HMVSearchHome not found. [Root exception is org.omg.CosNaming.NamingContextPackage.NotFound: IDL:omg.org/CosNaming/NamingContext/NotFound:1.0]
at org.springframework.ejb.access.SimpleRemoteSlsbInvokerInterceptor.doInvoke(SimpleRemoteSlsbInvokerInterceptor.java:101)
at org.springframework.ejb.access.AbstractRemoteSlsbInvokerInterceptor.invoke(AbstractRemoteSlsbInvokerInterceptor.java:140)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at $Proxy7.doSearchByProductKeywordsForKiosk(Unknown Source)
at com.hmv.web.usecases.search.SearchUC.execute(SearchUC.java:128)
at com.hmv.web.actions.search.SearchAction.executeAction(SearchAction.java:129)
at com.hmv.web.actions.search.KioskSearchAction.executeAction(KioskSearchAction.java:37)
at com.hmv.web.actions.HMVAbstractAction.execute(HMVAbstractAction.java:123)
at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:484)
at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:274)
at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1482)
at com.hmv.web.controller.HMVActionServlet.process(HMVActionServlet.java:149)
at org.apache.struts.action.ActionServlet.doGet(ActionServlet.java:507)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:743)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1282)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1239)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:136)
at com.hmv.web.support.SessionFilter.doFilter(SessionFilter.java:137)
at com.ibm.ws.webcontainer.filter.FilterInstanceWrapper.doFilter(FilterInstanceWrapper.java:142)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:121)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain._doFilter(WebAppFilterChain.java:82)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:670)
at com.ibm.ws.webcontainer.webapp.WebApp.handleRequest(WebApp.java:2933)
at com.ibm.ws.webcontainer.webapp.WebGroup.handleRequest(WebGroup.java:221)
at com.ibm.ws.webcontainer.VirtualHost.handleRequest(VirtualHost.java:210)
at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:1912)
at com.ibm.ws.webcontainer.channel.WCChannelLink.ready(WCChannelLink.java:84)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleDiscrimination(HttpInboundLink.java:472)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleNewInformation(HttpInboundLink.java:411)
at com.ibm.ws.http.channel.inbound.impl.HttpICLReadCallback.complete(HttpICLReadCallback.java:101)
at com.ibm.ws.tcp.channel.impl.WorkQueueManager.requestComplete(WorkQueueManager.java:566)
at com.ibm.ws.tcp.channel.impl.WorkQueueManager.attemptIO(WorkQueueManager.java:619)
at com.ibm.ws.tcp.channel.impl.WorkQueueManager.workerRun(WorkQueueManager.java:952)
at com.ibm.ws.tcp.channel.impl.WorkQueueManager$Worker.run(WorkQueueManager.java:1039)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1462)
Caused by: javax.naming.NameNotFoundException: Context: H00723Node03Cell/nodes/H00723Node03/servers/server1, name: ejb/com/hmv/dataaccess/ejb/hmvsearch/HMVSearchHome: First component in name hmvsearch/HMVSearchHome not found. [Root exception is org.omg.CosNaming.NamingContextPackage.NotFound: IDL:omg.org/CosNaming/NamingContext/NotFound:1.0]
at com.ibm.ws.naming.jndicos.CNContextImpl.processNotFoundException(CNContextImpl.java:4392)
at com.ibm.ws.naming.jndicos.CNContextImpl.doLookup(CNContextImpl.java:1752)
at com.ibm.ws.naming.jndicos.CNContextImpl.doLookup(CNContextImpl.java:1707)
at com.ibm.ws.naming.jndicos.CNContextImpl.lookupExt(CNContextImpl.java:1412)
at com.ibm.ws.naming.jndicos.CNContextImpl.lookup(CNContextImpl.java:1290)
at com.ibm.ws.naming.util.WsnInitCtx.lookup(WsnInitCtx.java:145)
at javax.naming.InitialContext.lookup(InitialContext.java:361)
at org.springframework.jndi.JndiTemplate$1.doInContext(JndiTemplate.java:132)
at org.springframework.jndi.JndiTemplate.execute(JndiTemplate.java:88)
at org.springframework.jndi.JndiTemplate.lookup(JndiTemplate.java:130)
at org.springframework.jndi.JndiTemplate.lookup(JndiTemplate.java:155)
at org.springframework.jndi.JndiLocatorSupport.lookup(JndiLocatorSupport.java:95)
at org.springframework.jndi.JndiObjectLocator.lookup(JndiObjectLocator.java:105)
at org.springframework.ejb.access.AbstractRemoteSlsbInvokerInterceptor.lookup(AbstractRemoteSlsbInvokerInterceptor.java:98)
at org.springframework.ejb.access.AbstractSlsbInvokerInterceptor.getHome(AbstractSlsbInvokerInterceptor.java:143)
at org.springframework.ejb.access.AbstractSlsbInvokerInterceptor.create(AbstractSlsbInvokerInterceptor.java:172)
at org.springframework.ejb.access.AbstractRemoteSlsbInvokerInterceptor.newSessionBeanInstance(AbstractRemoteSlsbInvokerInterceptor.java:226)
at org.springframework.ejb.access.SimpleRemoteSlsbInvokerInterceptor.getSessionBeanInstance(SimpleRemoteSlsbInvokerInterceptor.java:141)
at org.springframework.ejb.access.SimpleRemoteSlsbInvokerInterceptor.doInvoke(SimpleRemoteSlsbInvokerInterceptor.java:97)
... 36 more

Many thanks for any assistance!
Marc

    Read the article

  • Architecture choice about representation of collections in Business Objects

    - by Rajarshi
    I have made certain choices in my architecture which I request the community to review and comment on. I am breaking the post up into smaller sections to make it easier to understand the context and then suggest/comment. I am sorry that the post is long, but it is required to explain the context.

What am I building
A typical business application where there are application users, security roles, business operation/action rights based on roles, and several business modules like Stock Receive, Stock Transfer, Sale Order, Sale Invoice, Sale Return, Stock Audit etc., as well as several reports. The application is a WinForms application, since it has a lot of rich and responsive UI requirements and has to operate in disconnected mode (with a local SQL Server) most of the time.

What have I done
I have built a framework - nothing to boast about, but just a set of libraries that serves the repetitive requirements of my application, e.g. authentication, role-based authorization, data access, validation, exception handling, logging, change status tracking, presentation model compliance, and reasonable loose coupling between components. No, I have not written everything from scratch; you could say I have consolidated many things together, like some concepts from CSLA, Martin Fowler for the Presentation Model, blocks from Enterprise Library, Unity etc., to build a set of libraries that will help my developers be productive quickly without having to search Google for many of the technical requirements. I have tried to keep the framework generic so that it can be used in typical business applications, and I have also tried to follow some best practices that allow the same Business Objects to be used in an ASP.NET MVC environment. My present architecture serves my objectives well, and I have built several modules (on WinForms) without much trouble. The architecture also lent itself well to building a usable prototype on ASP.NET MVC with the same set of business objects, without changing a single line of code.

My Dilemma
I have used custom Business Objects, since that gives me a clearer OOP representation of the problem scope in my solution scope, and helps me visualize my entire solution as a collection of objects with data and behavior, rather than having a set of relational data (DataSets) and implementing behaviors (business logic, validation) etc. separately. With the rich databinding support in .NET 2.0, binding custom Business Objects to the UI was a breeze.

Now, while building my business objects, I am still in a dilemma about the representation of collections in business objects. Currently I am using DataSets to represent collections, while I have seen many suggestions to implement custom collections. For example, in my vision, a typical Sale Invoice object will contain 'Sale Invoice Items' as a collection. Now theoretically, I can accept that each 'Sale Invoice Item' should have its own behavior along with its data (ItemCode, Name, Qty, Price etc.), but typically the management of Sale Invoice Items in a Sale Invoice is handled by the Sale Invoice object itself, e.g. adding/removing items from the collection. Additionally, we can also put business logic/rules for the Sale Invoice Items, like "Qty should not be greater than the ordered qty" or "Price should be at most 10% above the price in the Sale Order", in the Sale Invoice object itself.
With that kind of vision, I felt that most business-object child collections can be managed by the parent itself, including adding/removing items and implementing the business logic for the collection items; hence the collection items hold nothing but data. Additionally, typical collections are represented in the UI in grids, where the ability to support databinding becomes very important for any collection. Implementing a custom collection, in that case, would also mean I have to implement robust databinding support for the collection as well, which is of course time consuming.

Now, considering that child collection behaviors are implemented in the parent, and the need for databinding of child collections, I chose DataSets to represent any child collection in my business objects. In the above example of a Sale Invoice, I will have 'Invoice Number', 'Date', 'Customer' etc. as attributes of the 'Sale Invoice', but 'InvoiceItems' as a DataSet. Of course, when I say DataSet, it is not a vanilla DataSet but an extended DataSet that supports business rule validation and the same role-based security model of my framework, to allow/deny any business operation on rows/columns of the DataSet automatically. This approach has allowed easier collection management and databinding in my business objects, and my developers are able to deliver modules rapidly.

Questions
Do you feel that the approach is reasonable? Do you see any shortcomings in this approach? I am recently thinking of using 'Typed DataSets' as child collections, for easier representation in code, which would allow me to write 'currentInvoice.InvoiceItems' (for the DataTable) and 'invoiceItem.ProductCode' or 'invoiceItem.Qty', instead of 'drow["ProductCode"].ToString()' or '(int)drow["Qty"]' etc. Does this choice have any demerits? Thank you if you have read this far, and a salute if you still have the energy to answer.
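For contrast with the DataSet approach described above, here is a rough sketch of the custom-collection alternative being weighed: a BindingList<T> already provides grid databinding, while the parent keeps the add/validate behavior. All type names and the rule shown are illustrative only, not the poster's actual design:

using System;
using System.ComponentModel;

// Child items hold nothing but data, as in the DataSet approach.
public class InvoiceItem
{
    public string ProductCode { get; set; }
    public int Qty { get; set; }
    public decimal Price { get; set; }
}

public class SaleInvoice
{
    private readonly BindingList<InvoiceItem> items = new BindingList<InvoiceItem>();

    // Grids can bind to this directly; BindingList raises the change events.
    public BindingList<InvoiceItem> InvoiceItems
    {
        get { return items; }
    }

    // The collection rule stays in the parent, exactly as argued above.
    public void AddItem(InvoiceItem item, int orderedQty)
    {
        if (item.Qty > orderedQty)
            throw new InvalidOperationException(
                "Qty should not be greater than the ordered qty.");
        items.Add(item);
    }
}

The trade-off remains: this buys typed member access (invoiceItem.Qty) at the cost of re-implementing the validation and role-based security plumbing the extended DataSet already provides.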

    Read the article

  • What is the MVC version of this code?

    - by Ian Boyd
    i'm trying to wrap my head around how to enterprise up my code: taking a simple routine and splitting it up into 5 or 6 methods in 3 or 4 classes. i quickly came up with three simple examples of how i currently write code. Could someone please convert these into an MVC/MVP obfuscated version?

Example 1: The last name is mandatory. Color the text box red if nothing is entered. Color it green if stuff is entered:

private void txtLastname_TextChanged(object sender, EventArgs e)
{
   //Lastname mandatory.
   //Color pinkish if nothing entered. Greenish if entered.
   if (txtLastname.Text.Trim() == "")
   {
      //Lastname is required, color pinkish
      txtLastname.BackColor = ControlBad;
   }
   else
   {
      //Lastname entered, remove the coloring
      txtLastname.BackColor = ControlGood;
   }
}

Example 2: The first name is optional, but try to get it. We'll add a bluish tint to this "try to get" field:

private void txtFirstname_TextChanged(object sender, EventArgs e)
{
   //Firstname can be blank.
   //Hint them that they should *try* to get it with a bluish color.
   //If they do enter stuff: it better be not all spaces.
   if (txtFirstname.Text == "")
   {
      //Nothing there, hint it blue
      txtFirstname.BackColor = ControlRequired;
   }
   else if (txtFirstname.Text.Trim() == "")
   {
      //They entered spaces - bad user!
      txtFirstname.BackColor = ControlBad;
   }
   else
   {
      //Entered stuff, remove coloring
      txtFirstname.BackColor = SystemColors.Window;
   }
}

Example 3: The age is totally optional. If an age is entered, it better be valid:

private void txtAge_TextChanged(object sender, EventArgs e)
{
   //Age is optional, but if entered it better be valid
   int nAge = 0;
   if (Int32.TryParse(txtAge.Text, out nAge))
   {
      //Valid integer entered
      if (nAge < 0)
      {
         //Negative age? i don't think so
         txtAge.BackColor = ControlBad;
      }
      else
      {
         //Valid age entered, remove coloring
         txtAge.BackColor = SystemColors.Window;
      }
   }
   else
   {
      //Whatever is in there: it's *not* a valid integer
      if (txtAge.Text == "")
      {
         //Blank is okay
         txtAge.BackColor = SystemColors.Window;
      }
      else
      {
         //Not a valid age, bad user
         txtAge.BackColor = ControlBad;
      }
   }
}

Every time i see MVC code, it looks almost like random splitting of code into different methods, classes, and files. i've not been able to determine a reason or pattern to the madness. Without any understanding of why it's being done some way, it makes no sense. And using the words model, view, controller and presenter, like i'm supposed to know what that means, doesn't help. The model is your data. The view shows data on screen. The controller is used to carry out the user's actions. And oranges taste orangy. Here's my attempt at splitting things up in order to make the code more difficult to follow. Is this anywhere close to MVC?

private void txtFirstname_TextChanged(object sender, EventArgs e)
{
   FirstnameTextChangedHandler(sender, e);
}

private void FirstnameTextChangedHandler(object sender, EventArgs e)
{
   string firstname = GetFirstname();
   Color firstnameTextBoxColor = GetFirstnameTextBoxColor(firstname);
   SetFirstNameTextBoxColor(firstnameTextBoxColor);
}

private string GetFirstname()
{
   return txtFirstname.Text;
}

private Color GetFirstnameTextBoxColor(string firstname)
{
   //Firstname can be blank.
   //Hint them that they should *try* to get it with a bluish color.
   //If they do enter stuff: it better be not all spaces.
   if (firstname == "")
   {
      //Nothing there, hint it blue
      return GetControlRequiredColor();
   }
   else if (firstname.Trim() == "")
   {
      //They entered spaces - bad user!
      return GetControlBadColor();
   }
   else
   {
      //Entered stuff, remove coloring
      return GetControlDefaultColor();
   }
}

private Color GetControlRequiredColor() { return ControlRequired; }
private Color GetControlBadColor() { return ControlBad; }
private Color GetControlGoodColor() { return ControlGood; }
//GetControlDefaultColor is referenced above but was never defined in the
//original snippet; presumably something like this was intended:
private Color GetControlDefaultColor() { return SystemColors.Window; }

//am i doin it rite

i've obfuscated the code, but it's still all together. The next step in the MVC obfuscation, i gather, is to hide the code in 3 or 4 different files. It's that next step that i don't understand. What is the logical separation of which functions are moved into what other classes? Can someone translate my 3 simple examples above into full-fledged MVC obfuscation?

Edit: Not ASP/ASP.NET/Online. Pretend it's on a desktop, handheld, surface, kiosk. And pretend it's language agnostic.
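Not an authoritative answer, but to make the "which code goes where" split concrete, here is one possible Passive View / MVP arrangement of Example 1; the three pieces would normally live in three separate files. The interface and class names, the color choices, and the txtLastname designer control are assumptions of this sketch, not the one true MVC:

using System;
using System.Drawing;
using System.Windows.Forms;

//File 1 - the View contract: the form exposes state, but holds no rules.
public interface IPersonView
{
    string Lastname { get; }
    Color LastnameBackColor { set; }
}

//File 2 - the Presenter: owns the rule "last name is mandatory".
public class PersonPresenter
{
    private readonly IPersonView view;

    public PersonPresenter(IPersonView view) { this.view = view; }

    public void LastnameChanged()
    {
        bool entered = view.Lastname.Trim() != "";
        //Greenish if entered, pinkish if required and missing
        view.LastnameBackColor = entered ? Color.LightGreen : Color.LightPink;
    }
}

//File 3 - the Form: implements the view and forwards events; no logic here.
public partial class PersonForm : Form, IPersonView
{
    private readonly PersonPresenter presenter;

    public PersonForm()
    {
        InitializeComponent(); //assumes the usual designer file with txtLastname
        presenter = new PersonPresenter(this);
    }

    public string Lastname { get { return txtLastname.Text; } }
    public Color LastnameBackColor { set { txtLastname.BackColor = value; } }

    private void txtLastname_TextChanged(object sender, EventArgs e)
    {
        presenter.LastnameChanged();
    }
}

The point of the split is testability: the presenter can be unit-tested against a fake IPersonView without ever creating a real TextBox.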

    Read the article

  • How to fix Solr - Server is shutting down issue?

    - by Krunal
    I have a Solr 4.1 instance running on Windows Server 2008 R2, deployed on Tomcat. However, today it stopped suddenly, and accessing Solr now gives the following error:

HTTP Status 503 - Server is shutting down
type: Status report
message: Server is shutting down
description: The requested service is not currently available.

Looking further into the logs, we found the following.

Log file: tomcat7-stderr.2013-05-09.txt

May 09, 2013 8:00:40 PM org.apache.solr.core.CoreContainer finalize
SEVERE: CoreContainer was not shutdown prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!! instance=2221663

Log file: catalina.2013-05-09.txt

May 09, 2013 7:59:25 PM org.apache.solr.core.SolrResourceLoader <init>
INFO: new SolrResourceLoader for directory: 'c:\solrdir\'
May 09, 2013 7:59:29 PM org.apache.solr.common.SolrException log
SEVERE: Exception during parsing file: null:org.xml.sax.SAXParseException; systemId: file:/c:/solr/solr.xml; lineNumber: 2; columnNumber: 6; The processing instruction target matching "[xX][mM][lL]" is not allowed.
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.fatalError(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLScanner.reportFatalError(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLScanner.scanPIData(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanPIData(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLScanner.scanPI(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(Unknown Source)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(Unknown Source)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(Unknown Source)
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(Unknown Source)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(Unknown Source)
at org.apache.solr.core.Config.<init>(Config.java:121)
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:428)
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:404)
at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:336)
at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:98)
at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:281)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:262)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:107)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4656)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5309)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:633)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:977)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1655)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
May 09, 2013 7:59:29 PM org.apache.solr.servlet.SolrDispatchFilter init
SEVERE: Could not start Solr. Check solr/home property and the logs
May 09, 2013 7:59:29 PM org.apache.solr.common.SolrException log
SEVERE: null:org.apache.solr.common.SolrException:
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:431)
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:404)
at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:336)
at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:98)
at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:281)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:262)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:107)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4656)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5309)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:633)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:977)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1655)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: org.xml.sax.SAXParseException; systemId: file:/c:/solrdir/solr.xml; lineNumber: 2; columnNumber: 6; The processing instruction target matching "[xX][mM][lL]" is not allowed.
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.fatalError(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLScanner.reportFatalError(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLScanner.scanPIData(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanPIData(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLScanner.scanPI(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(Unknown Source)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(Unknown Source)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(Unknown Source)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(Unknown Source)
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(Unknown Source)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(Unknown Source)
at org.apache.solr.core.Config.<init>(Config.java:121)
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:428)
... 20 more
May 09, 2013 7:59:29 PM org.apache.solr.servlet.SolrDispatchFilter init
INFO: SolrDispatchFilter.init() done
May 09, 2013 7:59:29 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory C:\Program Files (x86)\Apache Software Foundation\Tomcat 7.0\webapps\docs
May 09, 2013 7:59:30 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory C:\Program Files (x86)\Apache Software Foundation\Tomcat 7.0\webapps\manager
May 09, 2013 7:59:30 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory C:\Program Files (x86)\Apache Software Foundation\Tomcat 7.0\webapps\ROOT
May 09, 2013 7:59:30 PM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["http-bio-8983"]
May 09, 2013 7:59:30 PM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["ajp-bio-8009"]
May 09, 2013 7:59:30 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 9578 ms
May 09, 2013 8:00:40 PM org.apache.solr.core.CoreContainer finalize
SEVERE: CoreContainer was not shutdown prior to finalize(), indicates a bug -- POSSIBLE RESOURCE LEAK!!! instance=2221663

Any idea what could be wrong and how to fix?
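For what it is worth, that Xerces error ("The processing instruction target matching "[xX][mM][lL]" is not allowed", reported at line 2, column 6) means the parser met the <?xml ...?> declaration somewhere other than the very start of solr.xml; the usual culprits are a UTF-8 byte order mark or stray characters/a blank line before the declaration. A quick way to inspect the first bytes of the file (the path is taken from the logs above; C# is used here only for illustration):

using System;
using System.IO;

class XmlDeclCheck
{
    static void Main()
    {
        // Path from the poster's logs; adjust as needed.
        byte[] bytes = File.ReadAllBytes(@"c:\solr\solr.xml");
        int n = Math.Min(bytes.Length, 8);

        // A healthy file starts with 3C-3F-78-6D-6C ('<?xml') at byte 0.
        // EF-BB-BF first is a UTF-8 BOM; 0D-0A or 20 first is a blank
        // line or spaces - either will trigger exactly this parse error.
        Console.WriteLine(BitConverter.ToString(bytes, 0, n));
    }
}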

    Read the article

  • apache2 how to trace caller of SIGTERM

    - by art vanderlay
    I have a Debian x64 guest on a VirtualBox Win7 Pro host. My apache2 will stop responding after a page request or other activity, such as an upload via FTP. The php.cgi becomes non-responsive and a restart is required. Any help tracking down the culprit sending the SIGTERM would be much appreciated. thx, Art

My apache2.conf has:

<IfModule mpm_prefork_module>
    ServerLimit          1024
    StartServers           10
    MinSpareServers        10
    MaxSpareServers        20
    MaxClients           1024
    MaxRequestsPerChild     0
</IfModule>

From the apache2 log I have:

[Wed Jun 20 05:07:01 2012] [notice] caught SIGTERM, shutting down
[Wed Jun 20 05:07:03 2012] [notice] FastCGI: process manager initialized (pid 4369)
[Wed Jun 20 05:07:03 2012] [notice] Apache/2.2.16 (Debian) mod_fastcgi/2.4.6 PHP/5.3.3-7+squeeze13 with Suhosin-Patch mod_perl/2.0.4 Perl/v5.10.1 configured -- resuming normal operations

and from the accounting output with lastcomm:

php.cgi           www-data __       0.13 secs Wed Jun 20 04:49
lastcomm          root     pts/2    0.10 secs Wed Jun 20 04:49
php.cgi           www-data __       0.18 secs Wed Jun 20 04:49
php.cgi           www-data __       0.18 secs Wed Jun 20 04:47
apache2           root     pts/1    0.02 secs Wed Jun 20 04:46
tput              root     pts/1    0.00 secs Wed Jun 20 04:46
apache2        F  root     pts/1    0.00 secs Wed Jun 20 04:46
apache2ctl        root     pts/1    0.00 secs Wed Jun 20 04:46
apache2        S  root     pts/1    0.77 secs Wed Jun 20 04:46
rm                root     pts/1    0.01 secs Wed Jun 20 04:46
install           root     pts/1    0.01 secs Wed Jun 20 04:46
mkdir             root     pts/1    0.00 secs Wed Jun 20 04:46
apache2ctl     F  root     pts/1    0.00 secs Wed Jun 20 04:46
sleep             root     pts/1    0.00 secs Wed Jun 20 04:46
apache2        SF root     __       0.54 secs Wed Jun 20 04:34
apache2        SF www-data __       0.14 secs Wed Jun 20 04:34
apache2        SF www-data __       0.07 secs Wed Jun 20 04:34
apache2        SF www-data __       0.06 secs Wed Jun 20 04:36
apache2        SF www-data __       0.07 secs Wed Jun 20 04:34
apache2        SF www-data __       0.11 secs Wed Jun 20 04:34
apache2        SF www-data __       0.02 secs Wed Jun 20 04:34
apache2        SF www-data __       0.04 secs Wed Jun 20 04:34
apache2        SF www-data __       0.06 secs Wed Jun 20 04:34
apache2        SF www-data __       0.08 secs Wed Jun 20 04:34
apache2        SF www-data __       0.03 secs Wed Jun 20 04:34
apache2        SF www-data __       0.02 secs Wed Jun 20 04:34
apache2        SF www-data __       0.01 secs Wed Jun 20 04:34
grep              root     pts/1    0.00 secs Wed Jun 20 04:46
apache2ctl        root     pts/1    0.02 secs Wed Jun 20 04:46
apache2           root     pts/1    0.24 secs Wed Jun 20 04:46
apache2        SF www-data __       0.00 secs Wed Jun 20 04:34
apache2ctl     F  root     pts/1    0.00 secs Wed Jun 20 04:46
apache2ctl        root     pts/1    0.00 secs Wed Jun 20 04:46
apache2           root     pts/1    0.22 secs Wed Jun 20 04:46
apache2ctl     F  root     pts/1    0.01 secs Wed Jun 20 04:46
apache2        F  root     pts/1    0.00 secs Wed Jun 20 04:46
grep              root     pts/1    0.00 secs Wed Jun 20 04:46
tr                root     pts/1    0.00 secs Wed Jun 20 04:46
pidof          S  root     pts/1    0.11 secs Wed Jun 20 04:46
cat               root     pts/1    0.00 secs Wed Jun 20 04:46
apache2        F  root     pts/1    0.00 secs Wed Jun 20 04:46
grep              root     pts/1    0.00 secs Wed Jun 20 04:46
tr                root     pts/1    0.00 secs Wed Jun 20 04:46
pidof          S  root     pts/1    0.05 secs Wed Jun 20 04:46
cat               root     pts/1    0.01 secs Wed Jun 20 04:46
apache2        F  root     pts/1    0.00 secs Wed Jun 20 04:46
apache2ctl        root     pts/1    0.00 secs Wed Jun 20 04:46
apache2           root     pts/1    0.34 secs Wed Jun 20 04:46
apache2ctl     F  root     pts/1    0.00 secs Wed Jun 20 04:46
apache2        F  root     pts/1    0.00 secs Wed Jun 20 04:46
apache2        F  root     pts/1    0.00 secs Wed Jun 20 04:46
smbd           SF root     __       0.25 secs Wed Jun 20 04:46
php.cgi           www-data __       0.14 secs Wed Jun 20 04:45
php.cgi           www-data __       0.19 secs Wed Jun 20 04:42
cron           SF root     __       0.02 secs Wed Jun 20 04:39
sh             S  root     __       0.00 secs Wed Jun 20 04:39
find              root     __       0.00 secs Wed Jun 20 04:39
maxlifetime       root     __       0.02 secs Wed Jun 20 04:39
php5              root     __       0.13 secs Wed Jun 20 04:39
which             root     __       0.00 secs Wed Jun 20 04:39
exim4          S  root     __       0.01 secs Wed Jun 20 04:37
php.cgi           www-data __       0.04 secs Wed Jun 20 04:36
php.cgi           www-data __       0.12 secs Wed Jun 20 04:35
php.cgi           www-data __       0.11 secs Wed Jun 20 04:35
php.cgi           www-data __       0.14 secs Wed Jun 20 04:34
lastcomm          root     pts/2    0.09 secs Wed Jun 20 04:34
apache2           root     pts/1    0.02 secs Wed Jun 20 04:34
tput              root     pts/1    0.00 secs Wed Jun 20 04:34
apache2        F  root     pts/1    0.00 secs Wed Jun 20 04:34
apache2ctl        root     pts/1    0.00 secs Wed Jun 20 04:34
apache2        S  root     pts/1    0.54 secs Wed Jun 20 04:34
rm                root     pts/1    0.00 secs Wed Jun 20 04:34
install           root     pts/1    0.00 secs Wed Jun 20 04:34
mkdir             root     pts/1    0.00 secs Wed Jun 20 04:34
apache2ctl     F  root     pts/1    0.00 secs Wed Jun 20 04:34
sleep             root     pts/1    0.00 secs Wed Jun 20 04:34
apache2        SF root     __       0.80 secs Wed Jun 20 03:58
sleep             root     pts/1    0.00 secs Wed Jun 20 04:34
apache2        SF www-data __       0.26 secs Wed Jun 20 03:58
apache2        SF www-data __       0.12 secs Wed Jun 20 03:59
apache2        SF www-data __       0.13 secs Wed Jun 20 03:58
apache2        SF www-data __       0.13 secs Wed Jun 20 03:59
apache2        SF www-data __       0.15 secs Wed Jun 20 03:58
apache2        SF www-data __       0.18 secs Wed Jun 20 03:58
apache2        SF www-data __       0.07 secs Wed Jun 20 04:21
apache2        SF www-data __       0.18 secs Wed Jun 20 03:58
apache2        SF www-data __       0.17 secs Wed Jun 20 03:58
apache2        SF www-data __       0.30 secs Wed Jun 20 03:58
apache2        SF www-data __       0.09 secs Wed Jun 20 03:58
apache2        SF www-data __       0.02 secs Wed Jun 20 04:13

    Read the article
