Search Results

Search found 19555 results on 783 pages for 'job performance'.


  • The C++ Standard Template Library as a BDB Database (part 1)

    - by Gregory Burd
    If you've used C++ you have undoubtedly used the Standard Template Library (STL). Designed for in-memory management of data and collections of data, it is a core part of most C++ programs. Berkeley DB is a database library with a variety of APIs designed to ease development; one of those APIs extends and makes use of the STL for persistent, transactional data storage. dbstl is an STL-compatible API for Berkeley DB. You can use Berkeley DB through this API as if you were using C++ STL classes, and still make full use of Berkeley DB features. Being an STL library backed by a database, dbstl can provide some important and useful features that the C++ STL library can't. The following are a few typical use cases for the dbstl extensions to the C++ STL for data storage. When data exceeds available physical memory: Berkeley DB dbstl can vastly improve performance when managing a dataset which is larger than available memory. Performance suffers when the data can't reside in memory because the OS is forced to use virtual memory and swap pages of memory to disk. Switching to BDB's dbstl improves performance while allowing you to keep using STL containers. When you need concurrent access to C++ STL containers: few existing C++ STL implementations support concurrent access (create/read/update/delete) within a container; at best you'll find support for accessing different containers of the same type concurrently. With the Berkeley DB dbstl implementation you can concurrently access your data from multiple threads or processes with confidence in the outcome. When your objects are your database: you want object persistence in your application, you store objects in a database, and you use those objects across different runs of your application without having to translate them to/from SQL. dbstl is capable of storing complicated objects, even those not located in a contiguous chunk of memory, directly to disk without any unnecessary overhead. These are a few reasons why you should consider using Berkeley DB's C++ STL support for your embedded database application. In the next few blog posts I'll show you a few examples of this approach; it's easy to use and easy to learn.
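    To make this concrete, here is a minimal sketch of what dbstl usage can look like. Treat it as an illustration rather than a reference: the header name, the dbstl_startup/dbstl_exit calls, and the default-constructed db_map (which, as I recall from the dbstl documentation, is backed by an anonymous database rather than a named database file) should be verified against your Berkeley DB release.

        // Minimal dbstl sketch: a std::map-like container backed by Berkeley DB.
        #include <string>
        #include <iostream>
        #include "dbstl_map.h"  // ships with Berkeley DB when built with STL support

        int main() {
            dbstl::dbstl_startup();                  // initialize the dbstl runtime

            // Default construction uses an anonymous database; pass a Db*/DbEnv*
            // to persist the container in a named, transactional database file.
            dbstl::db_map<int, std::string> scores;

            scores[1] = "first";                     // same interface as std::map
            scores[2] = "second";

            std::string v = scores[1];               // reads back through Berkeley DB
            std::cout << "entries: " << scores.size() << ", scores[1] = " << v << "\n";

            dbstl::dbstl_exit();                     // flush and release dbstl resources
            return 0;
        }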

    Read the article

  • CodePlex Daily Summary for Thursday, December 09, 2010

    CodePlex Daily Summary for Thursday, December 09, 2010Popular ReleasesAutoLoL: AutoLoL v1.4.3: AutoLoL now supports importing the build pages from Mobafire.com as well! Just insert the url to the build and voila. (For example: http://www.mobafire.com/league-of-legends/build/unforgivens-guide-how-to-build-a-successful-mordekaiser-24061) Stable release of AutoChat (It is still recommended to use with caution and to read the documentation) It is now possible to associate *.lolm files with AutoLoL to quickly open them The selected spells are now displayed in the masteries tab for qu...SubtitleTools: SubtitleTools 1.2: - Added auto insertion of RLE (RIGHT-TO-LEFT EMBEDDING) Unicode character for the RTL languages. - Fixed delete rows issue.PHP Manager for IIS: PHP Manager 1.1 for IIS 7: This is a final stable release of PHP Manager 1.1 for IIS 7. This is a minor incremental release that contains all the functionality available in 53121 plus additional features listed below: Improved detection logic for existing PHP installations. Now PHP Manager detects the location to php.ini file in accordance to the PHP specifications Configuring date.timezone. PHP Manager can automatically set the date.timezone directive which is required to be set starting from PHP 5.3 Ability to ...Algorithmia: Algorithmia 1.1: Algorithmia v1.1, released on December 8th, 2010.SuperSocket, an extensible socket application framework: SuperSocket 1.0 SP1: Fixed bugs: fixed a potential bug that the running state hadn't been updated after socket server stopped fixed a synchronization issue when clearing timeout session fixed a bug in ArraySegmentList fixed a bug on getting configuration valueCslaGenFork: CslaGenFork 4.0 CTP 2: The version is 4.0.1 CTP2 and was released 2010 December 7 and includes the following files: CslaGenFork 4.0.1-2010-12-07 Setup.msi Templates-2010-10-07.zip For getting started instructions, refer to How to section. Overview of the changes Since CTP1 there were 53 work items closed (28 features, 24 issues and 1 task). During this 60 days a lot of work has been done on several areas. First the stereotypes: EditableRoot is OK EditableChild is OK EditableRootCollection is OK Editable...Windows Workflow Foundation on Codeplex: WF AppFabric Caching Activity Pack 0.1: This release includes a set of AppFabric Caching Activities that allow you to use Windows Server AppFabric Caching with WF4. Video endpoint.tv - New WF4 Caching Activities for Windows Server AppFabric ActivitiesDataCacheAdd DataCacheGet DataCachePut DataCacheGet DataCacheRemove WaitForCacheBulkNotification WaitForCacheNotification WaitForFailureNotification WaitForItemNotification WaitForRegionNotification Unit TestsUnit tests are included in the source. Be sure to star...My Web Pages Starter Kit: 1.3.1 Production Release (Security HOTFIX): Due to a critical security issue, it's strongly advised to update the My Web Pages Starter Kit to this version. Possible attackers could misuse the image upload to transmit any type of file to the website. If you already have a running version of My Web Pages Starter Kit 1.3.0, you can just replace the ftb.imagegallery.aspx file in the root directory with the one attached to this release.EnhSim: EnhSim 2.2.0 ALPHA: 2.2.0 ALPHAThis release adds in the changes for 4.03a. at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. 
This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Updated En...ASP.NET MVC Project Awesome (jQuery Ajax helpers): 1.4: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager new stuff: popup WhiteSpaceFilterAttribute tested on mozilla, safari, chrome, opera, ie 9b/8/7/6nopCommerce. ASP.NET open source shopping cart: nopCommerce 1.90: To see the full list of fixes and changes please visit the release notes page (http://www.nopCommerce.com/releasenotes.aspx).TweetSharp: TweetSharp v2.0.0.0 - Preview 4: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Note: This code is currently preview quality. Preview 4 ChangesReintroduced fluent interface support via satellite assembly Added entities support, entity segmentation, and ITweetable/ITweeter interfaces for client development Numerous fixes reported by preview users Preview 3 ChangesNumerous fixes and improvements to core engine Twitter API coverage: a...Aura: Aura Preview 1: Rewritten from scratch. This release supports getting color only from icon of foreground window.MBG Extensions Library: MBG.Extensions_v1.3: MBG.Extensions Collections.CollectionExtensions - AddIfNew - RemoveRange (Moved From ListExtensions to here, where it should have been) Collections.EnumerableExtensions - ToCommaSeparatedList has been replaced by: Join() and ToValueSeparatedList Join is for a single line of values. ToValueSeparatedList is generally for collection and will separate each entity in the collection by a new line character - ToQueue - ToStack Core.ByteExtensions - TripleDESDecrypt Core.DateTimeExtension...myCollections: Version 1.2: New in version 1.2: Big performance improvement. New Design (Added Outlook style View, New detail view, New Groub By...) Added Sort by Media Added Manage Movie Studio Zoom preference is now saved. Media name are now editable. 
Added Portuguese version You can now Hide details panel Add support for FLAC tags You can now imports books from BibTex Xml file BugFixingmytrip.mvc (CMS & e-Commerce): mytrip.mvc 1.0.49.0 beta: mytrip.mvc 1.0.49.0 beta web Web for install hosting System Requirements: NET 4.0, MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) mytrip.mvc 1.0.49.0 beta src System Requirements: Visual Studio 2010 or Web Deweloper 2010 MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) Connector/Net 6.3.4, MVC3 RC WARNING For run and debug mytrip.mvc 1.0.49.0 beta src download and ...Menu and Context Menu for Silverlight 4.0: Silverlight Menu and Context Menu v2.3 Beta: - Added keyboard navigation support with access keys - Shortcuts like Ctrl-Alt-A are now supported(where the browser permits it) - The PopupMenuSeparator is now completely based on the PopupMenuItem class - Moved item manipulation code to a partial class in PopupMenuItemsControl.cs - Moved menu management and keyboard navigation code to the new PopupMenuManager class - Simplified the layout by removing the RootGrid element(all content is now placed in OverlayCanvas and is accessed by the new ...MiniTwitter: 1.62: MiniTwitter 1.62 ???? ?? ??????????????????????????????????????? 140 ?????????????????????????? ???????????????????????????????? ?? ??????????????????????????????????Phalanger - The PHP Language Compiler for the .NET Framework: 2.0 (December 2010): The release is targetted for stable daily use. With improved performance and enhanced compatibility with several latest PHP open source applications; it makes this release perfect replacement of your old PHP runtime. Changes made within this release include following and much more: Performance improvements based on real-world applications experience. We determined biggest bottlenecks and we found and removed overheads causing performance problems in many PHP applications. Reimplemented nat...Chronos WPF: Chronos v2.0 Beta 3: Release notes: Updated introduction document. Updated Visual Studio 2010 Extension (vsix) package. Added horizontal scrolling to the main window TaskBar. Added new styles for ListView, ListViewItem, GridViewColumnHeader, ... Added a new WindowViewModel class (allowing to fetch data). Added a new Navigate method (with several overloads) to the NavigationViewModel class (protected). Reimplemented Task usage for the WorkspaceViewModel.OnDelete method. Removed the reflection effect...New Projects:WinK: WinK Project1000 bornes: This project is the adaptation of the famous French card game 1000 bornes (http://en.wikipedia.org/wiki/Mille_Bornes) There will be 3 types of clients: - Windows Application (WPF) - Internet Application (ASP.NET, Ajax) - Silverlight Application It's developed in C#.AutomaTones: BDSA Project 2010. Team Anders is developing an application that uses automatons to generate music.EIRENE: UnknownFinal: TDD driven analize of avalable tdd frameworks ect.HomeGrown Database Project tools: A set of tools that can be used to deploy Visual Studio SQL Databse and Server Projects. 
Developed using Visual Basic .Net 4.0mcssolution: no summaryMobile-enabled ASP.NET Web Forms / MVC application samples: Code samples for the whitepaper "Add mobile pages to your ASP.NET Web Forms / MVC application" linked from http://asp.net/mobileMSDI Projects: www.msdi.cnObject TreeView Visualizer: This is a Helper Library for easy Access to Visual a Object to an treeview. Nice feature to display data, if an error happen. One Place To Rule Them All: Desktop system to manage basics system functions in 3d environmant. Optra also provide community support and easy transfer data and setups between varius devices using xmpp protocol and OpenFire jabber server.Performance Data Suite: The Performance Data Suite will help you to monitor, analyze and optimize your server infrastructure. There will be predefined sets of data collections(e.g. MySQL, Apache, IIS) but it will also help you to create collections on your own.Secure Group Communication in AdHoc Networks: Secure Group Communication in AdHoc Networksimweb: simweb - is a research project which own by GCR and all its copyright belong to GCR. You can download the code for reference only but not able to be commercial without a fees.starLiGHT.Engine: starLiGHT.Engine is a set of libraries for indie game developers using XNA. It is in development for some years now as a closed source project. Now I will release some (most) parts as Open Source (dual licensing).UMC? ???? .NET ??? ?? ???? ???: ???(Junil, Um)? ???? .NET ???? ?? ?? ???? ??? ???.University of Ottawa tour for WP7: This is a Windows Phone 7 tour guide app for the University of Ottawa. vutpp for VS2010: C++ UnitTest Gui Addin????: ????

    Read the article

  • SQL SERVER – 2008 – Missing Index Script – Download

    - by pinaldave
    Download Missing Index Script with Unused Index Script
    Performance tuning is quite interesting, and indexes play a vital role in it. A proper index can improve performance and a bad index can hamper performance. Here is the script from my script bank which I use to identify missing indexes on any database. Please note: you should not create all of the missing indexes this script suggests. It is just for guidance. You should not create more than 5-10 indexes per table. Additionally, this script sometimes does not give accurate information, so use your common sense. Anyway, the script is a good starting point. Pay attention to Avg_Estimated_Impact when you are going to create an index. The index creation script is also provided in the last column. Download Missing Index Script with Unused Index Script
    -- Missing Index Script
    -- Original Author: Pinal Dave (C) 2011
    SELECT TOP 25
    dm_mid.database_id AS DatabaseID,
    dm_migs.avg_user_impact*(dm_migs.user_seeks+dm_migs.user_scans) Avg_Estimated_Impact,
    dm_migs.last_user_seek AS Last_User_Seek,
    OBJECT_NAME(dm_mid.OBJECT_ID,dm_mid.database_id) AS [TableName],
    'CREATE INDEX [IX_' + OBJECT_NAME(dm_mid.OBJECT_ID,dm_mid.database_id) + '_'
    + REPLACE(REPLACE(REPLACE(ISNULL(dm_mid.equality_columns,''),', ','_'),'[',''),']','')
    + CASE WHEN dm_mid.equality_columns IS NOT NULL AND dm_mid.inequality_columns IS NOT NULL THEN '_' ELSE '' END
    + REPLACE(REPLACE(REPLACE(ISNULL(dm_mid.inequality_columns,''),', ','_'),'[',''),']','') + ']'
    + ' ON ' + dm_mid.statement
    + ' (' + ISNULL (dm_mid.equality_columns,'')
    + CASE WHEN dm_mid.equality_columns IS NOT NULL AND dm_mid.inequality_columns IS NOT NULL THEN ',' ELSE '' END
    + ISNULL (dm_mid.inequality_columns, '') + ')'
    + ISNULL (' INCLUDE (' + dm_mid.included_columns + ')', '') AS Create_Statement
    FROM sys.dm_db_missing_index_groups dm_mig
    INNER JOIN sys.dm_db_missing_index_group_stats dm_migs
    ON dm_migs.group_handle = dm_mig.index_group_handle
    INNER JOIN sys.dm_db_missing_index_details dm_mid
    ON dm_mig.index_handle = dm_mid.index_handle
    WHERE dm_mid.database_ID = DB_ID()
    ORDER BY Avg_Estimated_Impact DESC
    GO
    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Download, SQL Index, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
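    The download mentioned above also bundles an Unused Index Script that is not reproduced in the post. As a rough sketch of that idea (this is an illustrative query against the standard sys.dm_db_index_usage_stats DMV, not the bundled script itself), you could start from something like:

        -- Sketch: nonclustered indexes that are maintained (written to) but never read.
        -- Usage counters reset when the instance restarts, so judge with care.
        SELECT
            OBJECT_NAME(i.OBJECT_ID) AS TableName,
            i.name AS IndexName,
            ISNULL(u.user_seeks + u.user_scans + u.user_lookups, 0) AS TotalReads,
            ISNULL(u.user_updates, 0) AS TotalWrites
        FROM sys.indexes i
        LEFT JOIN sys.dm_db_index_usage_stats u
            ON u.OBJECT_ID = i.OBJECT_ID
            AND u.index_id = i.index_id
            AND u.database_id = DB_ID()
        WHERE OBJECTPROPERTY(i.OBJECT_ID, 'IsUserTable') = 1
            AND i.type_desc = 'NONCLUSTERED'
            AND ISNULL(u.user_seeks + u.user_scans + u.user_lookups, 0) = 0
        ORDER BY ISNULL(u.user_updates, 0) DESC;
        GO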

    Read the article

  • Optimize SUMMARIZE with ADDCOLUMNS in Dax #ssas #tabular #dax #powerpivot

    - by Marco Russo (SQLBI)
    If you started using DAX as a query language, you might have encountered some performance issues when using SUMMARIZE. The problem is related to the calculations you put in the SUMMARIZE by adding what are called extension columns, which compute their value within a filter context defined by the rows considered in the group that the SUMMARIZE uses to produce each row in the output. Most of the time, for simple table expressions used in the first parameter of SUMMARIZE, you can optimize performance by removing the extended columns from the SUMMARIZE and adding them by using an ADDCOLUMNS function. In practice, instead of writing SUMMARIZE( <table>, <group_by_column>, <column_name>, <expression> ) you can write: ADDCOLUMNS( SUMMARIZE( <table>, <group_by_column> ), <column_name>, CALCULATE( <expression> ) ) The performance difference might be huge (orders of magnitude), but this optimization might produce different semantics, and in those cases it should not be used. A longer discussion of this topic is included in my Best Practices Using SUMMARIZE and ADDCOLUMNS article on SQLBI, which also includes several details about the DAX syntax with extended columns. For example, did you know that you can create an extended column in SUMMARIZE and ADDCOLUMNS with the same name as existing measures? It is *not* a good thing to do, and by reading the article you will discover why. Enjoy DAX!
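    As a concrete illustration, assuming a hypothetical Sales table related to a Product table (the table, column, and output-column names here exist only for the example), the two forms of the same query would look like:

        -- Slower: the extension column is computed inside SUMMARIZE
        EVALUATE
        SUMMARIZE (
            Sales,
            'Product'[Color],
            "Sales Amount", SUM ( Sales[Amount] )
        )

        -- Usually faster: group first, then add the column with ADDCOLUMNS
        EVALUATE
        ADDCOLUMNS (
            SUMMARIZE ( Sales, 'Product'[Color] ),
            "Sales Amount", CALCULATE ( SUM ( Sales[Amount] ) )
        )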

    Read the article

  • Great Blogs About Oracle Solaris 11

    - by Markus Weber
    Now that Oracle Solaris 11 has been released, why not blog about blogs. There is of course a tremendous amount of resources and information available, but valuable insights directly from the people actually building the product are priceless. Here's a list of such great blogs. NOTE: If you think we missed some good ones, please let us know in the comments section!
    Topic | Title | Author
    Top 11 Things | My 11 favourite Solaris 11 features | Darren Moffat
    Top 11 Things | These are 11 of my favorite things! | Mike Gerdts
    Top 11 Things | 11 reason to love Solaris 11 | Jim Laurent
    SysAdmin Resources | Solaris 11 Resources for System Administrators | Rick Ramsey
    Overview | Oracle Solaris 11: The First Cloud OS | Larry Wake
    Overview | What's a "Cloud Operating System"? | Harry Foxwell
    Overview | What's New in Oracle Solaris 11 | Jeff Victor
    Try it! | Virtually the fastest way to try Solaris 11 (and Solaris 10 zones) | Dave Miner
    Upgrade | Upgrading Solaris 11 Express b151a with support to Solaris 11 | Alan Hargreaves
    IPS | The IPS System Repository | Tim Foster
    IPS | Building a Solaris 11 repository without network connection | Jim Laurent
    IPS | IPS Self-assembly – Part 1: overlays | Tim Foster
    IPS | Self assembly – Part 2: multiple packages delivering configuration | Tim Foster
    Security | Immutable Zones on Encrypted ZFS | Darren Moffat
    Security | User home directory encryption with ZFS | Darren Moffat
    Security | Password (PAM) caching for Solaris su - "a la sudo" | Darren Moffat
    Security | Completely disabling root logins on Solaris 11 | Darren Moffat
    Security | OpenSSL Version in Solaris | Darren Moffat
    Security | Exciting Crypto Advances with the T4 processor and Oracle Solaris 11 | Valerie Fenwick
    Performance | Critical Threads Optimization | Rafael Vanoni
    Performance | SPARC T4-2 Delivers World Record SPECjvm2008 Result with Oracle Solaris 11 | BestPerf Blog
    Performance | Recent Benchmarks Using Oracle Solaris 11 | BestPerf Blog
    Predictive Self Healing | Introducing SMF Layers | Sean Wilcox
    Predictive Self Healing | Oracle Solaris 11 - New Fault Management Features | Gavin Maltby
    Desktop | What's new on the Solaris 11 Desktop? | Calum Benson
    Desktop | S11 X11: ye olde window system in today's new operating system | Alan Coopersmith
    Desktop | Accessible Oracle Solaris 11 - released! | Peter Korn

    Read the article

  • How to penetrate the QA industry after layoffs, next steps...

    - by Erik
    Briefly, my background is in manual black box testing of websites and applications within the Agile/waterfall context. Over the past four years I was a member of two web development firms' small QA teams dedicated to testing the deployment of websites for national/international non-profits, governmental organizations, and for-profit businesses, to name a few:
    - Brookings Institution
    - Senate
    - Tyco Electronics
    - Blue Cross/Blue Shield
    - National Geographic
    - Discover Channel
    I have a very strong understanding of the:
    - SDLC
    - STLC of bugs and website deployment/development
    - Use Case & Test Case development
    In March of this year my last firm downsized and I lost my job as a QA tester. I have been networking and doing a very detailed job search, but have had a very difficult time getting my next job within the QA industry, even with my background as a manual black box QA tester in the website development context. My direct question to all of you: what are some ways I can be more competitive and get hired? Options that could make me competitive: Should I go back to school and learn some more 'hard' skills in website development and client-side technologies, e.g.:
    - HTML
    - CSS
    - JavaScript
    Learn programming:
    - PHP
    - C#
    - Ruby
    - SQL
    - Python
    - Perl
    - ??
    Get certified as a QA tester? There are countless programs to become a Certified Tester. Most, if not all, jobs being advertised now require automated testing experience in:
    - QTP
    - Loadrunner
    - Selenium
    - etc.
    Should I learn automated testing skills via a paid course, or teach myself? Should I learn scripting languages to understand the automated testing process better? Become a Certified "Project Management Professional" (PMP) to prove to hiring managers that I 'get' the project development life cycle? At the end of the day I need to be competitive and get hired as a QA tester, and I want to build upon my skills within the QA web development field. How should I do this, without reinventing the wheel? Any help in this regard would be fabulous. Thanks! .erik

    Read the article

  • links for 2010-04-13

    - by Bob Rhubart
    Frederic Michiar: Manage a flexible and elastic Data Center with Oracle VM Manager Frederic Michiar shares a list of Oracle VM resources. (tags: otn oracle virtualization) Mona Rakibe: BAM Data Control in multiple ADF Faces Components "When two or more ADF Faces components must display the same data, and are bound to the same Oracle BAM data control definition, we have to make sure that we wrap each ADF Faces component in an ADF task flow, and set the Data Control Scope to isolated. " Mona Rakibe shows you how. (tags: oracle otn soa bam adf) Martin Widlake: Performance Tipping Points Martin Widlake offers "a nice example of a performance tipping point. This is where Everything is OK until you reach a point where it all quickly cascades to Not OK." (tags: oracle otn database architecture performance) Steve Chan: EBS Techstack Sessions at OAUG/Collaborate 2010 Steve Chan shares a list of Collaborate 2010 sessions featuring Oracle E-Business Suite Applications Technology Group staffers. (tags: oracle otn collaborate2010 ebs) @ORACLENERD: Developing in APEX Oracle ACE Chet Justice counts the ways... (tags: otn oracle oracleace apex) @bex: Almost Time For IOUG Collaborate 2010 Oracle ACE Director Bex Huff shares details on his Collaborate 2010 presentation, "The Top 10 Things Oracle UCM Customers Need To Know About WebLogic:" (tags: oracle otn oracleace collaborate2010 weblogic ucm enterprise2.0)

    Read the article

  • MySQL Connect Content Catalog Live

    - by Bertrand Matthelié
    The MySQL Connect Content Catalog is now live and you can check out the great program the content committee put together for you. We received a lot of very good submissions during the call for papers and we’d like to thank you all again for those; it was a very difficult job to choose. Overall, over its two days MySQL Connect will include:
    - Keynotes, with speakers such as Oracle Chief Corporate Architect Edward Screven and Vice President of MySQL Engineering Tomas Ulin
    - 66 conference sessions, enabling you to hear from: Oracle engineers on MySQL 5.6 new features, InnoDB, performance and scalability, security, NoSQL, MySQL Cluster…and more; MySQL users and customers including Facebook, Twitter, PayPal, Yahoo, Ticketmaster, and CERN; and internationally recognized MySQL community members and partners on topics such as performance, security or high availability
    - 6 Birds-of-a-Feather sessions, in which you’ll be able to engage in passionate discussions about replication, backup and other subjects, and help influence the MySQL roadmap
    - 8 Hands-On Labs designed to give you hands-on experience with MySQL replication, MySQL Cluster, the MySQL Performance Schema…and more
    - Demo pods about MySQL Workbench, MySQL Cluster, MySQL Enterprise Edition and other technologies and services
    We’ll also have networking receptions on both Saturday and Sunday evening, enabling you to talk with the Oracle engineers developing and supporting the MySQL products, as well as with other users and customers. Additionally, you’ll have the opportunity to meet and learn from our partners in the exhibition hall. Some of the MySQL Connect speakers, such as Henrik Ingo and Andrew Morgan, have already blogged about their presence at MySQL Connect, and you can find more information about their sessions or their thoughts about the conference in their blogs. We also published an interview with Tomas Ulin a few weeks ago. In summary, don’t miss MySQL Connect! And you only have about 3 weeks left to register with the early bird discount and save US$500. Don’t wait, Register Now! Interested in sponsorship and exhibit opportunities? You will find more information here.

    Read the article

  • Java, the Cloud, and Oracle at QCon San Francisco 2011

    - by Bob Rhubart
    If you're part of the lucky bunch attending this week's sold-out QCon San Francisco conference at the Westin San Francisco Market Street, I'd like to bring several sessions to your attention. On Wednesday Nov 16, Alex Buckley, specification lead for the Java Language and the Java Virtual Machine at Oracle, will present Java 7 and 8: Where We've Been, Where We're Going, part of the Why is Java still sexy? track. The session begins at 10:35 a.m. in the Olympic room. On Thursday Nov 17, Tyler Jewell, VP Product Management for Oracle's Platform as a Service, will participate in the Performance and Scalability Panel moderated by InfoQ founder and QCon SF Program Committee Member Floyd Marinescu. That panel, part of the Performance and Scalability Solutions track, begins at 10:35 a.m. in the Olympic room. Following that panel discussion, Tyler will fly solo with a presentation on Java EE 7: Developing for the Cloud, also part of the Performance and Scalability Solutions track. That session kicks off at 12:05 p.m., also in the Olympic room. On Friday Nov 18 Tyler will jump tracks, so to speak, when he presents The Architecture of Oracle's Public Cloud, part of the Architecture Case Studies: Cloud track. That session begins at 4:50 p.m. in the Stanford room. Of course, QCon also offers ample meet-and-greet opportunities. One such opportunity happens in the hospitality suite hosted by the Java Community Process Executive Committee. That shindig gets in gear at 5:50 pm on Thursday. Throughout the QCon San Francisco conference, members of the OTN team (including yours truly) and members of the Oracle Fusion Middleware team will be on hand at the OTN booth in the conference lobby. Stop by to say hello, score some swag, and catch a demo or two.

    Read the article

  • You Need BRM When You have EBS – and Even When You Don’t!

    - by bwalstra
    Here is a list of criteria to test your business systems (Oracle E-Business Suite, EBS) or whatever else supports your lines of digital business - if you score low, you need Oracle Billing and Revenue Management (BRM).
    Functions: Scalability; High Availability (99.999%); Performance; Extensibility (e.g. APIs, Tools); Upgradability; Maintenance; Security; Standards Compliance; Regulatory Compliance (e.g. SOX); User Experience; Implementation Complexity
    Features: Customer Management; Real-Time Service Authorization; Pricing/Promotions Flexibility; Subscriptions; Usage Rating and Pricing; Real-Time Balance Mgmt.; Non-Currency Resources; Billing & Invoicing; A/R & G/L; Payments & Collections; Revenue Assurance; Integration with Key Enterprise Applications; Reporting; Business Intelligence; Order & Service Mgmt (OSM); Siebel CRM; E-Business Suite; On-/Off-line Mediation; Payment Processing; Taxation; Royalties & Settlements
    Operations Management: Disaster Recovery
    Overall Evaluation: Implementation; Configuration; Extensibility; Maintenance; Upgradability; Functional Richness; Feature Richness; Usability; OOB Integrations; Operations Management; Leveraging Oracle Technology; Overall Fit for Purpose
    You need Oracle BRM:
    - Built for high-volume transaction processing
    - Monetizes any service or event based on any metric
    - Supports high-volume usage rating, pricing and promotions
    - Provides real-time charging, service authorization and balance management
    - Supports any account structure (e.g. corporate hierarchies etc.)
    - Scales from low volumes to extremely high volumes of transactions (e.g. billions of trxn per hour)
    - Exposes every single function via APIs (e.g. Java, C/C++, PERL, COM, Web Services, JCA)
    Immediate Business Benefits of BRM:
    - Improved business agility and performance: supports the flexibility, innovation, and customer-centricity required for current and future business models
    - Faster time to market for new products and services: supports a 360 view of the customer in real time; products can be launched to targeted customers at a record-breaking pace
    - Streamlined deployment and operation: productized integrations, standards-based APIs, and OOB enablement lower deployment and maintenance costs
    - Extensible and scalable solution: minimizes risk; initial phase deployed rapidly; solution extended and scaled seamlessly per business requirements
    Key Considerations:
    - Productized integration with key Oracle applications: lower integration risks and cost; efficient order-to-cash process
    - Engineered solution (certification on Exa platform): Exadata tested at PayPal in the re-platforming project; optimal performance of Oracle assets on Oracle hardware
    - Productized solution in Rapid Offer Design and Order Delivery: fast offer design and implementation; significantly shorter order cycle time
    - Productized integration with Oracle Enterprise Manager: visibility to system operability for optimal up time

    Read the article

  • Role of Sharepoint experience in career growth

    - by Syed Ibrahim
    I am from India. I was a mainframe developer for the first 2 years of my IT career and then shifted to Microsoft .Net, and have completed 3 years in it as of now. In these 3 years as a .Net developer I have worked only in core .Net skills like ASP.NET and SQL Server with C#. I never worked in more advanced skills like Web Services, WCF, Silverlight etc. In the current world market scenario, I feel SharePoint experience weighs more than WCF or Web Services work experience for a .Net developer (please correct me if I am wrong). So I am planning to study SharePoint through some training centre and complete a SharePoint certification. The main reason for me to go for SharePoint is that I feel it is a niche skill and it will help me to get a job in an abroad location in the future. Please let me know whether SharePoint can help me to get a job in a foreign location. I would also like to know: is it possible to master SharePoint without any experience in skills like WCF, Web Services etc.? Is it possible to get a SharePoint job just with knowledge and certification in it? In case SharePoint will not offer me career growth, can you please suggest the skills which will offer great career growth (like foreign jobs) for me as a .Net developer?

    Read the article

  • Going Paperless

    - by Jesse
    One year ago I came to work for a company where the entire development team is 100% “remote”; we’re spread over 3 time zones and each of us works from home. This seems to be an increasingly popular way for people to work and there are many articles and blog posts out there enumerating the advantages and disadvantages of working this way. I had read a lot about telecommuting before accepting this job and felt as if I had a pretty decent idea of what I was getting into, but I’ve encountered a few things over the past year that I did not expect. Among the most surprising by-products of working from home for me has been a dramatic reduction in the amount of paper that I use on a weekly basis. Hoarding In The Workplace Prior to my current telecommute job I worked in what most would consider pretty traditional office environments. I sat in cubicles furnished with an enormous plastic(ish) modular desks, had a mediocre (at best) PC workstation, and had ready access to a seemingly endless supply of legal pads, pens, staplers and paper clips. The ready access to paper, countless conference room meetings, and abundance of available surface area on my desk and in drawers created a perfect storm for wasting paper. I brought a pad of paper with me to every meeting I ever attended, scrawled some brief notes, and then tore that sheet off to keep next to my keyboard to follow up on any needed action items. Once my immediate need for the notes was fulfilled, that sheet would get shuffled off into a corner of my desk or filed away in a drawer “just in case”. I would guess that for all of the notes that I ever filed away, I might have actually had to dig up and refer to 2% of them (and that’s probably being very generous). That said, on those rare occasions that I did have to dig something up from old notes, it was usually pretty important and I ended up being very glad that I saved them. It was only when I would leave a job or move desks that I would finally gather all those notes together and take them to shredding bin to be disposed of. When I left my last job the amount of paper I had accumulated over my three years there was absurd, and I knew coworkers who had substance-abuse caliber paper wasting addictions that made my bad habit look like nail-biting in comparison. A Product Of My Environment I always hated using all of this paper, but simply couldn’t bring myself to stop. It would look bad if I showed up to an important conference room meeting without a pad of paper. What if someone said something profound! Plus, everyone else always brought paper with them. If you saw someone walking down the hallway with a pad of paper in hand you knew they must be on their way to a conference room meeting. Some people even had fancy looking portfolio notebook sheaths that gave their legal pads all the prestige of a briefcase. No one ever worried about running out of fresh paper because there was an endless supply, and there certainly was no shortage of places to store and file used paper. In short, the traditional office was setup for using tons and tons of paper; it’s baked into the culture there. For that reason, it didn’t take long for me to kick the paper habit once I started working from home. In my home office, desk and drawer space are at a premium. I don’t have the budget (or the tolerance) for huge modular office furniture in my spare bedroom. I also no longer have access to a bottomless pit of office supplies stock piled in cabinets and closets. 
If I want to use some paper, I have to go out and buy it. Finally (and most importantly), all of the meetings that I have to attend these days are “virtual”. We use instant messaging, VOIP, video conferencing, and e-mail to communicate with each other. All I need to take notes during a meeting is my computer, which I happen to be sitting right in front of all day. I don’t have any hard numbers for this, but my gut feeling is that I actually take a lot more notes now than I ever did when I worked in an office. The big difference is I don’t have to use any paper to do so. This makes it far easier to keep important information safe and organized. The Right Tool For The Job When I first started working from home I tried to find a single application that would fill the gap left by the pen and paper that I always had at my desk when I worked in an office. Well, there are no silver bullets and I’ve evolved my approach over time to try and find the best tool for the job at hand. Here’s a quick summary of how I take notes and keep everything organized. Notepad++ – This is the first application I turn to when I feel like there’s some bit of information that I need to write down and save. I use Launchy, so opening Notepad++ and creating a new file only takes a few keystrokes. If I find that the information I’m trying to get down requires a more sophisticated application I escalate as needed. The Desktop – By default, I save every file or other bit of information to the desktop. Anyone who has ever had to fix their parents computer before knows that this is a dangerous game (any file my mother has ever worked on is saved directly to the desktop and rarely moves anywhere else). I agree that storing things on the desktop isn’t a great long term approach to keeping organized, which is why I treat my desktop a bit like my e-mail inbox. I strive to keep both empty (or as close to empty as I possibly can). If something is on my desktop, it means that it’s something relevant to a task or project that I’m currently working on. About once a week I take things that I’m not longer working on and put them into my ‘Notes’ folder. The ‘Notes’ Folder – As I work on a task, I tend to accumulate multiple files associated with that task. For example, I might have a bit of SQL that I’m working on to gather data for a new report, a quick C# method that I came up with but am not yet ready to commit to source control, a bulleted list of to-do items in a .txt file, etc. If the desktop starts to get too cluttered, I create a new sub-folder in my ‘Notes’ folder. Each sub-folder’s name is the current date followed by a brief description of the task or project. Then all files related to that task or project go into that sub folder. By using the date as the first part of the folder name, these folders are automatically sorted in reverse chronological order. This means that things I worked on recently will generally be near the top of the list. Using the built-in Windows search functionality I now have a pretty quick and easy way to try and find something that I worked on a week ago or six months ago. Dropbox – Dropbox is a free service that lets you store up to 2GB of files “in the cloud” and have those files synced to all of the different computers that you use. My ‘Notes’ folder lives in Dropbox, meaning that it’s contents are constantly backed up and are always available to me regardless of which computer I’m using. 
They also have a pretty decent iPhone application that lets you browse and view all of the files that you have stored there. The free 2GB edition is probably enough for just storing notes, but I also pay $99/year for the 50GB storage upgrade and keep all of my music, e-books, pictures, and documents in Dropbox. It’s a fantastic service and I highly recommend it. Evernote – I use Evernote mostly to organize information that I access on a fairly regular basis. For example, my Evernote account has a running grocery shopping list, recipes that my wife and I use a lot, and contact information for people I contact infrequently enough that I don’t want to keep them in my phone. I know some people that keep nearly everything in Evernote, but there’s something about it that I find a bit clunky, so I tend to use it sparingly. Google Tasks – One of my biggest paper wasting habits was keeping a running task-list next to my computer at work. Every morning I would sit down, look at my task list, cross off what was done and add new tasks that I thought of during my morning commute. This usually resulted in having to re-copy the task list onto a fresh sheet of paper when I was done. I still keep a running task list at my desk, but I’ve started using Google Tasks instead. This is a dead-simple web-based application for quickly adding, deleting, and organizing tasks in a simple checklist style. You can quickly move tasks up and down on the list (which I use for prioritizing), and even create sub-tasks for breaking down larger tasks into smaller pieces. Balsamiq Mockups – This is a simple and lightweight tool for creating drawings of user interfaces. It’s great for sketching out a new feature, brainstorm the layout of a interface, or even draw up a quick sequence diagram. I’m terrible at drawing, so Balsamiq Mockups not only lets me create sketches that other people can actually understand, but it’s also handy because you can upload a sketch to a common location for other team members to access. I can honestly say that using these tools (and having limited resources at home) have lead me to cut my paper usage down to virtually none. If I ever were to return to a traditional office workplace (hopefully never!) I’d try to employ as many of these applications and techniques as I could to keep paper usage low. I feel far less cluttered and far better organized now.

    Read the article

  • Why is HTML/Javascript minification beneficial

    - by Channel72
    Why is HTML/Javascript minification beneficial when the HTTP protocol already supports gzip data compression? I realize that Javascript/HTML minification has the potential to significantly reduce the size of Javascript/HTML files by removing unnecessary whitespace, and perhaps renaming variables to a few letters each, but doesn't the LZW algorithm do especially well when there are many repeated characters (e.g. lots of whitespace)? I realize that some Javascript minification tools do more than just reduce size. Google's Closure Compiler, for example, also tries to improve code performance by inlining functions and doing other analyses. But the primary purpose of Javascript minification is usually to reduce file size. I also realize there are other reasons you might want to minify aside from performance, such as code obfuscation. But again, that reason is not usually emphasized as much as performance gain and file size reduction. For example, Closure Compiler is not advertised as an obfuscation tool, but as a code size reducer and download-speed enhancer. So, how much performance do you really gain from Javascript/HTML minification when you're already significantly reducing file size with gzip compression?
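    To make the question concrete, here is an illustrative before-and-after (hypothetical code, not the output of any particular minifier). The question is essentially whether the identifier renaming shown in the second version still helps once gzip has already dealt with the whitespace and comments:

        // Original source: descriptive names, comments, whitespace.
        function calculateTotalPrice(items, taxRate) {
            var totalPrice = 0;
            for (var index = 0; index < items.length; index++) {
                totalPrice += items[index].price;
            }
            return totalPrice * (1 + taxRate);
        }

        // Roughly what a minifier emits: same behavior, local names shortened.
        function calculateTotalPrice(t,r){var n=0;for(var i=0;i<t.length;i++)n+=t[i].price;return n*(1+r)}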

    Read the article

  • Switching to a career in Machine Learning

    - by Naive Machine Learner
    My day job is plain old software development. I am also doing my Masters in CS (part time, course based). I took a course on AI and found machine learning quite fascinating, but like most courses it only offered a basic intro. I intend to learn more about machine learning and, if possible, get a job in that field. When I look at job postings in this field it is clear that a PhD in machine learning (or prior experience in the field with considerable expertise) is required for most of them. I'm looking for advice on self-learning to gain experience that'll be useful in industry. At least enough experience to get my foot in the door. I will do the obvious things like reading textbooks, papers etc. Perhaps there are open source efforts that I can participate in, or something I could do on my own? Apologies if I'm being vague here, but I hope there are at least a few of you who have done a similar switch and can advise. Thanks!

    Read the article

  • 10 Windows Tweaking Myths Debunked

    - by Chris Hoffman
    Windows is big, complicated, and misunderstood. You’ll still stumble across bad advice from time to time when browsing the web. These Windows tweaking, performance, and system maintenance tips are mostly just useless, but some are actively harmful. Luckily, most of these myths have been stomped out on mainstream sites and forums. However, if you start searching the web, you’ll still find websites that recommend you do these things. Erase Cache Files Regularly to Speed Things Up You can free up disk space by running an application like CCleaner, another temporary-file-cleaning utility, or even the Windows Disk Cleanup tool. In some cases, you may even see an old computer speed up when you erase a large amount of useless files. However, running CCleaner or similar utilities every day to erase your browser’s cache won’t actually speed things up. It will slow down your web browsing as your web browser is forced to redownload the files all over again, and reconstruct the cache you regularly delete. If you’ve installed CCleaner or a similar program and run it every day with the default settings, you’re actually slowing down your web browsing. Consider at least preventing the program from wiping out your web browser cache. Enable ReadyBoost to Speed Up Modern PCs Windows still prompts you to enable ReadyBoost when you insert a USB stick or memory card. On modern computers, this is completely pointless — ReadyBoost won’t actually speed up your computer if you have at least 1 GB of RAM. If you have a very old computer with a tiny amount of RAM — think 512 MB — ReadyBoost may help a bit. Otherwise, don’t bother. Open the Disk Defragmenter and Manually Defragment On Windows 98, users had to manually open the defragmentation tool and run it, ensuring no other applications were using the hard drive while it did its work. Modern versions of Windows are capable of defragmenting your file system while other programs are using it, and they automatically defragment your disks for you. If you’re still opening the Disk Defragmenter every week and clicking the Defragment button, you don’t need to do this — Windows is doing it for you unless you’ve told it not to run on a schedule. Modern computers with solid-state drives don’t have to be defragmented at all. Disable Your Pagefile to Increase Performance When Windows runs out of empty space in RAM, it swaps out data from memory to a pagefile on your hard disk. If a computer doesn’t have much memory and it’s running slow, it’s probably moving data to the pagefile or reading data from it. Some Windows geeks seem to think that the pagefile is bad for system performance and disable it completely. The argument seems to be that Windows can’t be trusted to manage a pagefile and won’t use it intelligently, so the pagefile needs to be removed. As long as you have enough RAM, it’s true that you can get by without a pagefile. However, if you do have enough RAM, Windows will only use the pagefile rarely anyway. Tests have found that disabling the pagefile offers no performance benefit. Enable CPU Cores in MSConfig Some websites claim that Windows may not be using all of your CPU cores or that you can speed up your boot time by increasing the amount of cores used during boot. They direct you to the MSConfig application, where you can indeed select an option that appears to increase the amount of cores used. In reality, Windows always uses the maximum amount of processor cores your CPU has. 
(Technically, only one core is used at the beginning of the boot process, but the additional cores are quickly activated.) Leave this option unchecked. It’s just a debugging option that allows you to set a maximum number of cores, so it would be useful if you wanted to force Windows to only use a single core on a multi-core system — but all it can do is restrict the amount of cores used. Clean Your Prefetch To Increase Startup Speed Windows watches the programs you run and creates .pf files in its Prefetch folder for them. The Prefetch feature works as a sort of cache — when you open an application, Windows checks the Prefetch folder, looks at the application’s .pf file (if it exists), and uses that as a guide to start preloading data that the application will use. This helps your applications start faster. Some Windows geeks have misunderstood this feature. They believe that Windows loads these files at boot, so your boot time will slow down due to Windows preloading the data specified in the .pf files. They also argue you’ll build up useless files as you uninstall programs and .pf files will be left over. In reality, Windows only loads the data in these .pf files when you launch the associated application and only stores .pf files for the 128 most recently launched programs. If you were to regularly clean out the Prefetch folder, not only would programs take longer to open because they won’t be preloaded, Windows will have to waste time recreating all the .pf files. You could also modify the PrefetchParameters setting to disable Prefetch, but there’s no reason to do that. Let Windows manage Prefetch on its own. Disable QoS To Increase Network Bandwidth Quality of Service (QoS) is a feature that allows your computer to prioritize its traffic. For example, a time-critical application like Skype could choose to use QoS and prioritize its traffic over a file-downloading program so your voice conversation would work smoothly, even while you were downloading files. Some people incorrectly believe that QoS always reserves a certain amount of bandwidth and this bandwidth is unused until you disable it. This is untrue. In reality, 100% of bandwidth is normally available to all applications unless a program chooses to use QoS. Even if a program does choose to use QoS, the reserved space will be available to other programs unless the program is actively using it. No bandwidth is ever set aside and left empty. Set DisablePagingExecutive to Make Windows Faster The DisablePagingExecutive registry setting is set to 0 by default, which allows drivers and system code to be paged to the disk. When set to 1, drivers and system code will be forced to stay resident in memory. Once again, some people believe that Windows isn’t smart enough to manage the pagefile on its own and believe that changing this option will force Windows to keep important files in memory rather than stupidly paging them out. If you have more than enough memory, changing this won’t really do anything. If you have little memory, changing this setting may force Windows to push programs you’re using to the page file rather than push unused system files there — this would slow things down. This is an option that may be helpful for debugging in some situations, not a setting to change for more performance. Process Idle Tasks to Free Memory Windows does things, such as creating scheduled system restore points, when you step away from your computer. 
It waits until your computer is “idle” so it won’t slow your computer and waste your time while you’re using it. Running the “Rundll32.exe advapi32.dll,ProcessIdleTasks” command forces Windows to perform all of these tasks while you’re using the computer. This is completely pointless and won’t help free memory or anything like that — all you’re doing is forcing Windows to slow your computer down while you’re using it. This command only exists so benchmarking programs can force idle tasks to run before performing benchmarks, ensuring idle tasks don’t start running and interfere with the benchmark. Delay or Disable Windows Services There’s no real reason to disable Windows services anymore. There was a time when Windows was particularly heavy and computers had little memory — think Windows Vista and those “Vista Capable” PCs Microsoft was sued over. Modern versions of Windows like Windows 7 and 8 are lighter than Windows Vista and computers have more than enough memory, so you won’t see any improvements from disabling system services included with Windows. Some people argue for not disabling services, however — they recommend setting services from “Automatic” to “Automatic (Delayed Start)”. By default, the Delayed Start option just starts services two minutes after the last “Automatic” service starts. Setting services to Delayed Start won’t really speed up your boot time, as the services will still need to start — in fact, it may lengthen the time it takes to get a usable desktop as services will still be loading two minutes after booting. Most services can load in parallel, and loading the services as early as possible will result in a better experience. The “Delayed Start” feature is primarily useful for system administrators who need to ensure a specific service starts later than another service. If you ever find a guide that recommends you set a little-known registry setting to improve performance, take a closer look — the change is probably useless. Want to actually speed up your PC? Try disabling useless startup programs that run on boot, increasing your boot time and consuming memory in the background. This is a much better tip than doing any of the above, especially considering most Windows PCs come packed to the brim with bloatware.     

    Read the article

  • Announcement: DTrace for Oracle Linux General Availability

    - by Zeynep Koch
    Today we are announcing the general availability of DTrace for Oracle Linux. It is available to download from ULN for Oracle Linux Support customers. DTrace is a comprehensive dynamic tracing framework that was initially developed for the Oracle Solaris operating system, and is now available to Oracle Linux customers. DTrace is designed to give operational insights that allow users to tune and troubleshoot the operating system. DTrace provides Oracle Linux developers with a tool to analyze performance and increase observability into the systems they own to see how they work. DTrace enables higher-quality application development, reduced downtime, lower cost, and greater utilization of existing resources. Key benefits and features of DTrace on Oracle Linux include:
    • Designed to work on finding performance bottlenecks
    • Dynamically enables the kernel with a number of probe points, improving the ability to service software
    • Enables maximum resource utilization and application performance
    • Fast and easy to use, even on complex systems with multiple layers of software
    If you already have Oracle Linux support, you can download DTrace from the ULN channel. We have a dedicated Forum for DTrace on Oracle Linux where you can discuss your experiences and ask questions.

    Read the article

  • Oracle Enterprise Manager sessions on the last day of the Oracle Open World

    - by Anand Akela
    Hope you have had a very productive Oracle Open World so far. Hopefully many of you attended the customer appreciation event last night at Treasure Island. We still have many Enterprise Manager related sessions today, Thursday, the last day of Oracle Open World 2012. Download the Oracle Enterprise Manager 12c OpenWorld schedule (PDF).
    Oracle Enterprise Manager Cloud Control 12c (and Private Cloud) sessions (Time | Title | Location):
    11:15 AM - 12:15 PM | Application Performance Matters: Oracle Real User Experience Insight | Palace Hotel - Sea Cliff
    11:15 AM - 12:15 PM | Advanced Management of JD Edwards EnterpriseOne with Oracle Enterprise Manager | InterContinental - Grand Ballroom B
    11:15 AM - 12:15 PM | Spark on SPARC Servers: Enterprise-Class IaaS with Oracle Enterprise Manager 12c | Moscone West - 3018
    11:15 AM - 12:15 PM | Pinpoint Production Applications’ Performance Bottlenecks by Using JVM Diagnostics | Marriott Marquis - Golden Gate C3
    11:15 AM - 12:15 PM | Bringing Order to the Masses: Scalable Monitoring with Oracle Enterprise Manager 12c | Moscone West - 3020
    12:45 PM - 1:45 PM | Improving the Performance of Oracle E-Business Suite Applications: Tips from a DBA’s Diary | Moscone West - 2018
    12:45 PM - 1:45 PM | Advanced Management of Oracle PeopleSoft with Oracle Enterprise Manager | Moscone West - 3009
    12:45 PM - 1:45 PM | Managing Sun Servers and Oracle Engineered Systems with Oracle Enterprise Manager | Moscone West - 2000
    12:45 PM - 1:45 PM | Strategies for Configuring Oracle Enterprise Manager 12c in a Secure IT Environment | Moscone West - 3018
    12:45 PM - 1:45 PM | Using Oracle Enterprise Manager 12c to Control Operational Costs | Moscone South - 308
    2:15 PM - 3:15 PM | My Oracle Support: The Proactive 24/7 Assistant for Your Oracle Installations | Moscone West - 3018
    2:15 PM - 3:15 PM | Functional and Load Testing Tips and Techniques for Advanced Testers | Moscone South - 307
    2:15 PM - 3:15 PM | Oracle Enterprise Manager Deployment Best Practices | Moscone South - 104
    Stay Connected: Twitter | Facebook | YouTube | Linkedin | Newsletter

    Read the article

  • SQLAuthority News – Resolution for New Year 2011

    - by pinaldave
    Today is the first day of the year, so I want to write something very light.
    Last Year: 2010
    Last year was a blast; I really traveled a lot. My family and I went on vacation, and I enjoyed being a father, rolling on the floor and playing with my daughter. Here is the list of the countries I visited throughout 2010:
    Singapore (twice)
    Malaysia (twice)
    Sri Lanka (thrice)
    Nepal (once)
    United States of America (twice)
    United Arab Emirates (UAE) (once)
    My daughter, who completed 1 year on September 1, 2010, has so far visited three countries: Singapore, Malaysia and Sri Lanka, where I have done lots of community activities. The list containing all my activities can be found at Pinal Dave’s Community Events. I wrote nearly 380 blog posts last year. It would be difficult for me to pick a few; however, I keep a running list of all of my articles over here: All Articles on SQLAuthority.com. I received more than 10,000 email questions during the year and consequently did my best to answer most of them. I strongly believe that anyone who searched the SQLAuthority.com blog would have found the answer quickly. The best part of 2010 for me was working on SQL Server Health Check and SQL Server Performance Tuning.
    This Year: 2011
    This year I came up with two simple goals:
    1. Personal Goal: Reduce weight
    2. Professional Goal: Stay busy for the entire year with SQL Server performance tuning projects. (Currently January 2011 is booked with performance tuning projects, and 40 other days are already booked throughout the year.)
    Future
    The future is something one cannot exactly guess and one cannot see. I just want to wish all of you the very best for this coming New Year. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • DIVIDE vs division operator in #dax

    - by Marco Russo (SQLBI)
    Alberto Ferrari wrote an interesting article about DIVIDE performance in DAX. This new function was introduced in SQL Server Analysis Services 2012 SP1, so it is also available in Excel 2013 (which still doesn’t have other features/fixes introduced by subsequent Cumulative Updates…). The idea is that instead of writing: IF ( Sales[Quantity] <> 0, Sales[Amount] / Sales[Quantity], BLANK () ) you can write: DIVIDE ( Sales[Amount], Sales[Quantity] ). There is a third optional argument in DIVIDE that defines the result when the denominator (second argument) is zero; by default it is BLANK, so I omitted the third argument in my example. Using DIVIDE is very important, especially when you use a measure in MDX (for example in an Excel PivotTable), because it raises the chance that the non-empty evaluation of the result is performed in bulk mode instead of cell-by-cell. However, from a DAX point of view, you might find it is better to use the standard division operator, removing the IF statement. I suggest you read Alberto’s article, because you will find that an expression applying a filter using FILTER is faster than one using CALCULATE, which goes against any rule of thumb you might have read until now! Again, this is not always true and depends on many conditions – trying to simplify, we might say that for a simple calculation the query plan generated by FILTER could be more efficient – but, as usual, it depends, and 90% of the time using FILTER instead of CALCULATE produces slower performance. Do not take anything for granted, and always check the query plan when performance is your first concern!
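    To make the optional third argument concrete, here is a minimal sketch of a measure written with DIVIDE; the Sales[Amount] and Sales[Quantity] columns are the ones from the example above, while the measure name and the choice of 0 as the fallback value are hypothetical.

        Average Price :=
        DIVIDE (
            SUM ( Sales[Amount] ),     // numerator
            SUM ( Sales[Quantity] ),   // denominator
            0                          // returned when the denominator is 0 (BLANK if omitted)
        )

    The equivalent IF version would have to reference SUM ( Sales[Quantity] ) twice, once in the test and once in the division, which is exactly the duplication that DIVIDE removes.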

    Read the article

  • Thank You MySQL Community! MySQL 5.6.9 Release Candidate Available Now!

    - by Rob Young
    The MySQL Community continues its good work in testing and refining MySQL 5.6, and the next iteration of the 5.6 Release Candidate is now available for download. You can get MySQL 5.6.9 here (look under the "Development Releases" tab). This version is the result of feedback we have received since MySQL 5.6.7 was announced at MySQL Connect in late September. As iron sharpens iron, Community feedback sharpens the quality and performance of MySQL, so please download 5.6.9 and let us know how we can improve it as we move toward the production-ready release in early 2013. MySQL 5.6 is designed to meet the agility demands of the next generation of web apps and services and includes across-the-board improvements to the Optimizer, InnoDB performance/scale and online DDL operations, self-healing Replication, Performance Schema instrumentation, Security, and developer-enabling NoSQL functionality. You can learn all the details and follow MySQL Engineering blogs on all of the key features in this MySQL DevZone article. On a related note, plan to join this week's live webinars to learn more about MySQL 5.6 Self-Healing Replication Clusters and Building the Next Generation of Web, Cloud, SaaS, Embedded Application and Services with MySQL 5.6. Hurry! Seating is limited! As always, thanks for your continued support of MySQL!

    Read the article

  • rsync problems and security concerns

    - by MB.
    Hi, I am attempting to use rsync to copy files between two Linux servers, both on 10.04.4. I have set up SSH and a script running under a cron job. This is the message I get back from the cron job: To: mark@ubuntu Subject: Cron ~/rsync.sh Content-Type: text/plain; charset=ANSI_X3.4-1968 X-Cron-Env: X-Cron-Env: X-Cron-Env: X-Cron-Env: Message-Id: <20120708183802.E0D54FC2C0@ubuntu Date: Sun, 8 Jul 2012 14:38:01 -0400 (EDT) rsync: link_stat "/home/mark/#342#200#223rsh=ssh" failed: No such file or directory (2) rsync: opendir "/Library/WebServer/Documents/.cache" failed: Permission denied (13) rsync: recv_generator: mkdir "/Library/Library" failed: Permission denied (13) * Skipping any contents from this failed directory * rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1060) [sender=3.0.7] Q1. Can anyone tell me why I get this message: rsync: link_stat "/home/mark/#342#200#223rsh=ssh" failed: No such file or directory (2)? The script is: #!/bin/bash SOURCEPATH='/Library' DESTPATH='/Library' DESTHOST='192.168.1.15' DESTUSER='mark' LOGFILE='rsync.log' echo $'\n\n' >> $LOGFILE rsync -av –rsh=ssh $SOURCEPATH $DESTUSER@$DESTHOST:$DESTPATH 2>&1 >> $LOGFILE echo “Completed at: `/bin/date`” >> $LOGFILE Q2. I know I have several problems with the permissions; all of the files I am copying usually require me to use sudo to manipulate them. My question then is: is there a way I can run this job without giving my user root access or using root in the login? Thanks for the help.
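    The error message itself points at the likely cause: the bytes #342#200#223 are the octal escapes for the UTF-8 encoding of an en-dash (U+2013), so the "–rsh=ssh" in the script contains an en-dash instead of the two ASCII hyphens of --rsh=ssh, and rsync therefore treats it as a source path. Below is a corrected sketch of the script, keeping the same paths, host, and user as in the question; it also replaces the smart quotes with plain quotes and moves 2>&1 after the log redirection so that stderr ends up in the log file rather than in the cron mail.

        #!/bin/bash
        # Corrected sketch of the cron script from the question above.
        SOURCEPATH='/Library'
        DESTPATH='/Library'
        DESTHOST='192.168.1.15'
        DESTUSER='mark'
        LOGFILE='rsync.log'

        echo $'\n\n' >> "$LOGFILE"
        # --rsh=ssh must be typed with two ASCII hyphens; an en-dash makes rsync read it as a path.
        # ">> $LOGFILE 2>&1" sends stderr to the log as well; "2>&1 >> $LOGFILE" does not.
        rsync -av --rsh=ssh "$SOURCEPATH" "$DESTUSER@$DESTHOST:$DESTPATH" >> "$LOGFILE" 2>&1
        echo "Completed at: $(/bin/date)" >> "$LOGFILE"

    The Permission denied errors are a separate problem caused by file permissions on the source and destination directories; fixing the --rsh option alone will not make them go away.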

    Read the article

  • JRockit R28 "Ropsten" released

    - by tomas.nilsson
    R28 is a major release (as indicated by the careless omission of the "minor" and "revision" numbers; the formal name would be R28.0.0). Our customers expect grand new features and innovation from major releases, and "Ropsten" will not disappoint. One of the biggest challenges for IT systems is after-the-fact diagnostics, that is, once something has gone wrong, trying to figure out why it went wrong. Monitoring a system and keeping track of system health while it is running is considered a hard problem (one that we already help our customers solve to some extent with JRockit Mission Control), but doing it after something has occurred is close to impossible. The most common solution is to set up heavy logging (sacrificing system performance to do the logging) and hope that the problem occurs again. No one really thinks that this is a good solution, but it's the best there is. Until now. Inspired by the "black box" in airplanes, JRockit R28 introduces the Flight Recorder. Flight Recorder can be seen as an extremely detailed log, but one that is always on and that comes without a cost to system performance. With JRockit Flight Recorder the customer will be able to get diagnostics information about what happened _before_ a problem occurred, instead of trying to guess by looking at the fallout. Keywords that are important to the customer are: • Extremely detailed, always-on diagnostics information • No performance overhead • Powerful tooling to visualize the recorded data • Enables diagnostics of bugs and SLA breaches after the fact. For followers of JRockit, other additions are: • New JMX agent that allows JRMC to be used through firewalls more easily • Option to generate HPROF dumps, compatible with tools like Eclipse MAT • Up to 64 GB compressed references (previously 4 GB) • View memory allocation on a thread level (as an MBean and in Mission Control) • Native memory tracking (command line and MBean) • More robust optimizer • Dropping support for Java 1.4.2 and Itanium. If you have any further questions, please email [email protected]. The release can be downloaded from http://www.oracle.com/technology/software/products/jrockit/index.html

    Read the article

  • SQL SERVER – ORDER BY ColumnName vs ORDER BY ColumnNumber

    - by pinaldave
    I strongly favor ORDER BY ColumnName. I read a blog post where the blogger compared the performance of the two SELECT statements and came to the conclusion that there is no harm in using ColumnNumber. Let us first examine the point that there is no performance difference. Run the following two scripts together: USE AdventureWorks GO -- ColumnName (Recommended) SELECT * FROM HumanResources.Department ORDER BY GroupName, Name GO -- ColumnNumber (Strongly Not Recommended) SELECT * FROM HumanResources.Department ORDER BY 3,2 GO If you look at the results and the execution plans, you will see that both queries take the same amount of time. However, performance was not the point of this blog post, and it is not good enough to stop here. We need to understand the advantages and disadvantages of both methods. Case 1: When not using * and the columns are re-ordered: USE AdventureWorks GO -- ColumnName (Recommended) SELECT GroupName, Name, ModifiedDate, DepartmentID FROM HumanResources.Department ORDER BY GroupName, Name GO -- ColumnNumber (Strongly Not Recommended) SELECT GroupName, Name, ModifiedDate, DepartmentID FROM HumanResources.Department ORDER BY 3,2 GO Case 2: When someone changes the schema of the table, affecting column order. I will let you recreate the example yourself (a sketch follows below). If the schema on your development server differs from the production server and you use ColumnNumber, you will get different results on the production server. Summary: When you develop the query it may not be an issue, but as time passes and new columns are added to the SELECT statement, or the original table is re-ordered, a query that uses ColumnNumber may start giving you unexpected results and an incorrect ORDER BY. Note that the choice between ORDER BY ColumnName and ORDER BY ColumnNumber should not be made based on performance but on usability and scalability. It is always recommended to use a proper ORDER BY clause with ColumnName to avoid any confusion. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
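    A minimal sketch of Case 2, using the same AdventureWorks table as above: the only difference between the first two batches is the order of the columns in the SELECT list, yet ORDER BY 3,2 ends up sorting on different columns, while ORDER BY ColumnName keeps returning the same order.

        USE AdventureWorks
        GO
        -- Original SELECT list: here ORDER BY 3,2 means ORDER BY ModifiedDate, Name
        SELECT GroupName, Name, ModifiedDate, DepartmentID
        FROM HumanResources.Department
        ORDER BY 3,2
        GO
        -- Re-ordered SELECT list: now ORDER BY 3,2 means ORDER BY GroupName, Name
        SELECT DepartmentID, Name, GroupName, ModifiedDate
        FROM HumanResources.Department
        ORDER BY 3,2
        GO
        -- ORDER BY ColumnName is unaffected by how the SELECT list is arranged
        SELECT DepartmentID, Name, GroupName, ModifiedDate
        FROM HumanResources.Department
        ORDER BY GroupName, Name
        GO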

    Read the article

  • SQL SERVER – Download PSSDIAG Data Collection Utility

    - by pinaldave
    Early in my career as a database consultant, when I was dealing with SQL Server 2000, I often needed to collect various data related to SQL Server. My favorite tool for collecting that data was PSSDIAG. It is a general-purpose diagnostic collection utility that Microsoft Product Support Services uses to collect various logs and data files. It collects Performance Monitor logs, SQL Profiler traces, SQL Server blocking script output, Windows Event Logs, and SQLDIAG output. The collected data can be used by the SQL Nexus tool, which helps you troubleshoot SQL Server performance problems. Because PSSDIAG is a wrapper around other data collection APIs and utilities, the performance impact of running PSSDIAG is generally equal to the impact of the traces that PSSDIAG has been configured to capture. If you are still using SQL Server 2000, you should seriously consider upgrading to SQL Server 2012. Here is a PSSDIAG Data Collection Utility updated in August 2012. My friend and SQL Server expert Amit Benerjee has written an excellent article on this subject, and I encourage all of you to read it. Note: For SQL Server 2012 there is SQLDiag. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • Are programmers a bunch of heartless robots who are lacking of empathy? [closed]

    - by Graviton
    OK, the provocative title got your attention. My experience as a programmer, and from dealing with my fellow programmers, is that a programmer is usually someone so consumed by his programming work, so absorbed in his algorithmic constructions, that he has little passion or time left for anything else, including empathy for other people and love and care for the people he loves or should love (such as his spouse, parents, kids, and colleagues). The better a person is in terms of his programming powers, the more deficient he is in terms of love and care, because both honing programming skills and loving the people around you take time, and one has only so much time to allocate among so many different things. Also, programming (especially an INTERESTING programming job, like writing an AI to predict the future search trend) is a highly consuming job; it doesn't just consume you from 9 to 5, it also consumes you after 5 and practically every second of your waking hours, because a good programmer can't just magically switch off his thinking hat after the office lights go off (and if you can, then I don't really think you are a passionate programmer, since the prerequisite of a good programmer is passion). So, a good programmer is necessarily someone who can't love as much as others do, because the very nature of the programming job prevents him from loving others as much as he wants to. Do you concur with my observation/reasoning?

    Read the article

< Previous Page | 243 244 245 246 247 248 249 250 251 252 253 254  | Next Page >