Search Results

Search found 1323 results on 53 pages for 'actionable bi'.

Page 20/53

  • Using Image Source with big images in WPF

    - by xyzzer
    I am working on an application that allows users to manipulate multiple images by using ItemsControl. I started running some tests and found that the app has problems displaying some big images - i.e. it did not work with the high resolution (21600x10800), 20MB images from http://earthobservatory.nasa.gov/Features/BlueMarble/BlueMarble_monthlies.php, though it displays the 6200x6200, 60MB Hubble telescope image from http://zebu.uoregon.edu/hudf/hudf.jpg just fine.

    The original solution just specified an Image control with a Source property pointing at a file on a disk (through a binding). With the Blue Marble file, the image would just not show up. Now this could be just a bug hidden somewhere deep in the funky MVVM + XAML implementation - the visual tree displayed by Snoop goes like: Window/Border/AdornerDecorator/ContentPresenter/Grid/Canvas/UserControl/Border/ContentPresenter/Grid/Grid/Grid/Grid/Border/Grid/ContentPresenter/UserControl/UserControl/Border/ContentPresenter/Grid/Grid/Grid/Grid/Viewbox/ContainerVisual/UserControl/Border/ContentPresenter/Grid/Grid/ItemsControl/Border/ItemsPresenter/Canvas/ContentPresenter/Grid/Grid/ContentPresenter/Image... Now debug this! WPF can be crazy like that... Anyway, it turned out that if I create a simple WPF application, the images load just fine. I tried finding out the root cause, but I don't want to spend weeks on it. I figured the right thing to do might be to use a converter to scale the images down. This is what I have done:

        ImagePath = @"F:\Astronomical\world.200402.3x21600x10800.jpg";
        TargetWidth = 2800;
        TargetHeight = 1866;

    and

        <Image>
            <Image.Source>
                <MultiBinding Converter="{StaticResource imageResizingConverter}">
                    <MultiBinding.Bindings>
                        <Binding Path="ImagePath"/>
                        <Binding RelativeSource="{RelativeSource Self}" />
                        <Binding Path="TargetWidth"/>
                        <Binding Path="TargetHeight"/>
                    </MultiBinding.Bindings>
                </MultiBinding>
            </Image.Source>
        </Image>

    and

        public class ImageResizingConverter : MarkupExtension, IMultiValueConverter
        {
            public Image TargetImage { get; set; }
            public string SourcePath { get; set; }
            public int DecodeWidth { get; set; }
            public int DecodeHeight { get; set; }

            public object Convert(object[] values, Type targetType, object parameter, CultureInfo culture)
            {
                this.SourcePath = values[0].ToString();
                this.TargetImage = (Image)values[1];
                this.DecodeWidth = (int)values[2];
                this.DecodeHeight = (int)values[3];
                return DecodeImage();
            }

            private BitmapImage DecodeImage()
            {
                BitmapImage bi = new BitmapImage();
                bi.BeginInit();
                bi.DecodePixelWidth = DecodeWidth;
                bi.DecodePixelHeight = DecodeHeight;
                bi.UriSource = new Uri(SourcePath);
                bi.EndInit();
                return bi;
            }

            public object[] ConvertBack(object value, Type[] targetTypes, object parameter, CultureInfo culture)
            {
                throw new Exception("The method or operation is not implemented.");
            }

            public override object ProvideValue(IServiceProvider serviceProvider)
            {
                return this;
            }
        }

    Now this works fine, except for one "little" problem. When you just specify a file path in Image.Source, the application actually uses less memory and works faster than if you use BitmapImage.DecodePixelWidth. Plus, with Image.Source, if you have multiple Image controls that point to the same image, they only use as much memory as if only one image was loaded. With the BitmapImage.DecodePixelWidth solution, each additional Image control uses more memory, and each of them uses more than when just specifying Image.Source.

    Perhaps WPF somehow caches these images in compressed form, while if you specify the decoded dimensions it feels like you get an uncompressed image in memory; plus it takes six times as long (perhaps without it the scaling is done on the GPU?); plus it feels like the original high-resolution image also gets loaded and takes up space. If I just scale the image down, save it to a temporary file and then use Image.Source to point at the file, it will probably work, but it will be pretty slow and it will require handling cleanup of the temporary file. If I could detect an image that does not get loaded properly, maybe I could scale it down only when I need to, but Image.ImageFailed never gets triggered. Maybe it has something to do with the video memory and this app just using more of it with the deep visual tree, opacity masks, etc.

    Actual question: how can I load big images as quickly as the Image.Source option does, without using more memory for additional copies, and without additional memory for the scaled-down image if I only need them at a certain resolution lower than the original? Also, I don't want to keep them in memory if no Image control is using them anymore.
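    One direction that may be worth trying (a sketch under assumptions, not a verified answer to the question above): decode at the reduced size with BitmapCacheOption.OnLoad, call Freeze() so the bitmap can be shared by every Image control that binds to the same path, and hand the shared instance out from a small cache. The cache class, its key format and the use of WeakReference are hypothetical illustration, not part of the original converter.

        using System;
        using System.Collections.Generic;
        using System.Windows.Media.Imaging;

        // Hypothetical helper: decode a large image once at a reduced width,
        // freeze it so WPF can share the decoded bits across Image controls,
        // and cache it by path so repeated bindings reuse the same instance.
        // Intended for use from the UI thread (the Dictionary is not thread-safe).
        public static class SharedImageCache
        {
            private static readonly Dictionary<string, WeakReference> cache =
                new Dictionary<string, WeakReference>();

            public static BitmapImage GetImage(string path, int decodeWidth)
            {
                string key = path + "|" + decodeWidth;

                WeakReference wr;
                if (cache.TryGetValue(key, out wr))
                {
                    BitmapImage cached = wr.Target as BitmapImage;
                    if (cached != null)
                        return cached;                     // reuse the already-decoded bitmap
                }

                BitmapImage bi = new BitmapImage();
                bi.BeginInit();
                bi.CacheOption = BitmapCacheOption.OnLoad; // read and decode now, then release the file
                bi.DecodePixelWidth = decodeWidth;         // only DecodePixelWidth, so aspect ratio is kept
                bi.UriSource = new Uri(path);
                bi.EndInit();
                bi.Freeze();                               // frozen bitmaps can be shared without extra copies

                cache[key] = new WeakReference(bi);        // weak reference lets the bitmap be collected when unused
                return bi;
            }
        }

    The converter above could call SharedImageCache.GetImage(SourcePath, DecodeWidth) instead of building a new BitmapImage each time; whether this brings memory back down to the plain Image.Source level is exactly the open question, so treat it as an experiment rather than a fix.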

    Read the article

  • added shell script to sudoers still getting permission denied

    - by Bill S
    I don't understand this? Other uses of sudo work fine.

        [oracle@o plugins]$ su
        Password:
        [root@ plugins]# su nrpe
        bash-3.2$ /home/oracle/obiee/instances/instance1/bifoundation/OracleBIApplication/coreapplication/setup/bi-init.sh
        bash: /home/oracle/obiee/instances/instance1/bifoundation/OracleBIApplication/coreapplication/setup/bi-init.sh: Permission denied
        bash-3.2$ sudo -l
        Matching Defaults entries for nrpe on this host:
            env_reset, env_keep="COLORS DISPLAY HOSTNAME HISTSIZE INPUTRC KDEDIR LS_COLORS MAIL PS1 PS2 QTDIR USERNAME LANG LC_ADDRESS LC_CTYPE LC_COLLATE LC_IDENTIFICATION LC_MEASUREMENT LC_MESSAGES LC_MONETARY LC_NAME LC_NUMERIC LC_PAPER LC_TELEPHONE LC_TIME LC_ALL LANGUAGE LINGUAS _XKB_CHARSET XAUTHORITY"
        Runas and Command-specific defaults for nrpe:
        User nrpe may run the following commands on this host:
            (ALL) NOPASSWD: /home/oracle/obiee/instances/instance1/bifoundation/OracleBIApplication/coreapplication/setup/bi-init.sh
        bash-3.2$
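    A hedged observation rather than a confirmed answer: the sudoers entry only takes effect when the script is invoked through sudo; running it directly as nrpe (as in the transcript above) still relies on the ordinary file and directory permissions. The commands below are an illustration of how to check both, assuming the same path.

        # Invoke through sudo so the NOPASSWD rule actually applies
        sudo /home/oracle/obiee/instances/instance1/bifoundation/OracleBIApplication/coreapplication/setup/bi-init.sh

        # If the script must also run directly as nrpe, the file and every parent
        # directory need to be readable/traversable by that user - inspect them:
        ls -l /home/oracle/obiee/instances/instance1/bifoundation/OracleBIApplication/coreapplication/setup/bi-init.sh
        namei -l /home/oracle/obiee/instances/instance1/bifoundation/OracleBIApplication/coreapplication/setup/bi-init.sh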

    Read the article

  • How do I catch this WPF Bitmap loading exception?

    - by mmr
    I'm developing an application that loads bitmaps off of the web using .NET 3.5 SP1 and C#. The loading code looks like:

        try
        {
            CurrentImage = pics[unChosenPics[index]];
            bi = new BitmapImage(CurrentImage.URI);
            // BitmapImage.UriSource must be in a BeginInit/EndInit block.
            bi.DownloadCompleted += new EventHandler(bi_DownloadCompleted);
            AssessmentImage.Source = bi;
        }
        catch
        {
            System.Console.WriteLine("Something broke during the read!");
        }

    and the code to load on bi_DownloadCompleted is:

        void bi_DownloadCompleted(object sender, EventArgs e)
        {
            try
            {
                double dpi = 96;
                int width = bi.PixelWidth;
                int height = bi.PixelHeight;
                int stride = width * 4; // 4 bytes per pixel
                byte[] pixelData = new byte[stride * height];
                bi.CopyPixels(pixelData, stride, 0);
                BitmapSource bmpSource = BitmapSource.Create(width, height, dpi, dpi, PixelFormats.Bgra32, null, pixelData, stride);
                AssessmentImage.Source = bmpSource;
                Loading.Visibility = Visibility.Hidden;
                AssessmentImage.Visibility = Visibility.Visible;
            }
            catch
            {
                System.Console.WriteLine("Exception when viewing bitmap.");
            }
        }

    Every so often, an image comes along that breaks the reader. I guess that's to be expected. However, rather than being caught by either of those try/catch blocks, the exception is apparently getting thrown outside of where I can handle it. I could handle it using global WPF exceptions, like this SO question. However, that will seriously mess up the control flow of my program, and I'd like to avoid that if at all possible. I have to do the double source assignment because it appears that many images are lacking width/height parameters in the places where the Microsoft bitmap loader expects them to be. So, the first assignment appears to force the download, and the second assignment gets the dpi/image dimensions to come out properly. What can I do to catch and handle this exception?
    Stack trace:
        at MS.Internal.HRESULT.Check(Int32 hr)
        at System.Windows.Media.Imaging.BitmapFrameDecode.get_ColorContexts()
        at System.Windows.Media.Imaging.BitmapImage.FinalizeCreation()
        at System.Windows.Media.Imaging.BitmapImage.OnDownloadCompleted(Object sender, EventArgs e)
        at System.Windows.Media.UniqueEventHelper.InvokeEvents(Object sender, EventArgs args)
        at System.Windows.Media.Imaging.LateBoundBitmapDecoder.DownloadCallback(Object arg)
        at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Boolean isSingleParameter)
        at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Boolean isSingleParameter, Delegate catchHandler)
        at System.Windows.Threading.DispatcherOperation.InvokeImpl()
        at System.Threading.ExecutionContext.runTryCode(Object userData)
        at System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode code, CleanupCode backoutCode, Object userData)
        at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
        at System.Windows.Threading.DispatcherOperation.Invoke()
        at System.Windows.Threading.Dispatcher.ProcessQueue()
        at System.Windows.Threading.Dispatcher.WndProcHook(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
        at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
        at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)
        at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Boolean isSingleParameter)
        at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Boolean isSingleParameter, Delegate catchHandler)
        at System.Windows.Threading.Dispatcher.InvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Boolean isSingleParameter)
        at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)
        at MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)
        at System.Windows.Threading.Dispatcher.TranslateAndDispatchMessage(MSG& msg)
        at System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)
        at System.Windows.Application.RunInternal(Window window)
        at LensComparison.App.Main() in C:\Users\Mark64\Documents\Visual Studio 2008\Projects\LensComparison\LensComparison\obj\Release\App.g.cs:line 48
        at System.AppDomain._nExecuteAssembly(Assembly assembly, String[] args)
        at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
        at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
        at System.Threading.ThreadHelper.ThreadStart()
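    The stack trace shows the failure happening inside the BitmapImage's own download-completed handling on the dispatcher thread, which is why the try/catch blocks above never see it. One possible workaround, sketched here purely as an illustration (the helper class and the WebClient usage are assumptions, not the original code): download the bytes yourself and decode them synchronously with BitmapCacheOption.OnLoad, so any decoder failure surfaces inside your own try/catch.

        using System;
        using System.IO;
        using System.Net;
        using System.Windows.Media.Imaging;

        // Hypothetical helper: fetch the image bytes, then force a full
        // synchronous decode so decoder exceptions are thrown right here
        // instead of later on the dispatcher queue.
        public static class SafeBitmapLoader
        {
            public static BitmapSource TryLoad(Uri uri)
            {
                try
                {
                    byte[] data;
                    using (WebClient client = new WebClient())
                    {
                        data = client.DownloadData(uri); // blocking; consider calling off the UI thread
                    }

                    BitmapImage bi = new BitmapImage();
                    using (MemoryStream ms = new MemoryStream(data))
                    {
                        bi.BeginInit();
                        bi.CacheOption = BitmapCacheOption.OnLoad; // decode fully now, inside this try block
                        bi.StreamSource = ms;
                        bi.EndInit();
                    }
                    bi.Freeze();
                    return bi;
                }
                catch (Exception ex)
                {
                    Console.WriteLine("Bitmap load/decode failed: " + ex.Message);
                    return null; // the caller decides how to handle a bad image
                }
            }
        }

    Subscribing to the BitmapImage DownloadFailed and DecodeFailed events is another avenue, though whether they fire for this particular ColorContexts failure is something I can't confirm.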

    Read the article

  • Oracle's Primavera P6 Analytics Now Available!

    - by mark.kromer
    Oracle's Primavera product team announced this week the general availability of our first Oracle BI (OBI) based analytical product, with pre-built business intelligence dashboards, reports and KPIs built in. P6 Analytics uses OBI's drill-down capabilities, summarizations, hierarchies and other BI features to give your business users the knowledge to make the best decisions on portfolios, projects, schedules & resources with deep insights. Without needing to launch into the P6 tool, your executives, PMO, project sponsors, etc. can view up-to-date project performance information as well as historic trends of project performance. Using web-based portal technology, P6 Analytics makes it easy to manage by exception and then drill down to quickly identify the root cause of problem projects. At the same time, a brand new version of the P6 Reporting Database R2 was just announced and is also now available. This updated reporting database provides you with 4 star schemas with spread data and includes P6 activity, project and resource codes. You can use the data warehouse and ETL functions of the P6 Reporting Database R2 with your own reporting tools or build dashboards that utilize the hierarchies & drill down to the day level on scheduled activities using Business Objects, Cognos, Microsoft, etc. Both of these products can be downloaded from E-Delivery under the Primavera applications section in the P6 EPPM v7.0 media pack. I put some examples below of the resource utilization, earned value, landing page and portfolio analysis dashboards that come out of the box with P6 Analytics to give you these deep insights into your projects & portfolios on day 1 of using the tool. Please send an email to Karl or me if you have any questions or would like more information. Oracle Technology Network and the Oracle.com marketing sites are currently being refreshed with further details of these exciting new releases of the Primavera BI and data warehouse products. Lastly, scroll below for some screenshots of the new P6 Analytics R1 product using OBIEE! Thanks, Mark Kromer

    Read the article

  • Upgrading from 2005 to R2

    - by DavidWimbush
    We're about to take the plunge and upgrade our servers from SQL 2005 to SQL 2008 R2. Real-world accounts of people upgrading to R2 are a bit hard to find, so I thought it might be useful to blog what happens. (I don't count marketing 'case studies' that just say stuff like "The process was effortless and the upgrade will pay for itself by the end of the week.") We're using the database engine, Analysis Services and Reporting Services, so upgrading by a major version number was looking a bit daunting. I wasn't expecting much trouble on the engine side of things but, as most of the action in 2008 and R2 appears to have been on the Reporting and BI front, I expected to have quite a bit of work to do. But our testing so far has been one nice surprise after another: the 2005 backups restore cleanly onto R2; R2's BI Studio upgraded the Reporting and Analysis Services solutions without any issues; the cubes all deployed and processed just fine; and R2 BI Studio interacts fine with TFS 2008 version control. I'll blog some more as things develop.

    Read the article

  • Oracle Big Data Learning Library - Click on LEARN BY PRODUCT to Open Page

    - by chberger
    Oracle Big Data Learning Library... Learn about Oracle Big Data, Data Science, Learning Analytics, Oracle NoSQL Database, and more! Highlights include: Oracle Big Data Essentials and Using Oracle NoSQL Database (Oracle University classes); Oracle and Big Data on OTN (the latest resources on OTN); Oracle Big Data Appliance material (Oracle Big Data and Data Science Basics, Meeting the Challenge of Big Data, Oracle Big Data Tutorial Video Series, Oracle MoviePlex - a Big Data End-to-End Series of Demonstrations, Oracle Big Data Overview, Oracle Big Data Essentials, Data Mining); Oracle NoSQL Database resources (Oracle NoSQL Database Tutorial Videos, Oracle NoSQL Database Tutorial Series, Oracle NoSQL Database Release 2 New Features, Using Oracle NoSQL Database); Exalytics (Enterprise Manager 12c R3: Manage Exalytics; Setting Up and Running Summary Advisor); Oracle R Enterprise (Oracle R Enterprise Tutorial Series); Oracle Big Data Connectors (Integrate All Your Data with Oracle Big Data Connectors, Using Oracle Direct Connector for HDFS to Read the Data from HDFS, Using Oracle R Connector for Hadoop to Analyze Data); and Oracle Business Intelligence Enterprise Edition classes (Oracle BI 11g R1: Create Analyses and Dashboards - 4 day class; Oracle BI Publisher 11g R1: Fundamentals - 3 day class; Oracle BI 11g R1: Build Repositories - 5 day class).

    Read the article

  • OBIEE 11.1.1.5 or above: Admin Server as a single point of failure (SPOF) is REALLY not impacting OBIEE work

    - by Ahmed Awan
    Applies To: 11.1.1.5, 11.1.1.6. The Admin Server being a single point of failure (SPOF) really does not impact OBIEE work. Setting the virtualize tag to true (in EM) to manage multiple LDAP providers enables failover and HA for authentication and authorization inside OBIEE. The following test cases were used to check the impact on OBIEE if the Admin Server is not available: a. Test 1: Admin Server crashes - impact on OBIEE. Scenario: all OBIEE components are up and running. b. Test 2: Admin Server had not been started - impact on OBIEE. Scenario: the OBIEE server bi_server1 is started, but the Admin Server isn't. For more details on each of the above tests, click here to download the Test Results. Links to the official documentation below: http://docs.oracle.com/cd/E23943_01/bi.1111/e10543/privileges.htm#BIESC6077 http://docs.oracle.com/cd/E23943_01/bi.1111/e10543/privileges.htm#BABHFFEI http://docs.oracle.com/cd/E23943_01/bi.1111/e10543/authentication.htm#BIESC6075

    Read the article

  • Oracle Insurance Gets Innovative with Insurance Business Intelligence

    - by nicole.bruns(at)oracle.com
    Oracle Insurance announced yesterday the availability of Oracle Insurance Insight 7.0, an insurance-specific data warehouse and business intelligence (BI) system that transforms the traditional approach to BI by involving business users in its creation and maintenance. "Rapid access to business intelligence is essential to compete and thrive in today's insurance industry," said Srini Venkatasantham, vice president, Product Strategy, Oracle Insurance. "The adaptive data modeling approach of Oracle Insurance Insight 7.0, combined with the insurance-specific data model, offers global insurance companies a faster, easier way to get the intelligence they need to make better-informed business decisions." New features in Oracle Insurance Insight 7.0 include: "Adaptive Data Modeling" via the new warehouse palette: gives business users the power to configure lines of business via an easy-to-use warehouse palette tool. Oracle Insurance Insight then automatically creates data warehouse elements - such as line-specific database structures and extract-transform-load (ETL) processes - speeding up time-to-value for BI initiatives. Out-of-the-box insurance models or create-from-scratch option: includes pre-built content and interfaces for six Property and Casualty (P&C) lines. Additionally, insurers can use the warehouse palette to deploy any and all P&C or General Insurance lines of business from scratch, helping insurers support operations in any country. Leverages Oracle technologies: in addition to Oracle Business Intelligence Enterprise Edition, the solution includes Oracle Database 11g as well as Oracle Data Integrator Enterprise Edition 11g, which delivers an Extract, Load and Transform (E-L-T) architecture and eliminates the need for a separate transformation server. Additionally, the expanded Oracle technology infrastructure enables support for Oracle Exadata. Martina Conlon, a Principal with Novarica's Insurance practice and author of Business Intelligence in Insurance: Current State, Challenges, and Expectations, says, "The need for continued investment by insurers in business intelligence capabilities is widely understood, and the industry is acting. Arming the business intelligence implementation with predefined insurance-specific content, and flexible and configurable technology, will get these projects up and running faster." Learn more: to see a demo of the Oracle Insurance Insight system, click here; to read the press announcement, click here.

    Read the article

  • Installing SharePoint 2010 and PowerPivot for SharePoint on Windows 7

    - by smisner
    Many people like me want (or need) to do their business intelligence development work on a laptop. As someone who frequently speaks at various events or teaches classes on all subjects related to the Microsoft business intelligence stack, I need a way to run multiple server products on my laptop with reasonable performance. Once upon a time, that requirement meant only that I had to load the current version of SQL Server and the client tools of choice. In today's post, I'll review my latest experience with trying to make the newly released Microsoft BI products work with a Windows 7 operating system. The entrance of Microsoft Office SharePoint Server 2007 into the BI stack complicated matters and I started using Virtual Server to establish a "suitable" environment. As part of the team that delivered a lot of education as part of the Yukon pre-launch activities (that would be SQL Server 2005 for the uninitiated), I was working with four - yes, four - virtual servers. That was a pretty brutal workload for a 2GB laptop, which worked if I was very, very careful. It could also be a finicky and unreliable configuration as I learned to my dismay at one TechEd session several years ago when I had to reboot a very carefully cached set of servers just minutes before my session started. Although it worked, it came back to life very, very slowly much to the displeasure of the audience. They couldn't possibly have been less pleased than me. At that moment, I resolved to get the beefiest environment I could afford and consolidate to a single virtual server. Enter the 4GB 64-bit laptop to preserve my sanity and my livelihood. Likewise, for SQL Server 2008, I managed to keep everything within a single virtual server and I could function reasonably well with this approach. Now we have SQL Server 2008 R2 plus Office SharePoint Server 2010. That means a 64-bit operating system. Period. That means no more Virtual Server. That means I must use Hyper-V or another alternative. I've heard alternatives exist, but my few dabbles in this area did not yield positive results. It might have been just me having issues rather than any failure of those technologies to adequately support the requirements. My first run at working with the new BI stack configuration was to set up a 64-bit 4GB laptop with a dual-boot to run Windows Server 2008 R2 with Hyper-V. However, I was generally not happy with running Windows Server 2008 R2 on my laptop. For one, I couldn't put it into sleep mode, which is helpful if I want to prepare for a presentation beforehand and then walk to the podium without the need to hold my laptop in its open state along the way (my strategy at the TechEd session long, long ago). Secondly, it was finicky with projectors. I had issues from time to time and while I always eventually got it to work, I didn't appreciate those nerve-wracking moments wondering whether this would be the time that it wouldn't work. Somewhere along the way, I learned that it was possible to load SharePoint 2010 on Windows 7, which piqued my interest. I had just acquired a new laptop running Windows 7 64-bit, and thought surely running the BI stack natively on my laptop must be better than running Hyper-V. (I have not tried booting to Hyper-V VHD yet, but that's on my list of things to try so the jury of one is still out on this approach.) Recently, I had to build up a server with the RTM versions of SQL Server 2008 R2 and SharePoint Server 2010 and decided to follow suit on my Windows 7 Ultimate 64-bit laptop.
    The process is slightly different, but I'm happy to report that it IS possible, although I had some fits and starts along the way. DISCLAIMER: These products are NOT intended to be run in production mode on the Windows 7 operating system. The configuration described in this post is strictly for development or learning purposes and not supported by Microsoft. If you have trouble, you will NOT get help from them. I might be able to help, but I provide no guarantees of my ability or availability to help. I won't provide the step-by-step instructions in this post as there are other resources that provide these details, but I will provide an overview of my approach, point you to the relevant resources, describe some of the problems I encountered, and explain how I addressed those problems to achieve my desired goal. Because my goal was not simply to set up SharePoint Server 2010 on my laptop, but specifically PowerPivot for SharePoint, I started out by referring to the installation instructions at the PowerPivot-Info site, but mainly to confirm that I was performing steps in the proper sequence. I didn't perform the steps in Part 1 because those steps are applicable only to a server operating system which I am not running on my laptop. Then, the instructions in Part 2 won't work exactly as written for the same reason. Instead, I followed the instructions on MSDN, Setting Up the Development Environment for SharePoint 2010 on Windows Vista, Windows 7, and Windows Server 2008. In general, I found the following differences in installation steps from the steps at PowerPivot-Info: You must copy the SharePoint installation media to the local drive so that you can edit the config.xml to allow installation on a Windows client. You also have to manually install the prerequisites. The instructions provide links to each item that you must manually install and provide a command-line instruction to execute which enables required Windows features. I will digress for a moment to save you some grief in the sequence of steps to perform. I discovered later that a missing step in the MSDN instructions is to install the November CTP Reporting Services add-in for SharePoint. When I went to test my SharePoint site (I believe I tested after I had a successful PowerPivot installation), I ran into the following error: Could not load file or assembly 'RSSharePointSoapProxy, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. I was rather surprised that Reporting Services was required. Then I found an article by Alan le Marquand, Working Together: SQL Server 2008 R2 Reporting Services Integration in SharePoint 2010, that instructed readers to install the November add-in. My first reaction was, "Really?!?" But I confirmed it in another TechNet article on hardware and software requirements for SharePoint Server 2010. It doesn't refer explicitly to the November CTP but following the link took me there. (Interestingly, I retested today and there's no longer any reference to the November CTP. Here's the link to download the latest and greatest Reporting Services Add-in for SharePoint Technologies 2010.) You don't need to download the add-in anymore if you're doing a regular server-based installation of SharePoint because it installs as part of the prerequisites automatically.
    When it was time to start the installation of SharePoint, I deviated from the MSDN instructions and from the PowerPivot-Info instructions: On the Choose the installation you want page of the installation wizard, I chose Server Farm. On the Server Type page, I chose Complete. At the end of the installation, I did not run the configuration wizard. Returning to the PowerPivot-Info instructions, I tried to follow the instructions in Part 3 which describe installing SQL Server 2008 R2 with the PowerPivot option. These instructions tell you to choose the New Server option on the Setup Role page where you add PowerPivot for SharePoint. However, I ran into problems with this approach and got installation errors at the end. It wasn't until much later as I was investigating an error that I encountered Dave Wickert's post that installing PowerPivot for SharePoint on Windows 7 is unsupported. Uh oh. But he did want to hear about it if anyone succeeded, so I decided to take the plunge. Perseverance paid off, and I can happily inform Dave that it does work so far. I haven't tested absolutely everything with PowerPivot for SharePoint but have successfully deployed a workbook and viewed the PowerPivot Management Dashboard. I have not yet tested the data refresh feature, but I have installed. Continue reading to see how I accomplished my objective. I uninstalled SQL Server 2008 R2 and started again. I had different problems which I don't recollect now. However, I uninstalled again and approached installation from a different angle and my next attempt succeeded. The downside of this approach is that you must do all of the things yourself that are done automatically when you install PowerPivot as a new server. Here are the steps that I followed: Install SQL Server 2008 R2 to get a database engine instance installed. Run the SharePoint configuration wizard to set up the SharePoint databases. In Central Administration, create a Web application using classic mode authentication as per a TechNet article on PowerPivot Authentication and Authorization. Then I followed the steps I found at How to: Install PowerPivot for SharePoint on an Existing SharePoint Server. Especially important to note - you must launch setup by using Run as administrator. I did not have to manually deploy the PowerPivot solution as the instructions specify, but it's good to know about this step because it tells you where to look in Central Administration to confirm a successful deployment. I did spot some incorrect steps in the instructions (at the time of this writing) in How To: Configure Stored Credentials for PowerPivot Data Refresh. Specifically, in the section entitled Step 1: Create a target application and set the credentials, both steps 10 and 12 are incorrect. They tell you to provide an actual Windows user name and password on the page where you are simply defining the prompts for your application in the Secure Store Service. To add the Windows user name and password that you want to associate with the application - after you have successfully created the target application - you select the target application and then click Set credentials in the ribbon. Lastly, I followed the instructions at How to: Install Office Data Connectivity Components on a PowerPivot server. However, I have yet to test this in my current environment. I did have several stops and starts throughout this process and edited those out to spare you from reading non-essential information.
    I believe the explanation I have provided here accurately reflects the steps I followed to produce a working configuration. If you follow these steps and get a different result, please let me know so that together we can work through the issue and correct these instructions. I'm sure there are many other folks in the Microsoft BI community that will appreciate the ability to set up the BI stack in a Windows 7 environment for development or learning purposes.

    Read the article

  • SQLAuthority News – Download Whitepaper – Choosing a Tabular or Multidimensional Modeling Experience in SQL Server 2012 Analysis Services

    - by pinaldave
    Data modeling is the most important task for any BI professional. As a matter of fact, the biggest challenge is organizing disparate data into an analytic model that effectively and efficiently supports reporting and analysis. SQL Server 2012 introduces the BI Semantic Model (BISM), a single model that can support a broad range of reporting and analysis while blending two Analysis Services modeling experiences behind the scenes. Multidimensional modeling – enables BI professionals to create sophisticated multidimensional cubes using traditional online analytical processing (OLAP). Tabular modeling – provides self-service data modeling capabilities to business and data analysts. As data modeling evolves and business needs grow, new technologies and tools are emerging to help end users make the necessary adjustments to their reporting and analysis needs. This white paper will provide practical guidance to help you decide which SQL Server 2012 Analysis Services modeling experience – tabular or multidimensional – to choose. Do let me know your opinion in a comment. In simple words – I would like to know when you would use Tabular modeling and when Multidimensional modeling? Download Choosing a Tabular or Multidimensional Modeling Experience in SQL Server 2012 Analysis Services Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Business Intelligence, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, T SQL, Technology

    Read the article

  • PASS Business Intelligence Virtual Chapter Upcoming Sessions (November 2013)

    - by Sergio Govoni
    Let me point out the upcoming live events, dedicated to Business Intelligence with SQL Server, that the PASS Business Intelligence Virtual Chapter has scheduled for November 2013.

    The "Accidental Business Intelligence Project Manager"
    Date: Thursday 7th November - 8:00 PM GMT / 3:00 PM EST / Noon PST
    Speaker: Jen Stirrup
    URL: https://attendee.gotowebinar.com/register/5018337449405969666
    You've watched The Apprentice with Donald Trump and Lord Alan Sugar. You know that the Project Manager is usually the one who gets fired. You've heard that Business Intelligence projects are prone to failure. You know that a quick Bing search for "why do Business Intelligence projects fail?" produces a search result of 25 million hits! Despite all this… you're now a Business Intelligence Project Manager – now what do you do? In this session, Jen will provide a "sparks from the anvil" series of steps and working practices in Business Intelligence Project Management. What about waterfall vs agile? What is a Gantt chart anyway? Is Microsoft Project your friend or a problematic aspect of being a BI PM? Jen will give you some ideas and insights that will help you set your BI project right: assess priorities, avoid conflict, empower the BI team and generally deliver the Business Intelligence project successfully!

    Dimensional Modelling Design Patterns: Beyond Basics
    Date: Tuesday 12th November - Noon AEDT / 1:00 AM GMT / Monday 11th November 5:00 PM PST
    Speaker: Jason Horner, Josh Fennessy and friends
    URL: https://attendee.gotowebinar.com/register/852881628115426561
    This session will provide a deeper dive into the art of dimensional modeling. We will look at the different types of fact tables and dimension tables, and how and when to use them. We will also cover some approaches to creating rich hierarchies that make reporting a snap. This session promises to be very interactive and engaging; bring your toughest Dimensional Modeling quandaries.

    Data Vault Data Warehouse Architecture
    Date: Tuesday 19th November - 4:00 PM PST / 7 PM EST / Wednesday 20th November 11:00 PM AEDT
    Speaker: Jeff Renz and Leslie Weed
    URL: https://attendee.gotowebinar.com/register/1571569707028142849
    Data vault is a compelling architecture for an enterprise data warehouse using SQL Server 2012. A well-designed data vault data warehouse facilitates fast, efficient and maintainable data integration across business systems. In this session Leslie and I will review the basics of enterprise data warehouse design, introduce you to the data vault architecture and discuss how you can leverage new features of SQL Server 2012 to help make your data warehouse solution provide maximum value to your users.

    Read the article

  • SSIS Denali as part of “Enterprise Information Management”

    - by jorg
    While watching the SQL PASS session "What's Coming Next in SSIS?" by Steve Swartz, the Group Program Manager for the SSIS team, an interesting question came up: why is SSIS thought of as BI, when we use it so frequently for other sorts of data problems? Steve's answer was that he breaks the world of data work into three parts: processing of inputs, BI, and Enterprise Information Management - all the work you have to do when you have a lot of data to make it useful and clean and get it to the right place. This covers master data management, data quality work, data integration, and lineage analysis to keep track of where the data came from; all of these are part of Enterprise Information Management. Next, Steve said Microsoft is developing SSIS as part of a large push in all of these areas in the next release of SQL Server. So SSIS will be, next to a BI tool, part of Enterprise Information Management in the next release of SQL Server. I'm interested in the different ways people use SSIS; I've basically used it for ETL, data migrations and processing inputs. In which ways do you use SSIS?

    Read the article

  • Oracle ADF and Simplified UI Apps: I18n Feng Shui on Display

    - by ultan o'broin
    I demoed the Hebrew language version of Oracle Sales Cloud Release 8 live in Israel recently. The crowd was yet again wowed by the simplified UI (SUI). I’ve now spent some time playing around with most of the 23 language versions, or the NLS (Natural Language Support) versions as we’d call them, available in Release 8. Hebrew Oracle Sales Cloud Release 8 The simplified UI is built using 100% Oracle ADF. This framework is a great solution for developers to productively build tablet-first, mobility-driven apps for users who work and live using natural languages other than English. Oracle ADF’s internationalization (i18n) relies on built-in Java and Unicode,  packing in i18n goodness such as Bi-Di (or bi-directional) flipping of pages, locale-enabled resource bundles, date and time support, and so on. Comparing German (left) and Hebrew Bi-Di (right) page components in the simplified UI. Note the change in the direction of the arrows and positions of the text. So, developers who need to build global apps don’t have to do anything special when using Oracle ADF components, all thanks to the baked-in UX Feng Shui, as Grant Ronald of the ADF team would say to the UK Oracle User Group. Find out more  about  ADF i18n from Frédéric Desbiens (@blueberrycoder)  on the ADF Architecture TV channel.

    Read the article

  • QueryUnit 0.0.0.8 – Trust No One

    - by Davide Mauri
    Yesterday I released an updated version of QueryUnit, version 0.0.0.8. QueryUnit now supports AreNotEqual, Greater, and Less assertions and is more capable of managing string results. I must say that I cannot live anymore without proper unit testing of a BI solution. Just yesterday, one of the unit tests at a customer site failed, exposing a subtle situation where the release of a new version of a custom application would have corrupted the source of the BI data, with a very low chance that someone would have noticed it for several days. It can happen when you have more than 15 systems that handle the data needed by your BI solution. The key message of this situation is "Trust No One": if your data hasn't passed quality testing, it's not trustworthy. Period. QueryUnit is now officially a hero :) No superpowers still, but useful above all. http://queryunit.codeplex.com/

    Read the article

  • OBIEE 11.1.1 - User Interface (UI) Performance Is Slow With Internet Explorer 8

    - by Ahmed A
    The OBIEE 11g UI performance is slow in IE 8 and faster in Firefox. For VPN or WAN users, it takes a long time to display links on Dashboards via IE 8. The cause is that IE 8 generates many HTTP 304 return calls, and this makes the 11g UI slower when compared to the Mozilla Firefox browser. To resolve this issue, you can implement HTTP compression and caching. This is a best practice.

    Why use web server compression / caching for OBIEE? Bandwidth savings: enabling HTTP compression can have a dramatic improvement on the latency of responses. By compressing static files and dynamic application responses, it will significantly reduce the remote (high latency) user response time. Improves request/response latency: caching makes it possible to suppress the payload of the HTTP reply using the 304 status code. Minimizing round trips over the Web to re-validate cached items can make a huge difference in browser page load times. This screen shot depicts the flow and where the compression and decompression occurs:

    Solution: a. How to enable HTTP caching / compression in Oracle HTTP Server (OHS) 11.1.1.x

    1. To implement HTTP compression / caching, install and configure Oracle HTTP Server (OHS) 11.1.1.x for the bi_serverN Managed Servers (refer to the "OBIEE Enterprise Deployment Guide for Oracle Business Intelligence" document for details).

    2. On the OHS machine, open the HTTP Server configuration file (httpd.conf) for editing. This file is located in the OHS installation directory. For example: ORACLE_HOME/Oracle_WT1/instances/instance1/config/OHS/ohs1

    3. In the httpd.conf file, verify that the following directives are included and not commented out:

        LoadModule expires_module "${ORACLE_HOME}/ohs/modules/mod_expires.so"
        LoadModule deflate_module "${ORACLE_HOME}/ohs/modules/mod_deflate.so"

    4. Add the following lines in the httpd.conf file below the LoadModule section and restart the OHS. Note: for the Windows platform, you will need to enclose any paths in double quotes ("), for example: Alias "/analytics ORACLE_HOME/bifoundation/web/app" and <Directory "ORACLE_HOME/bifoundation/web/app">.

        Alias /analytics ORACLE_HOME/bifoundation/web/app
        #Pls replace the ORACLE_HOME with your actual BI ORACLE_HOME path
        <Directory ORACLE_HOME/bifoundation/web/app>
        #We don't generate proper cross server ETags so disable them
        FileETag none
        SetOutputFilter DEFLATE
        # Don't compress images
        SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary
        <FilesMatch "\.(gif|jpeg|png|js|x-javascript|javascript|css)$">
        #Enable future expiry of static files
        ExpiresActive on
        ExpiresDefault "access plus 1 week"     #1 week, this stops the HTTP 304 calls generated by IE 8
        Header set Cache-Control "max-age=604800"
        </FilesMatch>
        DirectoryIndex default.jsp
        </Directory>
        #Restrict access to WEB-INF
        <Location /analytics/WEB-INF>
        Order Allow,Deny
        Deny from all
        </Location>

    Note: Make sure you replace the placeholder "ORACLE_HOME" above with your correct path for the BI ORACLE_HOME. For example: my BI Oracle Home path is /Oracle/BIEE11g/Oracle_BI1/bifoundation/web/app

    Important notes: The caching rules above are restricted to static files found inside the /analytics directory (/web/app). This approach is safer than setting static file caching globally. In some customer environments you may not get 100% performance gains in the IE 8.0 browser, so in that case you need to extend the caching rules to other directories with static file content.
    If OHS is installed on a separate dedicated machine, make sure the static files in your BI ORACLE_HOME (../Oracle_BI1/bifoundation/web/app) are accessible to the OHS instance. The following screen shot summarizes the before and after results and improvements after enabling compression and caching:

    Read the article

  • SQLAuthority News – Top 5 Latest Microsoft Certifications of 2013 – Guest Post

    - by Pinal Dave
    With the IT job market getting more and more competent by the day, certifications are a must for anyone who wishes to get a strong foothold in the industry. Microsoft community comes up with regular updates and enhancements in its existing products to keep up with the rapidly evolving requirements of the ICT industry. We bring you a list of five latest Microsoft certifications that you must consider acquiring this year. MCSE: SharePoint Learn all about Windows Server 2012 and Microsoft SharePoint 2013, which brings an advanced set of features to the fore in this latest version. It introduces new capabilities for business intelligence, social media, branding, search, identity management, mobile device among other features. Enjoy a great user experience with sharing and collaboration in community forum, within a pixel-perfect SharePoint website. Data connectivity and business intelligence tools allow users to process and access data, analyze reports, share and collaborate with each other more conveniently. Microsoft Specialist: Microsoft Project 2013 The only project management system that works seamlessly with other applications and cloud solutions of Microsoft, MS Project 2013 offers more than what meets the eye.  It provides for easier management and monitoring of projects so that users can ensure timely delivery while improving the productivity significantly. So keep all your projects on track and collaborate with your team like never before with this enhanced release! This one’s a must for all project managers. MCSE Messaging Another one of Microsoft gems is its messaging environment which has also launched the latest release Microsoft Exchange Server 2013. Messaging administrators can take up this training and validate their expertise in Unified Messaging, Exchange Online, PowerShell and Virtualization strategies, through MCSE Messaging certification in Exchange Server. If you wish to enhance productivity and data security of your organization while being flexible and extremely efficient, this is the right certification for you. MCSE Communication An enterprise can function optimally on the strength of its information flow and communication systems. With Lync Server 2013, you can introduce a whole new world of unified communications which consists of audio/video conferencing, dial-in, Persistent Chat, instant chat, and EDGE services in your organization. Utilize IT to serve and support business objectives by mastering this UC technology with this latest MCSE Communication course on using Microsoft Lync Server 2013. MCSE: SQL Server 2012 BI Platform The decision making process is largely influenced by underlying enterprise information used by the management for business intelligence. Therefore, a robust business intelligence platform that anchors enterprise IT and transform it to operational efficiencies is the need of the hour. SQL Server 2012 BI Platform certification helps professionals implement, manage and maintain a BI database infrastructure effectively. IT professionals with BI skills are highly sought after these days. MCSD: Windows Store Apps A Microsoft Certified Solutions Developer certification in Windows Store Apps validates your potential in designing interactive apps. Learn The Essentials of Developing Windows Store Apps using HTML5 and JavaScript and establish yourself as an ace developer capable of creating fast and fluid Metro style apps for Windows 8 that are accessible on a variety of devices. 
You can also take the Essentials of Developing Windows Store Apps using C# route if you are already familiar with the C# programming language. Developers are thus free to choose their favorite development stream, which opens the door to the latest and most exciting application development platform: Windows Store apps. Software developers with these skills are in great demand in the industry today. To remain competitive in their respective fields, IT personnel must update their knowledge on a regular basis, and certifications are a means to achieve this goal. No longer considered an optional prerequisite, major IT certifications such as these are now essential to stay afloat in a cut-throat industry where technologies change on a daily basis. This blog is written by Aruneet Anand of Koenig Solutions. Koenig Solutions does training for all of the above courses. For more information, visit the website. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology Tagged: Microsoft Certifications

    Read the article

  • SQL SERVER – Performance Tuning Resolution

    - by pinaldave
    This blog post is written in response to T-SQL Tuesday hosted by MidnightDBAs. Taking resolutions is such an interesting subject. I think that, just like records, resolutions are broken far more often than they are kept. I find this the funniest thing: we all take resolutions every year, but not every year can we manage to keep them. Well, does that mean we should not take resolutions? In fact, I support resolutions. Every year, I take a resolution that I will strive to reduce my body weight, and I usually manage to keep eating healthy till the end of January. When February begins, I start to lose focus on my goal, and as March starts, the "as usual" eating habits return. Looking at the positive side, what would happen if I did not eat healthy in January every year? I think that might have terrible consequences for my health in the long run. So keeping resolutions is a good practice, and following them to the extent one can is commendable. Let us come back to the world of SQL Server. What are my resolutions for year 2011 for SQL Server? There are many; I am going to list three very important resolutions that I have taken this new year. To understand SQL Server Performance Tuning at a deeper level: I think I am already halfway there. I have been very busy doing hands-on performance tuning for at least 12 days in any given month on average. That means I am doing this activity for almost two weeks a month. I believe that I have a good understanding of the subject. Note that the word I have used is "good," and not "best." There are often cases when I am stumped and have no clue what to do next. Then I usually go for my "trial and error" method - whichever method works, I make sure to keep a note on my blog. My goal is that I should never have to go through the trial and error method again to reach the same solution; I should know the solution right away when I see the problem. I do understand that performance tuning can be a strange animal at times, and one cannot guess the right step every time. However, aiming for a high goal never hurts, and I am going to learn more and more in this focused area. Going further than a basic BI understanding: I do fairly decently with BI concepts. I know the basics of SSIS, SSRS, SSAS, PowerPivot and SharePoint (and a few other things like MDS, StreamInsight, etc.). However, I still consider myself a beginner. I do not have hands-on experience like many other BI gurus around. I want to take my learning further in this direction. I do not want to become a BI expert in one step, but the goal is to move ahead from the basic level towards an advanced level. I am going to start presenting in user group sessions and other places on this subject. When I have to prepare a new subject for presentations, I force myself to learn more. I am committed to learning a bit more in this direction. Learning the new features of SQL Server 2011 (Denali): This is the new thing from Microsoft for all the SQL geeks. I am eagerly waiting for the final product later this year, and I am planning to learn it well. If I follow my above two goals, I think this goal will be automatically covered. I am eager and excited about this new offering from Microsoft. I guess these are my resolutions; maybe next year around the same time, I should revisit this post and see how successful I have been in following my goals. On a lighter note, I am particularly a fan of the following cartoon strip (Courtesy: Calvin and Hobbes). I think when we cannot keep our resolutions, we tend to act like Calvin. 
Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL 2012 Licensing Thoughts

    - by Geoff N. Hiten
    The only thing more controversial than new Federal Tax plans is new Licensing plans from Microsoft. In both cases, everyone calculates several numbers. First, will I pay more or less under this plan? Second, will my competition pay more or less than now? Third, will <insert interesting person/company here> pay more or less? Not that items 2 and 3 are meaningful; that is just how people think. Much like tax plans, the devil is in the details, so let's see how this looks. Microsoft shows it here: http://www.microsoft.com/sqlserver/en/us/future-editions/sql2012-licensing.aspx First up is a switch from per-socket to per-core licensing. Anyone who didn't see something like this coming should rapidly search for a new line of work, because they are not paying attention. The explosion of multi-core processors has made SQL Server a bargain. Microsoft is in business to make money, and the old per-socket model was not going to do that going forward. Per-core licensing also simplifies virtualization licensing. Physical core = virtual core, at least for licensing. Oversubscribe your processors? That's your lookout; you still pay for what is exposed to the VM. The cool part is that you can seamlessly move physical and virtual workloads around and the licenses follow. The catch is that you have to have Software Assurance to make the licenses mobile. Nice touch there. Let's have a moment of silence for the late, unlamented, largely ignored Workgroup Edition. To quote the Microsoft FAQ: "Standard becomes our sole edition for basic database needs". Considering I haven't encountered a single instance of SQL Server Workgroup Edition in the wild, I don't think this will be all that controversial. As for pricing, it looks like a wash with current per-socket pricing based on four-core sockets. Interestingly, four is the minimum core count Microsoft proposes when exchanging per-socket licenses for per-core licenses if you are on Software Assurance. Reading the fine print shows that if you are using more cores, you will get more core licenses. From the licensing FAQ: 15. How do I migrate from processor licenses to core licenses? What is the migration path? Licenses purchased with Software Assurance (SA) will upgrade to SQL Server 2012 at no additional cost. EA/EAP customers can continue buying processor licenses until your next renewal after June 30, 2012. At that time, processor licenses will be exchanged for core-based licenses sufficient to cover the cores in use by processor-licensed databases (minimum of 4 cores per processor for Standard and Enterprise, and minimum of 8 EE cores per processor for Datacenter). Looks like the folks who invested in the AMD 12-core chips will make out like bandits. Now, on to something new: SQL Server Business Intelligence Edition. Yep, finally a BI-specific SKU, licensed for server+CAL configurations only. Note that Enterprise Edition still supports the complete feature set; the BI Edition is intended for smaller shops that want to use the full BI feature set but do not need Enterprise Edition scale (or costs). No, you don't get ColumnStore, compression, or partitioning in the BI Edition. Those are Enterprise-scale features, ThankYouVeryMuch. Then again, your starting licensing costs are about one sixth of an Enterprise Edition system (based on an 8-core server). The only part of the message I am missing is whether the current Failover Licensing Policy will change. Do we need to fully or partially license failover servers? That is a detail I definitely want to know.

    Read the article

  • How to make an entity out of a join table without primary key

    - by tputkonen
    I'm trying to generate JPA entities from an existing database with an "interesting" design. The database has a table called UserSet, which can have links to several other UserSets. There is a one-to-many relation between UserSets and LinkedUserSets. LinkedUserSets also has a one-to-one relation to UserSets. I tried to generate a JPA entity from the database structure using Dali JPA Tools. The resulting entity Linkeduserset is missing an @Id or @EmbeddedId annotation and thus fails to compile. As the resulting entity contains only two @JoinColumns (which cannot be marked as @Id), I have so far not found a way around this issue. The database structure cannot be modified in any way. Is there a way to overcome this somehow? Relevant parts of the create table statements: CREATE TABLE `LinkedUserSets` ( `UsrSetID` INT(11) NOT NULL DEFAULT '0' , `ChildID` INT(11) NOT NULL DEFAULT '0' , CONSTRAINT `fk_LinkedUserSets_UserSet1` FOREIGN KEY (`UsrSetID` ) REFERENCES `UserSet` (`UsrSetID` )); CREATE TABLE `UserSet` ( `UsrSetID` INT(11) NOT NULL AUTO_INCREMENT , PRIMARY KEY (`UsrSetID`), CONSTRAINT `fk_UserSet_LinkedUserSets1` FOREIGN KEY (`UsrSetID` ) REFERENCES `LinkedUserSets` (`ChildID` )); Generated entities: @Entity @Table(name="linkedusersets") public class Linkeduserset { //bi-directional many-to-one association to Userset @ManyToOne @JoinColumn(name="UsrSetID") private Userset userset1; //bi-directional one-to-one association to Userset @OneToOne @JoinColumn(name="ChildID") private Userset userset2; } @Entity @Table(name="userset") public class Userset { private static final long serialVersionUID = 1L; @Id @Column(name="UsrSetID") private int jngSetID; //bi-directional many-to-one association to Linkeduserset @OneToMany(mappedBy="userset1") private Set<Linkeduserset> linkedusersets; //bi-directional one-to-one association to Linkeduserset @OneToOne(mappedBy="userset2") private Linkeduserset linkeduserset; } Error message: Entity "Linkeduserset" has no Id or EmbeddedId
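    A possible workaround, not from the original post and with illustrative class and field names, is to treat the two join columns together as a composite key via @EmbeddedId and map the associations over the same columns as read-only, so JPA does not try to write them twice. A minimal sketch, assuming the schema and the generated Userset entity shown above:

        // LinkedusersetId.java (hypothetical composite key class)
        import java.io.Serializable;
        import javax.persistence.Column;
        import javax.persistence.Embeddable;

        @Embeddable
        public class LinkedusersetId implements Serializable {
            @Column(name = "UsrSetID")
            private int usrSetId;

            @Column(name = "ChildID")
            private int childId;

            // composite key classes must implement equals() and hashCode()
            @Override
            public boolean equals(Object o) {
                if (this == o) return true;
                if (!(o instanceof LinkedusersetId)) return false;
                LinkedusersetId other = (LinkedusersetId) o;
                return usrSetId == other.usrSetId && childId == other.childId;
            }

            @Override
            public int hashCode() {
                return 31 * usrSetId + childId;
            }
        }

        // Linkeduserset.java (reworked entity)
        import javax.persistence.*;

        @Entity
        @Table(name = "linkedusersets")
        public class Linkeduserset {
            @EmbeddedId
            private LinkedusersetId id;

            // same columns mapped again, read-only, to keep the navigable associations
            @ManyToOne
            @JoinColumn(name = "UsrSetID", insertable = false, updatable = false)
            private Userset userset1;

            @OneToOne
            @JoinColumn(name = "ChildID", insertable = false, updatable = false)
            private Userset userset2;
        }

    The mappedBy="userset1" and mappedBy="userset2" attributes on the generated Userset entity still line up with these field names, so the Userset side would not need to change.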

    Read the article

  • How to effectively color pixels in a BufferedImage?

    - by Ed Taylor
    I'm using the following piece of code to iterate over all pixels in an image and draw a red 1x1 square over the pixels that are within a certain RGB tolerance. I guess there is a more efficient way to do this? Any ideas appreciated. (bi is a BufferedImage and g2 is a Graphics2D with its color set to Color.RED). Color targetColor = new Color(selectedRGB); for (int x = 0; x < bi.getWidth(); x++) { for (int y = 0; y < bi.getHeight(); y++) { Color pixelColor = new Color(bi.getRGB(x, y)); if (withinTolerance(pixelColor, targetColor)) { g2.drawRect(x, y, 1, 1); } } } private boolean withinTolerance(Color pixelColor, Color targetColor) { int pixelRed = pixelColor.getRed(); int pixelGreen = pixelColor.getGreen(); int pixelBlue = pixelColor.getBlue(); int targetRed = targetColor.getRed(); int targetGreen = targetColor.getGreen(); int targetBlue = targetColor.getBlue(); return (((pixelRed >= targetRed - tolRed) && (pixelRed <= targetRed + tolRed)) && ((pixelGreen >= targetGreen - tolGreen) && (pixelGreen <= targetGreen + tolGreen)) && ((pixelBlue >= targetBlue - tolBlue) && (pixelBlue <= targetBlue + tolBlue))); }
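    One way to speed this up, sketched below under the assumption that the same bi, selectedRGB, tolRed, tolGreen, and tolBlue fields are in scope, is to read all pixels with a single getRGB call, compare the packed ints with bit masks instead of allocating a Color object per pixel, and write the red marks back with a single setRGB call. Note that this paints the red pixels directly into the BufferedImage rather than onto g2; if the overlay must stay on a separate layer, the same loop can fill a second int[] instead.

        import java.awt.Color;

        int w = bi.getWidth();
        int h = bi.getHeight();
        // one bulk read instead of width * height individual getRGB calls
        int[] pixels = bi.getRGB(0, 0, w, h, null, 0, w);

        int targetRed   = (selectedRGB >> 16) & 0xFF;
        int targetGreen = (selectedRGB >> 8) & 0xFF;
        int targetBlue  = selectedRGB & 0xFF;
        int red = Color.RED.getRGB();

        for (int i = 0; i < pixels.length; i++) {
            int rgb = pixels[i];
            int r = (rgb >> 16) & 0xFF;
            int g = (rgb >> 8) & 0xFF;
            int b = rgb & 0xFF;
            // same tolerance test as withinTolerance, expressed with Math.abs
            if (Math.abs(r - targetRed) <= tolRed
                    && Math.abs(g - targetGreen) <= tolGreen
                    && Math.abs(b - targetBlue) <= tolBlue) {
                pixels[i] = red;  // mark matching pixels in opaque red
            }
        }
        // one bulk write back into the image
        bi.setRGB(0, 0, w, h, pixels, 0, w);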

    Read the article

  • Sharing same vector control between different places

    - by Alexander K
    Hi everyone, I'm trying to implement the following: I have an Items Manager that has an Item class inside. The Item class can store two possible visual representations of itself - a BitmapImage (bitmap) and a UserControl (vector). Later, in the game, I need to share the same image or vector control across all the places it appears. For example, consider 10 trees on the map that all point to the same vector control; in other cases the source can be a bitmap image. The problem is that a BitmapImage source can easily be shared by multiple UIElements in the application. However, when I try to share the vector control, it fails and says the child element is already the child element of another control. I want to know how to organize this in the best way - for example, by replacing UserControl with another type of control or storage - but I need to be sure it still supports Storyboard animations inside. The code looks like this: if (bi.item.BitmapSource != null) { Image previewImage = new Image(); previewImage.Source = bi.item.BitmapSource; itemPane.ItemPreviewCanvas.Children.Add(previewImage); } else if (bi.item.VectorSource != null) { UserControl previewControl = bi.item.VectorSource; itemPane.ItemPreviewCanvas.Children.Add(previewControl); } Thanks in advance

    Read the article

  • After each command tmux prints: ps1_update: command not found

    - by B.I.
    On Ubuntu Linux 11.04, after each command (cd, ls, vim...), successful or not, tmux prints ps1_update: command not found as the last line. Is there any config option I am missing? Thank you very much! tmux.conf # http://lukaszwrobel.pl/blog/tmux-tutorial-split-terminal-windows-easily # just remember that after every modification, tmux must be refreshed # to take new settings into account. # This can be achieved either by restarting it or by typing in: # tmux source-file .tmux.conf # Here is a list of a few basic tmux commands: # Ctrl+b " - split pane horizontally. # Ctrl+b % - split pane vertically. # Ctrl+b arrow key - switch pane. # Hold Ctrl+b, don't release it and hold one of the arrow keys - resize pane. # !Ctrl+b c - (c)reate a new window. # !Ctrl+b n - move to the (n)ext window. # Ctrl+b p - move to the (p)revious window. # Shift+LMB - select text. # ALT+Arrows to move among panes. # rebind default prefix to C-a unbind C-b set -g prefix C-a # use ALT+Arrows to move around panes bind -n M-Left select-pane -L bind -n M-Right select-pane -R bind -n M-Up select-pane -U bind -n M-Down select-pane -D # activity monitoring setw -g monitor-activity on set -g visual-activity on # highlight current pane set-window-option -g window-status-current-bg yellow # enable pane switching with mouse set-option -g mouse-select-pane on # read bashrc source ~/.bashrc # Sane scrolling set -g terminal-overrides 'xterm*:smcup@:rmcup@' command-line output ($(cat)user@tiki:~/.vim$ ls autoload bash_profile bashrc bundle README.md tmux.conf vimrc xmonad xmonad-ubuntu-conf xsessionrc ps1_update: command not found ($(cat)user@tiki:~/.vim$ ll total 56 drwxrwxr-x 2 user user 4096 Mar 17 10:20 autoload/ -rw-rw-r-- 1 user user 170 Mar 17 10:20 bash_profile -rw-rw-r-- 1 user user 4004 Apr 2 11:37 bashrc drwxrwxr-x 20 user user 4096 Aug 20 10:55 bundle/ -rw-rw-r-- 1 user user 11170 Aug 20 11:24 README.md -rw-rw-r-- 1 user user 1243 Mar 17 10:20 tmux.conf ps1_update: command not found ($(cat)user@tiki:~/.vim$ And the following is plain terminal output, without tmux running: user@tiki:~$ ls backup_list.md Documents Dropbox examples.desktop hakers_and_painters.md~ hyundai Music projects ror Ubuntu One Videos windows.sh Desktop Downloads elif.txt hakers_and_painters.md help.txt maqola.txt Pictures Public tmp update_background.sh VirtualBox VMs user@tiki:~$ ll total 116 -rw-rw-r-- 1 user user 380 Aug 9 17:34 backup_list.md drwxr-xr-x 6 user user 4096 Jul 15 09:26 Desktop/ drwxr-xr-x 2 user user 4096 Jul 7 11:26 Documents/ drwxr-xr-x 11 user user 20480 Aug 20 13:53 Downloads/ -rwx------ 1 user user 729 May 7 14:45 update_background.sh* drwxr-xr-x 2 user user 4096 Dec 10 2013 Videos/ drwxrwxr-x 4 user user 4096 Sep 10 2013 VirtualBox VMs/ -rwxrwxr-x 1 user user 36 Jan 11 2014 windows.sh* user@tiki:~$ cd Desktop/ user@tiki:~/Desktop$ ll total 36 -rw-rw-r-- 1 user user 3388 Jul 14 17:10 daily--report.md -rw-rw-r-- 1 user user 71 Jan 28 2014 fernandez readme.md -rw-rw-r-- 1 user user 23 Jan 28 2014 fernandez readme.md~ drwx------ 4 user user 4096 Mar 23 14:02 my_docs/ drwx------ 2 user user 4096 Feb 3 2014 Origami/ drwx------ 7 user user 4096 Feb 1 2013 Plants_vs._Zombies_v1.2.0.1065/ -rwxr-xr-x 1 user user 301 Apr 15 11:28 Sky Fight.desktop* drwx------ 2 user user 4096 Feb 11 2014 webdesign/ -rwxrwxr-x 1 user user 26 Jan 11 2014 windows.sh~* user@tiki:~/Desktop$

    Read the article

  • The Oracle Platform

    - by Naresh Persaud
    Today's enterprises typically build identity management infrastructures using multiple ad-hoc point solutions. Relying on point solutions introduces complexity and a high cost of ownership, leading many organizations to rethink this approach. In a recent worldwide study of 160 companies conducted by Aberdeen Research, there was a discernible shift in this trend, as businesses are now looking to move away from point solutions from multiple vendors and adopt an integrated platform approach. By deploying a comprehensive identity and access management strategy using a single platform, companies are saving as much as 48% in IT costs while reducing audit deficiencies by nearly 35%. According to Aberdeen's research, choosing an integrated suite or "platform" of solutions for identity management from a single vendor can have many advantages over choosing "point solutions" from multiple vendors. The Oracle Identity Management Platform is uniquely designed to offer several compelling benefits to our customers. Shared Services: instead of separate solutions for administration, authentication, authorization, audit and so on, Oracle Identity Management offers a set of shared services that can be consumed by each component in the stack and by developers of new applications. Actionable Intelligence: the most compelling benefit of the Oracle platform is "actionable intelligence," which means that if there is a compliance violation, the same platform can fix it, and if a user is logging in from an untrusted device or an attack is detected, the platform can act proactively on that information. Suite Interoperability: with the Oracle platform, the components all connect and integrate with each other. So if an organization purchases the platform for provisioning and later wants to manage access, the same platform can offer access management, which leads to cost savings. Extensible and Configurable: with point solutions, you typically have limited ability to extend the tool to address custom requirements, but with the Oracle platform, all of the components share a common way to extend the UI and behavior. Find out more about the Oracle Platform approach in this presentation.

    Read the article

  • Spend Analytics on a Grand Scale

    - by jacqueline.coolidge(at)oracle.com
    The Wall St. Journal reports in "Billions in Bloat Uncovered in Beltway" that the Government Accountability Office (GAO) has released a massive study of several programs and agencies that cost U.S. taxpayers billions of dollars each year. The report could help save $100 to $200 billion by identifying duplicate spending and ineffective programs that can be consolidated or eliminated. Now, that is spend analytics on a massive scale! It remains to be seen how actionable that information will be. Certainly, there have been studies before that identify wasteful spending. But it is a great case of the power of business intelligence and spend analytics. Many companies do find significant savings when they implement spend and procurement analytics. What makes for an excellent spend analysis? It should be objective and provide visibility across programs and/or divisions; it should be a cross-functional analysis that links financial and performance metrics; and it should be prescriptive and actionable. Spend and procurement analytics have been HOT during the economic downturn! I expect 2011 will see many more companies get serious about spend analytics, and I would love to hear from companies who are willing to share their experience.

    Read the article

< Previous Page | 16 17 18 19 20 21 22 23 24 25 26 27  | Next Page >