Search Results

Search found 2245 results on 90 pages for 'prolog cut'.

Page 68/90

  • Webcast Q&A: Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter

    - by Kellsey Ruppel
    This week we had the fifth webcast in our WebCenter in Action webcast series, "Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter", where customers Giovani Dacumos and Minh Ong from the Los Angeles Department of Building & Safety (LADBS), and Sheetal Paranjpye and Rajiv Desai from Oracle Partner 3Di, shared how Oracle WebCenter is powering LADBS' externally facing website and providing a superior self-service experience for their customers. We asked the speakers to share their responses to the Q&A below.
    Giovani Dacumos, Director of Systems, and Minh Ong, LADBS
    Q: Did you run into any issues when integrating all of the different applications together?
    A: Yes. We did have issues integrating secure sign-on between the portal and other legacy applications. We used portlets and iframes to overcome those. This is a new technology for us and we are also learning as we go, so there were a lot of challenges in developing and implementing our vision.
    Q: What has been the biggest benefit your end users have seen?
    A: The biggest benefit for our end users is ease of use. We've given them a system that provides a new and improved source of information, as well as a very organized flow of transaction processing. It has made our online service very user friendly.
    Q: Was there any resistance internally when implementing the solution? If so, how did you overcome that?
    A: There was no internal resistance during the implementation, only challenges. As mentioned earlier, this is a new technology for us. We've come across issues that needed assistance from Oracle. Working with 3Di and Oracle has helped us tremendously to find solutions to our implementation issues.
    Q: Given the performance, what do you estimate to be the top end capacity of the system?
    A: With the current performance and architecture we have, we are able to support approximately 300-400 concurrent users. We would need more hardware to support additional user load.
    Q: What's the overview or summary of feedback from the users interacting with the site?
    A: LADBS has a wide spectrum of customers, from simple users like homeowners to large construction firms. Anything new that we offer could be a little bit challenging for some, but overall, the customers liked it. They saw a huge improvement in usability.
    Q: Can you describe the impressions about the site before and after the project within LADBS?
    A: The old site was using old technology and it was hard for us to keep building onto it as we got more business requirements. It made our application seem a bit complicated. It was confusing for our new customers to use, and we've improved on this with the new site. It's now easier for them to complete their transactions and, at the same time, it allows us to provide more useful information.
    Sheetal Paranjpye and Rajiv Desai, 3Di
    Q: Did you run into any obstacles when implementing the solution?
    A: Yes, we did run into some obstacles. One of the key show stoppers was the issue with portlet to portal communication. The GIS viewer (portlet) needed information to be passed to and from Permit LA (the portal), but we were able to get everything configured and up and working quickly!
    Q: Was there a lot of custom work that needed to be done for this particular solution?
    A: We have done some customizations where workflows/task flows are involved.
    Q: What do you think were the keys to success for rolling out WebCenter?
    A: Having a service-oriented architecture and using portlets have been the key areas for rolling out Oracle WebCenter at LADBS. The Oracle WebCenter Content integration gives business users the flexibility to maintain the content, which has really cut down on the reliance on IT, and employee productivity has increased as a result.
    If you missed the webcast, be sure to catch the replay to see a live demonstration of WebCenter in action: Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter, from Oracle WebCenter.

    Read the article

  • Links to my “Best of 2010” Posts

    - by ScottGu
    I hope everyone is having a Happy New Years! 2010 has been a busy blogging year for me (this is the 100th blog post I’ve done in 2010).  Several people this week suggested I put together a summary post listing/organizing my favorite posts from the year.  Below is a quick listing of some of my favorite posts organized by topic area: VS 2010 and .NET 4 Below is a series of posts I wrote (some in late 2009) about the VS 2010 and .NET 4 (including ASP.NET 4 and WPF 4) release we shipped in April: Visual Studio 2010 and .NET 4 Released Clean Web.Config Files Starter Project Templates Multi-targeting Multiple Monitor Support New Code Focused Web Profile Option HTML / ASP.NET / JavaScript Code Snippets Auto-Start ASP.NET Applications URL Routing with ASP.NET 4 Web Forms Searching and Navigating Code in VS 2010 VS 2010 Code Intellisense Improvements WPF 4 Add Reference Dialog Improvements SEO Improvements with ASP.NET 4 Output Cache Extensibility with ASP.NET 4 Built-in Charting Controls for ASP.NET and Windows Forms Cleaner HTML Markup with ASP.NET 4 - Client IDs Optional Parameters and Named Arguments in C# 4 - and a cool scenarios with ASP.NET MVC 2 Automatic Properties, Collection Initializers and Implicit Line Continuation Support with VB 2010 New <%: %> Syntax for HTML Encoding Output using ASP.NET 4 JavaScript Intellisense Improvements with VS 2010 VS 2010 Debugger Improvements (DataTips, BreakPoints, Import/Export) Box Selection and Multi-line Editing Support with VS 2010 VS 2010 Extension Manager (and the cool new PowerCommands Extension) Pinning Projects and Solutions VS 2010 Web Deployment Debugging Tips/Tricks with Visual Studio Search and Navigation Tips/Tricks with Visual Studio Visual Studio Below are some additional Visual Studio posts I’ve done (not in the first series above) that I thought were nice: Download and Share Visual Studio Color Schemes Visual Studio 2010 Keyboard Shortcuts VS 2010 Productivity Power Tools Fun Visual Studio 2010 Wallpapers Silverlight We shipped Silverlight 4 in April, and announced Silverlight 5 the beginning of December: Silverlight 4 Released Silverlight 4 Tools for VS 2010 and WCF RIA Services Released Silverlight 4 Training Kit Silverlight PivotViewer Now Available Silverlight Questions Announcing Silverlight 5 Silverlight for Windows Phone 7 We shipped Windows Phone 7 this fall and shipped free Visual Studio development tools with great Silverlight and XNA support in September: Windows Phone 7 Developer Tools Released Building a Windows Phone 7 Twitter Application using Silverlight ASP.NET MVC We shipped ASP.NET MVC 2 in March, and started previewing ASP.NET MVC 3 this summer.  
ASP.NET MVC 3 will RTM in less than 2 weeks from today: ASP.NET MVC 2: Strongly Typed Html Helpers ASP.NET MVC 2: Model Validation Introducing ASP.NET MVC 3 (Preview 1) Announcing ASP.NET MVC 3 Beta and NuGet (nee NuPack) Announcing ASP.NET MVC 3 Release Candidate 1  Announcing ASP.NET MVC 3 Release Candidate 2 Introducing Razor – A New View Engine for ASP.NET ASP.NET MVC 3: Layouts with Razor ASP.NET MVC 3: New @model keyword in Razor ASP.NET MVC 3: Server-Side Comments with Razor ASP.NET MVC 3: Razor’s @: and <text> syntax ASP.NET MVC 3: Implicit and Explicit code nuggets with Razor ASP.NET MVC 3: Layouts and Sections with Razor IIS and Web Server Stack The IIS and Web Stack teams have made a bunch of great improvements to the core web server this year: Fix Common SEO Problems using the URL Rewrite Extension Introducing the Microsoft Web Farm Framework Automating Deployment with Microsoft Web Deploy Introducing IIS Express SQL CE 4 (New Embedded Database Support with ASP.NET) Introducing Web Matrix EF Code First EF Code First is a really nice new data option that enables a very clean code-oriented data workflow: Announcing Entity Framework Code-First CTP5 Release Class-Level Model Validation with EF Code First and ASP.NET MVC 3 Code-First Development with Entity Framework 4 EF 4 Code First: Custom Database Schema Mapping Using EF Code First with an Existing Database jQuery and AJAX Contributions My team began making some significant source code contributions to the jQuery project this year: jQuery Templates, Data Link and Globalization Accepted as Official jQuery Plugins jQuery Templates and Data Linking (and Microsoft contributing to jQuery) jQuery Globalization Plugin from Microsoft Patches and Hot Fixes Some useful fixes you can download prior to VS 2010 SP1: Patch for Cut/Copy “Insufficient Memory” issue with VS 2010 Patch for VS 2010 Find and Replace Dialog Growing Patch for VS 2010 Scrolling Context Menu Videos of My Talks Some recordings of technical talks I’ve done this year: ASP.NET 4, ASP.NET MVC, and Silverlight 4 Talks I did in Europe VS 2010 and ASP.NET 4 Web Forms Talk in Arizona Other About Technical Debates (and ASP.NET Web Forms and ASP.NET MVC debates in particular) ASP.NET Security Fix Now on Windows Update Upcoming Web Camps I’d like to say a big thank you to everyone who follows my blog – I really appreciate you reading it (the comments you post help encourage me to write it).  See you in the New Year! Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • SQLAuthority News – Tips for Traveling to Nepal

    - by pinaldave
    If you are a regular reader of this blog, you might know that I travel more than 20 days out of 30 in a month. There are cases when I don't have a chance to go home for an entire month and my family has to travel to different cities just to meet me. During my recent visit, one of my acquaintances suggested that I should blog about my travel experiences as well, since this can be helpful to others who are traveling to the same country or city. I have previously written about my experience with all the airlines in India. Today I will write about a few tips for traveling to the beautiful country of Nepal. Kathmandu, the capital of Nepal, is very scenic. There are lots of historical places to see and visit. I was fortunate enough to stop over at the Pashupatinath Temple, Bhaktapur, Vasantpur and the temple of the Kumari Goddess. I also visited the casinos there, but even though I had stayed in Las Vegas for three and a half years before, I was not keen on them, so I left the casinos alone just as I did in Las Vegas. I also traveled to the famous Thamel area by car. Here are my quick tips for anyone who is planning to visit Nepal. They are not categorized but just written in the order they came to my mind. Please note that if you are an Indian, you will get a special privilege everywhere in Nepal, beginning right from the Indian airports.
    - Use the expression "Namaste!" if you want to greet any Indian or Nepali.
    - Indian nationals do not need a visa/passport to enter Nepal. In fact, Indian nationals can just walk into Nepal without any passport, but they should carry a valid Indian ID. There is no use for a passport since it will not be stamped at any immigration ports, whether in India or Nepal.
    - Indian currency is widely accepted everywhere. However, please bring only Rs. 100 bills/notes, as Rs. 500 or Rs. 1000 notes are not accepted. Casinos, however, will accept larger bills.
    - India's national language, Hindi, is widely spoken and understood everywhere. I did not find a single person who had trouble speaking it.
    - The Nepali language uses the Devanagari script, the same script used for Hindi.
    - Here you will find food from almost every country. The taste of Nepali food is authentic and very delicious.
    - It is very safe to travel and move around in Kathmandu (despite what the media suggests). However, it will really help if you have a friend who speaks Nepali. You can negotiate a few deals and cut the price to almost 1/5 of the original quoted price of products sold here.
    - If you are from Gujarat, India, you will find that the Nepali language shares many common words.
    - Temples are everywhere, so do not miss visiting a few of them. Pashupatinath is a must. Only followers of the Hindu religion (from Nepal and India only) are allowed in most of the holy places.
    - Cameras are allowed everywhere except in the holy places.
    Now it is your turn to share your opinions or any suggestions. I think Nepal is a great country as there are lots of places to visit. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, T SQL, Technology

    Read the article

  • Automatic Properties, Collection Initializers, and Implicit Line Continuation support with VB 2010

    - by ScottGu
    [In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu] This is the eighteenth in a series of blog posts I’m doing on the upcoming VS 2010 and .NET 4 release. A few days ago I blogged about two new language features coming with C# 4.0: optional parameters and named arguments.  Today I’m going to post about a few of my favorite new features being added to VB with VS 2010: Auto-Implemented Properties, Collection Initializers, and Implicit Line Continuation support. Auto-Implemented Properties Prior to VB 2010, implementing properties within a class using VB required you to explicitly declare the property as well as implement a backing field variable to store its value.  For example, the code below demonstrates how to implement a “Person” class using VB 2008 that exposes two public properties - “Name” and “Age”:   While explicitly declaring properties like above provides maximum flexibility, I’ve always found writing this type of boiler-plate get/set code tedious when you are simply storing/retrieving the value from a field.  You can use VS code snippets to help automate the generation of it – but it still generates a lot of code that feels redundant.  C# 2008 introduced a cool new feature called automatic properties that helps cut down the code quite a bit for the common case where properties are simply backed by a field.  VB 2010 also now supports this same feature.  Using the auto-implemented properties feature of VB 2010 we can now implement our Person class using just the code below: When you declare an auto-implemented property, the VB compiler automatically creates a private field to store the property value as well as generates the associated Get/Set methods for you.  As you can see above – the code is much more concise and easier to read. The syntax supports optionally initializing the properties with default values as well if you want to: You can learn more about VB 2010’s automatic property support from this MSDN page. Collection Initializers VB 2010 also now supports using collection initializers to easily create a collection and populate it with an initial set of values.  You identify a collection initializer by declaring a collection variable and then use the From keyword followed by braces { } that contain the list of initial values to add to the collection.  Below is a code example where I am using the new collection initializer feature to populate a “Friends” list of Person objects with two people, and then bind it to a GridView control to display on a page: You can learn more about VB 2010’s collection initializer support from this MSDN page. Implicit Line Continuation Support Traditionally, when a statement in VB has been split up across multiple lines, you had to use a line-continuation underscore character (_) to indicate that the statement wasn’t complete.  For example, with VB 2008 the below LINQ query needs to append a “_” at the end of each line to indicate that the query is not complete yet: The VB 2010 compiler and code editor now adds support for what is called “implicit line continuation support” – which means that it is smarter about auto-detecting line continuation scenarios, and as a result no longer needs you to explicitly indicate that the statement continues in many, many scenarios.  This means that with VB 2010 we can now write the above code with no “_” at all: The implicit line continuation feature also works well when editing XML Literals within VB (which is pretty cool). 
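    The original post illustrated each of these features with code screenshots that are not reproduced here, so below is a small, hedged sketch of my own (the class, names and values are made up, not taken from the original post) showing the three features together in VB 2010:
        ' Assumes Imports System.Collections.Generic at the top of the file.
        ' Auto-implemented properties: no explicit backing fields or Get/Set blocks,
        ' with an optional inline default value.
        Public Class Person
            Public Property Name As String
            Public Property Age As Integer = 0
        End Class

        ' Collection initializer (the From keyword) combined with object initializers,
        ' split across several lines with no "_" thanks to implicit line continuation.
        Dim friends As New List(Of Person) From {
            New Person With {.Name = "Scott", .Age = 42},
            New Person With {.Name = "Bill", .Age = 45}
        }
    This is only meant to convey the shape of the syntax; see the MSDN pages linked below for the definitive details.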
    You can learn more about VB 2010's Implicit Line Continuation support and many of the scenarios it supports from this MSDN page (scroll down to the "Implicit Line Continuation" section to find details).
    Summary
    The above three VB language features are but a few of the new language and code editor features coming with VB 2010. Visit this site to learn more about some of the other VB language features coming with the release. Also subscribe to the VB team's blog to learn more and stay up to date with the posts the team regularly publishes. Hope this helps, Scott

    Read the article

  • Raspberry Pi entrance sign backed by Umbraco - Part 1

    - by Chris Houston
    Being experts on all things Umbraco, we jumped at the chance to help our client, QV Offices, with their pressing signage predicament. They needed to display a sign in the entrance to their building and approached us for our advice. Of course it had to be electronic: displaying multiple names of their serviced office clients, meeting room bookings and on-the-pulse promotions. But with a winding Victorian staircase and minimal storage space, how could the monitor be run, updated and managed? That's where we came in. The solution involved:
    - Raspberry Pi
    - Umbraco CMS
    - Automatic updates
    - Automated monitoring of the sign
    - Power saving when the screen is not in use
    Mounting the screen
    The screen that has been used is a standard low-energy LED Full HD screen and has been mounted on the wall using its VESA mounting points. As the wall is a stud wall, we were able to add an access panel behind the screen to feed through the mains, HDMI and sensor cables. The Raspberry Pi is then tucked away out of sight in the main electrical cupboard, which just happens to be next to the sign; we had an electrician add a power point inside this cupboard to allow us to power the screen and the Raspberry Pi.
    Designing the interface and editing the content
    Although a room sign was the initial requirement from QV Offices, their medium-term goal has always been to add online meeting booking to their website, and hence we suggested adding information about the current and next day's meetings to the sign that would be pulled directly from their online booking system. We produced the design and built the web page to fit exactly on a 1920 x 1080 screen (Full HD in portrait). As you would expect, all the information can be edited via an Umbraco CMS; they are able to add floors, rooms, clients and virtual clients as well as add meeting bookings to their meeting diary.
    How we configured the Raspberry Pi
    After receiving a new Raspberry Pi we downloaded the latest release of the Raspbian operating system and followed the official guide which shows how to copy the OS onto an SD card from a Mac. We then followed the majority of steps in this useful guide: 10 Things to Do After Buying a Raspberry Pi.
    Installing Chromium
    We chose to use the Chromium web browser, which for those who do not know is the open-sourced version of Google Chrome.
    You can install it from the terminal with the following command:
        sudo apt-get install chromium-browser
    Installing Unclutter
    We found this little application, which automatically hides the mouse pointer. It is used in the script below and is installed using the following command:
        sudo apt-get install unclutter
    Auto-starting Chromium and disabling the screen saver, power saving and mouse
    Once installed, the Raspberry Pi will not have a keyboard or mouse, and hence if there was a power cut we needed it to always boot and reload Chromium with the correct URL. Our preferred command line text editor is Nano, and I have assumed you know how to use this editor or will be able to work it out pretty quickly. So, using the following command:
        sudo nano /etc/xdg/lxsession/LXDE/autostart
    we then changed the autostart file content to:
        @lxpanel --profile LXDE
        @pcmanfm --desktop --profile LXDE
        @xscreensaver -no-splash
        @xset s off
        @xset -dpms
        @xset s noblank
        @chromium --kiosk --incognito http://www.qvoffices.com/someURL
        @unclutter -idle 0
    The first few commands turn off the screen saver and power saving; we then open Chromium in kiosk mode (full screen with no menu etc.) and pass in the URL to use (I have changed the URL in this example). We found a useful blog post with the Chromium command line switches. Finally, we also open an application called Unclutter, which auto-hides the mouse after 0 seconds, so you will never see a mouse pointer on the sign. We also had to edit the following file:
        sudo nano /etc/lightdm/lightdm.conf
    and add the following line under the [SeatDefault] section:
        xserver-command=X -s 0 dpms
    Refreshing the screen
    We decided to try adding a scheduled task that would trigger Chromium to reload the page; at some point in the future we might well change this to using JavaScript to update the content (see the sketch at the end of this post), but for now this works fine. First we installed xdotool, which enables you to script keyboard commands:
        sudo apt-get install xdotool
    We used the Refreshing Chromium Browser by Shell Script post as a reference and created the following shell script (which we called refreshing.sh):
        export DISPLAY=":0"
        WID=$(xdotool search --onlyvisible --class chromium|head -1)
        xdotool windowactivate ${WID}
        xdotool key ctrl+F5
    This selects the correct display and then sends a CTRL + F5 to refresh Chromium. You will need to give this file execute permissions:
        chmod a=rwx refreshing.sh
    Now that we have the script file set up, we just need to schedule it to run periodically, which is done using crontab. To edit it, use the following command:
        crontab -e
    and add the following:
        */5 * * * * DISPLAY=":.0" /home/pi/scripts/refreshing.sh >/home/pi/cronlog.log 2>&1
    This calls our script every 5 minutes to refresh the display and logs any errors to the cronlog.log file.
    Summary
    QV Offices now have a richer and more manageable booking system than they did before we started, and a great new sign to boot. How could we make sure that the sign was running smoothly downstairs in a busy office centre? A second post will follow outlining exactly how Vizioz enabled QV Offices to monitor their sign simply and remotely, from the comfort of their desks.
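    As mentioned in the refreshing-the-screen section, the cron/xdotool approach could later be replaced by having the sign page refresh itself with JavaScript. A minimal sketch of that alternative is below; the five-minute interval simply mirrors the cron schedule and, like the rest of the snippet, is an assumption rather than part of QV Offices' actual setup:
        // Reload the sign page every 5 minutes from within the page itself,
        // removing the need for the external xdotool/cron refresh script.
        setInterval(function () {
            window.location.reload();
        }, 5 * 60 * 1000);
    Because this runs inside the page that Chromium is already displaying, no extra packages would be needed on the Raspberry Pi.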

    Read the article

  • Making a Job Change That's Easy - Why Not Try a Career Change

    - by david.talamelli
    A few nights ago I received a comment on one of our blog posts that reminded me of a statistic that I heard a while back. The statistic reflected the change in our views towards work and showed how while people in past generations would stay in one role for their working career - now with so much choice people not only change jobs often but also change careers 4-5 times in their working life. To differentiate between a job change and a career change: when I say job change this could be an IT Sales person moving from one IT Sales role to another IT Sales role. A Career change for example would be that same IT Sales person moving from IT Sales to something outside the scope of their industry - maybe to something like an Engineer or Scuba Dive Instructor. The reason for Career changes can be as varied as the people who make them. Someone's motivation could be to pursue a passion or maybe there is a change in their personal circumstances forcing the change or it could be any other number of reasons. I think it takes courage to make a Career change - it can be easy to stay in your comfort zone and do what you know, but to really push yourself sometimes you need to try something new, it is a matter of making that career transition as smooth as possible for yourself. The comment that was posted is here below (thanks Dean for the kind words they are appreciated). Hi David, I just wanted to let you know that I work for a company called Milestone Search in Melbourne, Victoria Australia. (www.mstone.com.au) We subscribe to your feed on a daily basis and find your blogs both interesting and insightful. Not to mention extremely entertaining. I wonder if you have missed out on getting in journalism as this seems to be something you'd be great at ?: ) Anyways back to my point about changing careers. This could be anything from going from I.T. to Journalism, Engineering to Teaching or any combination of career you can think of. I don't think there ever has been a time where we have had so many opportunities to do so many different things in our working life. While this idea sounds great in theory, putting it into practice would be much harder to do I think. First, in an increasingly competitive job market, employers tend to look for specialists in their field. You may want to make a change but your options may be limited by the number of employers willing to take a chance on someone new to an industry that will likely require a significant investment in time to get brought up to speed. Also, using myself as an example if I was given the opportunity to move into Journalism/Communication/Marketing career from my career as an IT Recruiter - realistically I would have to take a significant pay cut to make this change as my current salary reflects the expertise I have in my current career. I would not immediately be up to speed moving into a new career and would not be able to justify a similar salary. Yes there are transferable skills in any career change, but even though you may have transferable skills you must realise that you will also have a large amount of learning to do which would take time. These are two initial hurdles that I immediately think of, there may be more but nothing is insurmountable. If you work out what you want to do with your working career whatever that may be, you then need to just need to work out the steps to get to your end goal. This is where utilising the power of your networks and using Social Media can come in handy. 
If you are interested in working somewhere why not proactively take the opportunity to research the industry or company - find out who it is you need to speak to and get in touch with them. We spend so much time working, we should enjoy the work we do and not be afraid to try new things. Waiting for your dream job to fall into your lap or be handed to you on a silver platter is not likely going to happen, so if there is something you do want to do, work out a plan to make it happen and chase after it. This article was originally posted on David Talamelli's Blog - David's Journal on Tap

    Read the article

  • OWB 11gR2 – OLAP and Simba

    - by David Allan
    Oracle Warehouse Builder was the first ETL product to provide a single integrated and complete environment for managing enterprise data warehouse solutions that also incorporate multi-dimensional schemas. The OWB 11gR2 release provides Oracle OLAP 11g deployment for multi-dimensional models (in addition to support for prior releases of OLAP). This means users can easily utilize Simba's MDX Provider for Oracle OLAP (see here for details and cost), which allows you to use the powerful and popular ad hoc query and analysis capabilities of Microsoft Excel PivotTables® and PivotCharts® with your Oracle OLAP business intelligence data. The extensions to the dimensional modeling capabilities have been built on established relational concepts, with the option to seamlessly move from a relational deployment model to a multi-dimensional model at the click of a button. This now means that ETL designers can logically model a complete data warehouse solution using one single tool and control the physical implementation of the logical model at deployment time. As a result, data warehouse projects that need to provide a multi-dimensional model as part of the overall solution can be designed and implemented faster and more efficiently. Wizards for dimensions and cubes let you quickly build dimensional models and realize them either relationally or as an Oracle database OLAP implementation; both 10g and 11g formats are supported based on a configuration option. The wizard provides a good first-cut definition, and the objects can be further refined in the editor. Both wizards let you choose the implementation; to deploy to OLAP in the database, select MOLAP: multidimensional storage. You will then be asked what levels and attributes are to be defined; by default the wizard creates a level-based hierarchy, and parent-child hierarchies can be defined in the editor. Once the dimension or cube has been designed, there are special mapping operators that make it easy to load data into the objects; below we load a constant value for the total level and the other levels from a source table. Again, when the cube is defined using the wizard we can edit the cube and define a number of analytic calculations by using the 'generate calculated measures' option on the measures panel. This lets you very easily add a lot of rich analytic measures to your cube. For example, one of the measures is the percentage difference from a year ago, which we can see in detail below. You can also add your own custom calculations to leverage the capabilities of the Oracle OLAP option, either by selecting existing template types such as moving averages or by defining true custom expressions. The 11g OLAP option now supports percentage-based summarization (the amount of data to precompute and store); this is available from the 'cost based aggregation' option in the cube's configuration. Ensure all measure-dimension level-based aggregation is switched off (on the cube-dimension panel) - previously level-based aggregation was the only option. The 11g generated code now uses the new unified API, as you see below. To generate the code, OWB needs a valid connection to a real schema; this was not needed before 11gR2 and is a new requirement since the OLAP API which OWB uses is not an offline one. Once all of the objects are deployed and the maps executed, then we get to the fun stuff! How can we analyze the data?
    One option which is powerful and at many users' fingertips is using Microsoft Excel PivotTables® and PivotCharts®, which can be used with your Oracle OLAP business intelligence data by utilizing Simba's MDX Provider for Oracle OLAP (see the Simba site for details of cost). I'll leave the exotic reporting illustrations to the experts (see Bud's demonstration here), but with Simba's MDX Provider for Oracle OLAP it's very simple to access the analytics stored in the database (all built and loaded via the OWB 11gR2 release) and get the regular features of Excel at your fingertips, such as the conditional formatting features, for example. That's a very quick run-through of OWB 11gR2 with respect to Oracle 11g OLAP integration and reporting using Simba's MDX Provider for Oracle OLAP. Not a deep dive in any way, but a quick overview to illustrate the design capabilities and integrations possible.

    Read the article

  • SWFObject and IE6 causing hair-pulling agony

    - by Piet
    I recently used SWFObject to display a flash header on a website. I chose SWFObject because:
    - Instead of displaying an annoying 'Install flash now' message, it claims to be able to show alternate content - in this case, the original header image.
    - It claims to be compatible with more or less every browser out there.
    Implementation went fine, until someone tested it on IE6 and got the following error:
        Internet explorer cannot open the Internet site http://www….. Operation aborted
    Which basically means that the site just can't be visited with IE6 (still used a lot in business environments); it even seems as if there's something wrong with your internet connection. Since about 10% of visitors to this site are still using IE6 (why does everyone still use Internet Explorer???? Do YOU know that these days most people do NOT use Internet Explorer anymore?), this had to be fixed. After some googling, I found the suggestion to defer loading of the SWFObject.js as follows:
        <script type="text/javascript" defer="defer" src="http://ajax.googleapis.com/ajax/libs/swfobject/2.2/swfobject.js"></script>
        <script type="text/javascript" defer="defer">
            swfobject.registerObject("myId", "9", "");
        </script>
    What this does according to W3C: When set, this boolean attribute provides a hint to the user agent that the script is not going to generate any document content (e.g., no "document.write" in javascript) and thus, the user agent can continue parsing and rendering. I don't know exactly why, but: HURRAY! It works now!!! Only… IE6 and IE7 (didn't try IE8) now gave the following error:
        Line: 19 Char: 1 Error: 'swfobject' is undefined Code: 0 URL: http://www…
    But the flash was still running fine. Still, such an error isn't clean, especially since almost half of the site's visitors are using one of these Internet Explorer versions. Now, wanting a quick fix, I decided to do the following:
        <script type="text/javascript" defer="defer">
            if (typeof(swfobject) != "undefined") swfobject.registerObject("myId", "9", "");
        </script>
    I admit this is a bit of a weird 'fix'. You'd expect the flash to stop working on IE6/IE7, which it doesn't. Not planning on diving into its inner bowels, I regard this as 'mission accomplished' until someone somewhere posts a better solution (for which I have set up some Google alerts). Do you have a better solution? What would be the impact on the webdev economy (or your life) if all browsers were compatible?
    Addendum
    Because the above turned out not to work with the new Firefox 3.5.3 (strangely, it was OK with 3.5.2 when I tested it) I decided to cut the crap and use the 'Dynamic Publishing' way. Ok, so it won't work for people who have javascript disabled, but who on earth would have flash installed AND javascript disabled? To avoid the IE6 error with the 'Dynamic Publishing' way, I call swfobject.embedSWF right after the div that will be replaced with the flash content (a sketch of that call follows below). Calling swfobject.embedSWF in the <head> would otherwise give me the above error in IE6 again.
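    For readers unfamiliar with SWFObject's 'Dynamic Publishing' approach mentioned in the addendum, here is a hedged sketch of what such a call can look like when placed straight after the div it replaces; the element id, SWF path and dimensions are placeholder values, not the ones from the original site:
        <div id="flashHeader"><img src="header.jpg" alt="Site header" /></div>
        <script type="text/javascript">
            // Dynamic publishing: swfobject swaps the div above for the Flash movie
            // when Flash 9+ is present, leaving the image as the fallback content.
            swfobject.embedSWF("header.swf", "flashHeader", "960", "150", "9.0.0");
        </script>
    Placing the call immediately after its target element is the placement the author describes above to avoid triggering the IE6 'Operation aborted' error.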

    Read the article

  • From the Tips Box: Pre-installation Prep Work Makes Service Pack Upgrades Smoother

    - by Jason Fitzpatrick
    Last month Microsoft rolled out Windows 7 Service Pack 1 and, like many SP releases, quite a few people are hanging back to see what happens. If you want to update but still err on the side of caution, reader Ron Troy offers a step-by-step guide. Ron's cautious approach does an excellent job minimizing the number of issues that could crop up in a Service Pack upgrade by doing a thorough job updating your driver sets and clearing out old junk before you roll out the update. Read on to see how he does it:
    Just wanted to pass on a suggestion for people worried about installing Service Packs. I came up with a 'method' a couple years back that seems to work well.
    1. Run Windows / Microsoft Update to get all updates EXCEPT the Service Pack.
    2. Use Secunia PSI to find any other updates you need.
    3. Use CCleaner or the Windows disk cleanup tools to get rid of all the old garbage out there. Make sure that you include old system updates.
    4. Obviously, back up anything you really care about. An image backup can be real nice to have if things go wrong.
    5. Download the correct SP version from Microsoft.com; do not use Windows / Microsoft Update to get it. Make sure you have the 64 bit version if that's what you have installed on your PC.
    6. Make sure that EVERYTHING that affects the OS is up to date. That includes all sorts of drivers, starting with video and audio. And if you have an Intel chipset, use the Intel Driver Utility to update those drivers. It's very quick and easy. For the video and audio drivers, some can be updated by Intel, some by utilities on the vendor web sites, and some you just have to figure out yourself. But don't be lazy here; old drivers and Windows Service Packs are a poor mix.
    7. If you have 3rd party software, check to see if they have any updates for you. They might not say that they are for the Service Pack, but you cut your risk of things not working if you do this.
    8. Shut off the antivirus software (especially if 3rd party).
    9. Reboot, hitting F8 to get the SafeMode menu. Choose SafeMode with Networking.
    10. Log into the Administrator account to ensure that you have the right to install the SP.
    11. Run the SP. It won't be very fancy this way. Maybe 45 minutes later it will reboot and then finish configuring itself, finally letting you log in.
    Total installation time on most of my PCs was about 1 hour, but that followed hours of preparation on each.
    On a separate note, I recently got on the Nvidia web site and their utility told me I had a new driver available for my GeForce 8600M GS. This laptop had come with Vista, and now has Win 7 SP1. I had a big surprise from this driver update; the Windows Experience Score on the graphics side went way up. Kudos to Nvidia for doing a driver update that actually helps day to day usage. And unlike ATI's updates (which I need for my AGP based system), this update was fairly quick and very easy. Also, Nvidia drivers have never, as I can recall, given me BSODs, many of which I've gotten from ATI (TDR errors).

    Read the article

  • Hello World Pagelet

    - by astemkov
    Introduction The goal of this exercise is to give you a basic feel of how you can use Pagelet Producer to proxy a web page We will proxy a simple static Hello World web page, cut one section out of that page and present it as a pagelet that you can later insert on your own application page or to your portal page such as WebCenter Portal space or WebCenter Interaction community page. Hello World sample app This is the static web page we will work with: Let's assume the following: The Hello World web page is running on server http://appserver.company.com:1234/ The Hello World web page path is: http://appserver.company.com:1234/helloworld/ Initial Pagelet Producer setup Let's assume that the Pagelet Producer server is running on http://pageletserver.company.com:8889/pagelets/ First let's check that Pagelet Producer is up and running. In order to do that we just need to access the following URL: http://pageletserver.company.com:8889/pagelets/ And this is what should be returned: Now you can access Pagelet Producer administration screens using this URL: http://pageletserver.company.com:8889/pagelets/admin This is how the UI looks: Now if you connect to the internet via proxy server, you need to configure proxy in Pagelet Producer settings. In the Navigator pane: Jump To - Settings Click on "Proxy" Enter your proxy server configuration: Creating a resource First thing that you need to do is to create a resource for your web page. This will tell Pagelet Producer that all sub-paths of the web page should be proxied. It also will allow you to setup common rules of how your web page should be proxied and will serve as a container for your pagelets. In the Navigator pane: Jump To - Resources Click on any existing resource (ex. welcome_resource) Click on "Create selected type" toolbar button at the top of the Navigator pane Select "Web" in the "Select Producer Type" dialog box and click "OK" Now after the resource is created let's click on "General" sub-item a specify the following values Name = AppServer Source URL = http://appserver.company.com:1234/ Destination URL = /appserver/ Click on "Save" toolbar button at the top of the Navigator pane After the resource is created our web page becomes accessible by the URL: http://pageletserver.company.com:8889/pagelets/appserver/helloworld/ So in original web page address Source URL is replaced with Pagelet Producer URL (http://pageletserver.company.com:8889/pagelets) + Destination URL Creating a pagelet Now let's create "Hello World" pagelet. Under the resource node activate Pagelets subnode Click on "Create selected type" toolbar button at the top of the Navigator pane Click on "General" sub-node of newly created pagelet and specify the following values Name = Hello_World Library = MyLib Library is used for logical grouping. The portals use the "Library" value to group pagelets in their respective UI's. For example, when adding pagelets to a WebCenter Portal space you would see the individual pagelets listed under the "Library" name. URL Suffix = helloworld/index.html this is where the Hello World page html is served from Click on "Save" toolbar button at the top of the Navigator pane The Library name can be anything you want, it doesn't have to match the resource name at all. It is used as a logical grouping of pagelets, and you can include pagelets from multiple resources into the same library or create a new library for each pagelet. 
After you save the pagelet you can access it here: http://pageletserver.company.com:8889/pagelets/inject/v2/pagelet/MyLib/Hello_World which is : http://pageletserver.company.com:8889/pagelets/inject/v2/pagelet/ + [Library] + [Name] Or to test the injection of a pagelet into iframe you can click on the pagelets "Documentation" sub-node and use "Access Pagelet using REST" URL: This is what we will see: Clipping The pagelet that we just created covers the whole web page, but we want just the "Hello World" segment of it. So let's clip it. Under the Hello_World pagelet node activate Clipper sub-node Click on "Create selected type" toolbar button at the top of the Navigator pane Specify a Name for newly created clipper. For example: "c1" Click on "Content" sub-node of the clipper Click on "Launch Clipper" button New browser window will open By moving a mouse pointer over the web page select the area you want to clip: Click left mouse button - the browser window will disappear and you will see that Clipping Path was automatically generated Now let's save and access the link from the "Documentation" page again Here's our pagelet nicely clipped and ready for being used on your Web Center Space

    Read the article

  • Tough Decisions

    - by Johnm
    There was once a thriving business that employed two Database Administrators, Sam and Jim. Both DBAs were certified, educated and highly talented in their skill sets. During lunch breaks these two DBAs were often found together discussing best practices, troubleshooting techniques and the latest release notes for the upcoming version of SQL Server. They genuinely loved what they did. The maintenance of the first database was the responsibility of Sam. He was the architect of this server's setup and he was very meticulous in its configuration. He regularly monitored the health of the database, validated backup files and regularly adhered to the best practices that were advocated by well respected professionals. He was very proud of the fact that there was never a database that he managed that lost data or performed poorly. The maintenance of the second database was the responsibility of Jim. He too was the architect of this server's setup. At the time that he built this server, his understanding of the finer details of configuration were not as clear as they are today. The server was build on a shoestring budget and with very little time for testing and implementation. Jim often monitored the health of the database; but in more of a reactionary mode due to user complaints of slowness or failed transactions. Deadlocks abounded and the backup files were never validated. One day, the announcement was made that revealed that the business had hit financially hard times. Budgets were being cut, limitation on spending was implemented and the reduction in full-time staff was required. Since having two DBAs was regarded a luxury by many, this meant that either Sam or Jim were about to find themselves out of a job. Sam and Jim's boss, Frank, was faced with a very tough decision. Sam's performance was flawless. His techniques and practices were perfection. The databases he managed were reliable and efficient. His solutions are "by the book". When given a task it is certain that, while it may take a little longer, it will be done right the first time. Jim's techniques and practices were not perfect; but effective and responsive. He made mistakes regularly; but he shows that he learns from them and they often result in innovative solutions. When given a task it is certain that, while the results may require some tweaking, it will be done on time and under budget. You are Frank's best friend. He approaches you and presents this scenario. He must layoff one of his valued DBAs the very next morning. Frank asks you: "All else being equal, who would you let go? and Why?" Another pertinent question is raised: "Regardless of good times or bad, if you had to choose, which DBA would you want on your team when tough challenges arise?" Your response is. (This is where you enter a comment below)

    Read the article

  • How do I fix my resolution after Directx install through Steam?

    - by Justin
    I'm a bit long-winded so see bottom for quick version and specs. Friendly Hello: Hello all on these askUbuntu pages, I just recently built my own computer and decided to switch to Ubuntu for the extra coolness. I've been learning a lot through all this, and mostly been trying to figure out issues on my own (read: Google searches). However, I couldn't seem to find others with this problem so I've come here for help. Detailed Recount: So I just used WINE and WINETRICKS to install Steam. All went well and it worked. Then I went to trying a game out. I remembered that Orcs Must Die! worked from http://www.steamgamesonlinux.com/ so I tried that out. After selecting to download it, that's when the problem occurred. The screen suddenly zoomed in!!! I think it's the resolution right? Half the screen is cut off and I can't see parts of the right side of windows. My theory is that this is due to Direct X being installed through Steam, as Steam automatically installed it as I chose to download the game. It didn't even ask me to install Direct X or not ): It all happened so fast. This all being said, the game works fine! It looks a little strange, as if the resolution was off, but it plays just fine. What I did so far: Restarted my computer. Didn't work -_- Researched Steam installing DirectX on Ubuntu then messing up resolution and couldn't really find anything. Researched uninstalling DirectX from Ubuntu but only found uninstalling DirectX after having been installed with Wine, not through Steam. Got mad and ate my feelings. Tried "xrandr -s 0" but it didn't do anything. Ran xrandr alone and terminal showed this: Screen 0: minimum 8 x 8, current 640 x 480, maximum 16384 x 16384 DVI-I-0 connected 640x480+0+0 (normal left inverted right x axis y axis) 0mm x 0mm 640x480 59.9*+ 320x240 120.1 DVI-I-1 disconnected (normal left inverted right x axis y axis) HDMI-0 disconnected (normal left inverted right x axis y axis) DP-0 disconnected (normal left inverted right x axis y axis) DVI-D-0 disconnected (normal left inverted right x axis y axis) DP-1 disconnected (normal left inverted right x axis y axis) About now I was mad so I played Odin's Sphere then took a nap. Back to it! I entered the following: xrandr --output DVI-I-0 --mode 1024x768 But I was met with this message: xrandr: cannot find mode 1024x768 I get the same messages for 800x600, 1400x1050, and seemingly any other combination of numbers. I then tried Going into System Settings then Displays, then playing around in there. My Resolution is set to 640x480 and there are no other options for me to choose from. Rotation has Normal, Clockwise, Counter Clockwise, and 180 Degrees. It's set to Normal and I haven't messed with that. Launcher Placement has Unknown and All Displays as its two options. It's set to Unknown, but moving it to All Displays doesn't seem to do anything. Finally, when I click Detect Displays, nothing seems to happen. Quick Version: Linux noob. Steam installed with Wine and Winetricks. Steam downloaded and installed game + DirectX. Resolution messed up now (I think; pretty sure), can't fix it, very annoying, no idea what's going on, halp! Specs: Ubuntu Version 12.04 Wine Version 1.4.1 Have not changed any settings in Wine Using Winetricks Graphics Card: http://www.gigabyte.com/products/pro...px?pid=4361#sp Drivers: Proprietary (Installing those were a LOT of fun) Also let it be known that I have a DVI to VGA cord running from my Graphics card to my monitor. If any more information is needed I am ready to report. 
Thank You: Thanks a lot for your help and all the work you do to support noob ubuntuers like me (:

    Read the article

  • SQL SERVER – Free Print Book on SQL Server Joes 2 Pros Kit

    - by pinaldave
    Rick Morelan and I were discussing earlier this month what we can give back to the community. We believe our books have been very successful and very well received by the community. The five books are a journey from novice to expert. The books have changed many lives and helped many get jobs as well as pass the SQL certifications. Rick is from Seattle, USA and I am from Bangalore, India. There is a 12 hour difference between us. We try to do a weekly meeting to catch up on various personal and SQL related topics. Here is one of our recent conversations.
    Rick and Pinal
    Pinal: Good Morning Rick!
    Rick: Good Morning…err… Good Evening to you – Pinal!
    Pinal: Hey Rick, did you read the recent email which I sent you – one of our readers is thanking us for writing the Joes 2 Pros series. He wants to dedicate his success to us. Can you believe it?
    Rick: Yeah, he is very kind, but did you tell him that it is all because of his hard work on learning the subject and we have very little contribution in his success?
    Pinal: Absolutely, I told him the same – I said we just wrote the book but it is he who learned from it and proved himself in his job. It is all him!
    Rick: Good response.
    Pinal: Hey Rick! Are we doing enough for the community? What can we do more?
    Rick: Hmmm… Let us do something more.
    Pinal: Remember once we discussed the idea that if anyone buys our Joes 2 Pros Combo Kit in the next 2 weeks – we will send them SQL Wait Stats for free. What do you say?
    Rick: I agree! Great idea! Let us do it.
    Free Giveaway
    Well, Rick and I liked the idea of doing more. We have decided to give away free SQL Server Wait Stats books to everybody who purchases the Joes 2 Pros Combo Kit between today (Oct 15, 2012) and Oct 26, 2012. This is not a contest or a lucky winner opportunity. Everybody who participates will qualify for it.
    Combo Availability
    USA – Amazon
    India – Flipkart | Indiaplaza
    Note 1: The USA kit contains 5 FREE DVDs. The India kit does not contain the 5 DVDs due to legal issues.
    Note 2: The Indian kit is priced at a special Indian economic price.
    Qualify for the Free Giveaway
    - You must have purchased our Joes 2 Pros Combo Kit of 5 books between Oct 15, 2012 and Oct 26, 2012. Purchases before Oct 15, 2012 or after Oct 26, 2012 will not qualify for this giveaway.
    - Send your original receipt (email, order details) to the following addresses: "[email protected];[email protected]" with the subject line "Joes 2 Pros Kit Promotion Free Offer". Do not change the subject line or your email may be missed.
    - Clearly mention your shipping address with phone number and pin/zip code.
    - Send your receipt before Oct 30, 2012. We will not entertain any conversation after the Oct 30, 2012 cut-off date.
    - The free books will be sent to USA and India addresses only.
    Availability: USA – Amazon | India – Flipkart | Indiaplaza
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLServer, T SQL, Technology

    Read the article

  • A Letter for Your CEO About Social Marketing’s Future

    - by Mike Stiles
    We’ll leave it to you to decide if or how to sneak this in front of them. Dear Chief: This social marketing thing looks serious. It’s gone beyond having a Facebook page and putting our info and a few promotions on it. It’s seriously disrupting how we’ve always done marketing. And its implications reach well beyond marketing. My concern is that we stay positioned ahead of these changes and are prepared to embrace, adapt and capitalize on these new capabilities as opposed to spending valuable time and money trying to shoehorn social into “the way we’ve always done things.” I’m also concerned about what happens if our competition executes on this before we do. The days of being able to impose our ad messaging on the masses to great effect are numbered. The public now has the tech tools and ability to filter out things that are irrelevant to them. And frankly, spending ad dollars to reach unlikely prospects isn’t the most efficient path for us either. Today, our customers have to genuinely love what we do. That starts with a renewed, customer-centric focus on the quality and usability of our product. If their experience with it is bad, they now have very connected, loud voices that will testify against us. We can’t afford that. Next, their customer service experience, before and after the sale, has to be a pleasant surprise. That requires truly knowing our customers and listening to them. Lip service won’t cut it. We have to get and use as much data on the customer as possible, interact with them wherever they want to interact with us, and commit to impressing them. If we do, they’ll get out there and advertise for us. Since peer-to-peer recommendation is the most effective marketing, that’s money in the bank. Social marketing is about forming relationships, same as how individuals use social. We want them to know us, trust us, and get real value from knowing us. That requires honesty and transparency that before now might have been uncomfortable. I propose that if we clearly make everything we do about our customers’ wants and needs, we’ll have nothing to hide. It will solidify customer loyalty, retention, and thus, revenue. These things can’t happen without certain tools and structural changes in the organization. There are social cloud platforms that integrate social management into all of the necessary areas: CRM, customer service, sales, marketing automation, content marketing, ecommerce, etc. This is will give us a real-time, complete view of the customer so their every interaction with us is attentive, personalized, accurate, relevant, and satisfying. Without it, we’re just a collage of disjointed systems, each gathering data that informs only its own departmental silo. The customer is voluntarily giving us everything we need to know about them to win them over, but we have to start listening and putting the pieces together. There’s still time. Brands are coming to terms with this transition to the socially enabled enterprise, but so far they aren’t moving very fast. Like us, they’re dealing with long-entrenched technologies and processes. CMO’s and CIO’s have to form new partnerships. Content operations have to be initiated and properly staffed and funded. Various departments must be able to utilize interconnected big data. What will separate the winners from the losers? Well chief, that’s why I’m writing you. It’s in your hands. These initiatives won’t get the kind of priority and seriousness that inspire actual deadlines & action unless they come from your desk. 
You have to be the champion of customer centricity. You have to be our change agent. You have to be our innovator. Otherwise, it’s going to be business as usual, and that puts us in a very vulnerable place. Sincerely, Your Team @mikestilesPhoto: Gary Scott, stock.xchng

    Read the article

  • upgraded "Compiz" and "Unity", now no Unity 3D on screen

    - by user18432
    Today when I logged in to my Ubuntu 12.04, the update manager told me of some upgrades. Compiz and Unity were in those upgrades. After I installed the upgrades, I can no longer get the Unity panel on the left side of screen or the systray at the top of screen. I now have to run Ubuntu 12.04 with Unity 2D. My laptop is a HP Pavilion dv9000 with Nvidia GeForce Go 7600 video. I tried to run "unity --reset" but it says there are serious issues with compiz. I have cut & pasted the read out from the terminal below. [09:35:02] xxxxxxx@L01U1204:~$ unity --reset unity-panel-service: no process found Checking if settings need to be migrated ...no Checking if internal files need to be migrated ...no Backend : gconf Integration : true Profile : unity Adding plugins Initializing core options...done compiz (core) - Warn: failed to receive ConfigureNotify event on 0x2e00004 compiz (core) - Warn: failed to receive ConfigureNotify event on 0x580005a compiz (core) - Warn: failed to receive ConfigureNotify event on 0x3600006 compiz (core) - Warn: failed to receive ConfigureNotify event on 0x3200255 compiz (core) - Warn: failed to receive ConfigureNotify event on 0x1600002 compiz (core) - Warn: failed to receive ConfigureNotify event on 0x1400002 Initializing composite options...done Initializing opengl options...done Initializing decor options...done Initializing vpswitch options...done Initializing snap options...done Initializing mousepoll options...done Initializing resize options...done Initializing place options...done Initializing move options...done Initializing wall options...done Initializing grid options...done I/O warning : failed to load external entity "/home/brwright/.compiz/session/10afaca1703486b216133648409481824100000130110002" Initializing session options...done Initializing gnomecompat options...done Initializing animation options...done Initializing fade options...done Initializing unitymtgrabhandles options...done Initializing workarounds options...done Initializing scale options...done compiz (expo) - Warn: failed to bind image to texture Initializing expo options...done Initializing ezoom options...done compiz (core) - Error: Couldn't load plugin '/usr/lib/compiz/libunityshell.so' : /usr/lib/compiz/libunityshell.so: undefined symbol: _ZNK5unity4dash10Controller6windowEv compiz (core) - Error: Couldn't load plugin 'unityshell' compiz (core) - Warn: unhandled ConfigureNotify on 0x7000090! compiz (core) - Warn: this should never happen. you should probably file a bug about this. compiz (core) - Warn: unhandled ConfigureNotify on 0x7000093! compiz (core) - Warn: this should never happen. you should probably file a bug about this. compiz (core) - Warn: unhandled ConfigureNotify on 0x7000096! compiz (core) - Warn: this should never happen. you should probably file a bug about this. compiz (core) - Warn: unhandled ConfigureNotify on 0x7000099! compiz (core) - Warn: this should never happen. you should probably file a bug about this. compiz (core) - Warn: unhandled ConfigureNotify on 0x700009c! compiz (core) - Warn: this should never happen. you should probably file a bug about this. compiz (core) - Warn: unhandled ConfigureNotify on 0x700009f! compiz (core) - Warn: this should never happen. you should probably file a bug about this. 
Initializing annotate options...done Initializing blur options...done Initializing clone options...done Initializing colorfilter options...done Initializing commands options...done Initializing cube options...done Initializing imgjpeg options...done Initializing kdecompat options...done Initializing mag options...done Initializing neg options...done Initializing obs options...done Initializing opacify options...done Initializing put options...done Initializing resizeinfo options...done Initializing ring options...done Initializing rotate options...done Initializing scaleaddon options...done Initializing screenshot options...done Initializing shift options...done Initializing staticswitcher options...done Initializing switcher options...done Initializing thumbnail options...done Initializing unityshell options...done Initializing water options...done Initializing winrules options...done Initializing wobbly options...done Setting Update "main_menu_key" Setting Update "run_key" Starting gtk-window-decorator As you can see, the terminal never returns to the CLI prompt. I have to press Ctrl+C to get back to the prompt, but then the OS is frozen. I have to reboot and run Unity 2D to be able to do anything on my laptop. I hope I have explained this well enough and provided some useful info. I am at a loss to understand what the problem is, or what exactly is causing it. Is it Unity or Compiz? Can anyone help?

    Read the article

  • SQLAuthority News – Top 5 Latest Microsoft Certifications of 2013 – Guest Post

    - by Pinal Dave
    With the IT job market getting more and more competitive by the day, certifications are a must for anyone who wishes to get a strong foothold in the industry. The Microsoft community comes up with regular updates and enhancements to its existing products to keep up with the rapidly evolving requirements of the ICT industry. We bring you a list of five of the latest Microsoft certifications that you must consider acquiring this year. MCSE: SharePoint Learn all about Windows Server 2012 and Microsoft SharePoint 2013, which brings an advanced set of features to the fore in this latest version. It introduces new capabilities for business intelligence, social media, branding, search, identity management, and mobile devices, among other features. Enjoy a great user experience with sharing and collaboration in community forums, within a pixel-perfect SharePoint website. Data connectivity and business intelligence tools allow users to process and access data, analyze reports, and share and collaborate with each other more conveniently. Microsoft Specialist: Microsoft Project 2013 The only project management system that works seamlessly with Microsoft’s other applications and cloud solutions, MS Project 2013 offers more than what meets the eye. It provides for easier management and monitoring of projects so that users can ensure timely delivery while improving productivity significantly. So keep all your projects on track and collaborate with your team like never before with this enhanced release! This one’s a must for all project managers. MCSE Messaging Another one of Microsoft’s gems is its messaging environment, which has also seen a new release in Microsoft Exchange Server 2013. Messaging administrators can take up this training and validate their expertise in Unified Messaging, Exchange Online, PowerShell, and virtualization strategies through the MCSE Messaging certification in Exchange Server. If you wish to enhance the productivity and data security of your organization while remaining flexible and extremely efficient, this is the right certification for you. MCSE Communication An enterprise can function optimally on the strength of its information flow and communication systems. With Lync Server 2013, you can introduce a whole new world of unified communications, consisting of audio/video conferencing, dial-in, Persistent Chat, instant chat, and EDGE services, into your organization. Utilize IT to serve and support business objectives by mastering this UC technology with the latest MCSE Communication course on using Microsoft Lync Server 2013. MCSE: SQL Server 2012 BI Platform The decision-making process is largely influenced by the underlying enterprise information used by management for business intelligence. Therefore, a robust business intelligence platform that anchors enterprise IT and transforms it into operational efficiencies is the need of the hour. The SQL Server 2012 BI Platform certification helps professionals implement, manage, and maintain a BI database infrastructure effectively. IT professionals with BI skills are highly sought after these days. MCSD: Windows Store Apps A Microsoft Certified Solutions Developer certification in Windows Store Apps validates your potential in designing interactive apps. Learn the Essentials of Developing Windows Store Apps Using HTML5 and JavaScript and establish yourself as an ace developer capable of creating fast and fluid Metro style apps for Windows 8 that are accessible on a variety of devices.
You can also go ahead and learn Essentials of Developing Windows Store Apps Using C# if you’re already familiar with, and working in, the C# programming language. Developers are thus free to choose their favorite development stream, which opens the door to the new and exciting application development platform of Windows Store apps. Software developers with these skills are in great demand in the industry today. In order to remain competitive in their respective fields, it is imperative that IT personnel update their knowledge on a regular basis, and certifications are a means to achieve this goal. No longer considered an optional prerequisite, major IT certifications such as these are now essential to stay afloat in a cut-throat industry where technologies change on a daily basis. This blog is written by Aruneet Anand of Koenig Solutions. Koenig Solutions does training for all of the above courses. For more information, visit the website. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology Tagged: Microsoft Certifications

    Read the article

  • Fast Data: Go Big. Go Fast.

    - by Dain C. Hansen
    For those of you who may have missed it, today’s second full day of Oracle OpenWorld 2012 started with a rumpus. Joe Tucci from EMC outlined the human face of big data with real examples of how big data is transforming our world. And no, not the usual tried-and-true weblog examples, but real stories about taxi cab drivers in Singapore using big data to better optimize their routes as well as folks just trying to get a better haircut. Next we heard from Thomas Kurian, who talked at length about the important platform characteristics of Oracle’s Cloud and, more specifically, Oracle’s expanded Cloud Services portfolio. Especially interesting to our integration customers is the messaging support for Oracle’s Cloud applications. What this means is that Oracle’s Cloud applications now have a lightweight integration fabric that on-premise applications can communicate with via REST APIs using Oracle SOA Suite. It’s an important element of our strategy at Oracle, supporting the idea that whether your requirements are for private or public, Oracle has a solution in the Cloud for all of your applications, and we give you more deployment choice than any vendor. If this wasn’t enough to get the juices flowing, later that morning we heard from Hasan Rizvi, who outlined in his Fusion Middleware session the four most important enterprise imperatives: Social, Mobile, Cloud, and a brand new one: Fast Data. Today, Rizvi made an important step in the definition of this term by explaining that he believes it’s a convergence of four essential technology elements: event processing for event filtering and business rules – with Oracle Event Processing; data transformation and loading – with Oracle Data Integrator; real-time replication and integration – with Oracle GoldenGate; and analytics and data discovery – with Oracle Business Intelligence. Each of these four elements can be considered (and architected) together on a single integrated platform that can help customers integrate any type of data (structured, semi-structured), leveraging new styles of big data technologies (MapReduce, HDFS, Hive, NoSQL) to process more volume and variety of data at a faster velocity with greater results. Fast data processing (and especially real-time processing) has always been our credo at Oracle with each one of these products in Fusion Middleware. For example, Oracle GoldenGate continues to be made even faster with the recent 11g R2 release of Oracle GoldenGate, which gives us even greater optimization for Oracle Database with Integrated Capture, as well as some new heterogeneity capabilities. With Oracle Data Integrator with Big Data Connectors, we’re seeing much improved performance by running MapReduce transformations natively on Hadoop systems. And with Oracle Event Processing we’re seeing some remarkable performance with customers like NTT Docomo. Check out their upcoming session at Oracle OpenWorld on Wednesday to hear more about how this customer is using event processing and big data together. If you missed any of these sessions and keynotes, not to worry. There are on-demand versions available on the Oracle OpenWorld website. You can also check out our upcoming webcast where we will outline some of these new breakthroughs in Data Integration technologies for Big Data, Cloud, and Real-time in more detail.

    Read the article

  • SQLAuthority News – Follow up on – Replace a Column Name in Multiple Stored Procedure all together

    - by pinaldave
    Last month I had a fantastic time with lots of puzzles and brain teasers; the amount of participation I received on the blog is indeed inspiring me to write more. One of the blog posts was about how to replace a column name in all the stored procedures. The article had a very interesting conversation as a follow-up. Please read the original article Replace a Column Name in Multiple Stored Procedure all together before reading this blog further as they are connected. Let us start with a few of the interesting comments. SQL Server Expert Imran Mohammed left a wonderful and excellent first note. I suggest all of you read it. Imran stresses data modelling and logical as well as physical design. Developers must create a logical design and get approval on naming conventions, data types, references, constraints, indexes, etc. He further suggested that one should not skip steps but must follow all the industry standards and guidelines. He then extended my blog post with the following note – “Extending Pinal’s answer, what you can do is go to database properties, all tasks, scripts objects, in scripting wizard select all the stored procedure for which you want to change column name, export the query to a new window and then do find and replace, all in once window and execute the script. But make sure you check what you are replacing, sometimes column names are also used in table names, for ex:Table Name: Product and Column Name: ProductId, ProductName”. Thanks Imran, great points! Gatej Alexandru suggested that it is not a good idea to DROP or CREATE but rather to use ALTER, as quite possibly there may be permission issues as well. Very good point; let me see if I can write a blog post about it. Vinay Kumar and SQLStudent144 have proposed another method to achieve the same. I am combining their solutions and writing them here. Step 1. Press Ctrl+T or change to “Result to Text” mode. Step 2. Execute the below command. SELECT 'EXEC sp_helptext [' + referencing_schema_name + '.' + referencing_entity_name + ']' FROM sys.dm_sql_referencing_entities('schema.objectname','OBJECT') Where schema.objectname is the object or table you are searching for. Step 3. Now copy the result and paste it in a new window. Again press Ctrl+T or change to “Result to Text” mode. Step 4. Copy the result and paste it in a new window. Execute the query. Step 5. Copy the result and paste it in a new window. Step 6. Now find the text you are searching for in the script, make the necessary changes, and execute the script. Do not forget to remove any code generated in the result set that is not relevant to the T-SQL script. Digitqr suggests we can do this for other objects besides stored procedures as well. Iosif suggests using the SQL Search tool from RedGate. I guess this sums it up well. We have an alternative perspective on our original issue of replacing a column name in multiple stored procedures. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
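    To make the steps above concrete, here is a minimal sketch of that script-generation approach. It borrows Imran's Product/ProductId example; dbo.Product and the new name ProductNumber are hypothetical placeholders, so substitute your own schema, table, and column, and review every generated script before executing anything.

    -- Step 2: generate an sp_helptext call for every object that references the table
    -- (run with "Result to Text", Ctrl+T, so the output can be copied as a script)
    SELECT 'EXEC sp_helptext [' + referencing_schema_name + '.' + referencing_entity_name + ']'
    FROM sys.dm_sql_referencing_entities('dbo.Product', 'OBJECT');

    -- Steps 3-5: paste the generated EXEC statements into a new window (still in text mode)
    -- and run them; the combined output is the definition of every referencing procedure.

    -- Step 6: in that combined script, find and replace the old column name
    -- (e.g. ProductId -> ProductNumber), change CREATE PROCEDURE to ALTER PROCEDURE
    -- as Gatej Alexandru suggested, strip any non-T-SQL noise from the result set,
    -- and execute the modified script.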

    Read the article

  • How to develop RPG Damage Formulas?

    - by user127817
    I'm developing a classical 2d RPG (in a similar vein to final fantasy) and I was wondering if anyone had some advice on how to do damage formulas/links to resources/examples? I'll explain my current setup. Hopefully I'm not overdoing it with this question, and I apologize if my questions is too large/broad My Characters stats are composed of the following: enum Stat { HP = 0, MP = 1, SP = 2, Strength = 3, Vitality = 4, Magic = 5, Spirit = 6, Skill = 7, Speed = 8, //Speed/Agility are the same thing Agility = 8, Evasion = 9, MgEvasion = 10, Accuracy = 11, Luck = 12, }; Vitality is basically defense to physical attacks and spirit is defense to magic attacks. All stats have fixed maximums (9999 for HP, 999 for MP/SP and 255 for the rest). With abilities, the maximums can be increased (99999 for HP, 9999 for HP/SP, 999 for the rest) with typical values (at level 100) before/after abilities+equipment+etc will be 8000/20,000 for HP, 800/2000 for SP/MP, 180/350 for other stats Late game Enemy HP will typically be in the lower millions (with a super boss having the maximum of ~12 million). I was wondering how do people actually develop proper damage formulas that scale correctly? For instance, based on this data, using the damage formulas for Final Fantasy X as a base looked very promising. A full reference here http://www.gamefaqs.com/ps2/197344-final-fantasy-x/faqs/31381 but as a quick example: Str = 127, 'Attack' command used, enemy Def = 34. 1. Physical Damage Calculation: Step 1 ------------------------------------- [{(Stat^3 ÷ 32) + 32} x DmCon ÷16] Step 2 ---------------------------------------- [{(127^3 ÷ 32) + 32} x 16 ÷ 16] Step 3 -------------------------------------- [{(2048383 ÷ 32) + 32} x 16 ÷ 16] Step 4 --------------------------------------------------- [{(64011) + 32} x 1] Step 5 -------------------------------------------------------- [{(64043 x 1)}] Step 6 ---------------------------------------------------- Base Damage = 64043 Step 7 ----------------------------------------- [{(Def - 280.4)^2} ÷ 110] + 16 Step 8 ------------------------------------------ [{(34 - 280.4)^2} ÷ 110] + 16 Step 9 ------------------------------------------------- [(-246)^2) ÷ 110] + 16 Step 10 ---------------------------------------------------- [60516 ÷ 110] + 16 Step 11 ------------------------------------------------------------ [550] + 16 Step 12 ---------------------------------------------------------- DefNum = 566 Step 13 ---------------------------------------------- [BaseDmg * DefNum ÷ 730] Step 14 --------------------------------------------------- [64043 * 566 ÷ 730] Step 15 ------------------------------------------------------ [36248338 ÷ 730] Step 16 ------------------------------------------------- Base Damage 2 = 49655 Step 17 ------------ Base Damage 2 * {730 - (Def * 51 - Def^2 ÷ 11) ÷ 10} ÷ 730 Step 18 ---------------------- 49655 * {730 - (34 * 51 - 34^2 ÷ 11) ÷ 10} ÷ 730 Step 19 ------------------------- 49655 * {730 - (1734 - 1156 ÷ 11) ÷ 10} ÷ 730 Step 20 ------------------------------- 49655 * {730 - (1734 - 105) ÷ 10} ÷ 730 Step 21 ------------------------------------- 49655 * {730 - (1629) ÷ 10} ÷ 730 Step 22 --------------------------------------------- 49655 * {730 - 162} ÷ 730 Step 23 ----------------------------------------------------- 49655 * 568 ÷ 730 Step 24 -------------------------------------------------- Final Damage = 38635 I simply modified the dividers to include the attack rating of weapons and the armor rating of armor. 
Magic Damage is calculated as follows: Mag = 255, Ultima is used, enemy MDef = 1 Step 1 ----------------------------------- [DmCon * ([Stat^2 ÷ 6] + DmCon) ÷ 4] Step 2 ------------------------------------------ [70 * ([255^2 ÷ 6] + 70) ÷ 4] Step 3 ------------------------------------------ [70 * ([65025 ÷ 6] + 70) ÷ 4] Step 4 ------------------------------------------------ [70 * (10837 + 70) ÷ 4] Step 5 ----------------------------------------------------- [70 * (10907) ÷ 4] Step 6 ------------------------------------ Base Damage = 190872 [cut to 99999] Step 7 ---------------------------------------- [{(MDef - 280.4)^2} ÷ 110] + 16 Step 8 ------------------------------------------- [{(1 - 280.4)^2} ÷ 110] + 16 Step 9 ---------------------------------------------- [{(-279.4)^2} ÷ 110] + 16 Step 10 -------------------------------------------------- [(78064) ÷ 110] + 16 Step 11 ------------------------------------------------------------ [709] + 16 Step 12 --------------------------------------------------------- MDefNum = 725 Step 13 --------------------------------------------- [BaseDmg * MDefNum ÷ 730] Step 14 --------------------------------------------------- [99999 * 725 ÷ 730] Step 15 ------------------------------------------------- Base Damage 2 = 99314 Step 16 ---------- Base Damage 2 * {730 - (MDef * 51 - MDef^2 ÷ 11) ÷ 10} ÷ 730 Step 17 ------------------------ 99314 * {730 - (1 * 51 - 1^2 ÷ 11) ÷ 10} ÷ 730 Step 18 ------------------------------ 99314 * {730 - (51 - 1 ÷ 11) ÷ 10} ÷ 730 Step 19 --------------------------------------- 99314 * {730 - (49) ÷ 10} ÷ 730 Step 20 ----------------------------------------------------- 99314 * 725 ÷ 730 Step 21 -------------------------------------------------- Final Damage = 98633 The problem is that the formulas completely fall apart once stats start going above 255. In particular Defense values over 300 or so start generating really strange behavior. High Strength + Defense stats lead to massive negative values for instance. While I might be able to modify the formulas to work correctly for my use case, it'd probably be easier just to use a completely new formula. How do people actually develop damage formulas? I was considering opening excel and trying to build the formula that way (mapping Attack Stats vs. Defense Stats for instance) but I was wondering if there's an easier way? While I can't convey the full game mechanics of my game here, might someone be able to suggest a good starting place for building a damage formula? Thanks
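For what it's worth, the FFX-style physical damage calculation walked through above collapses to roughly the following (my own compact restatement of the listed steps, not an official formula):

\[
\text{BaseDmg} = \left(\frac{\text{Str}^3}{32} + 32\right)\cdot\frac{\text{DmCon}}{16},
\qquad
\text{DefNum} = \frac{(\text{Def}-280.4)^2}{110} + 16
\]
\[
\text{Damage} \approx \text{BaseDmg}\cdot\frac{\text{DefNum}}{730}\cdot\frac{730 - \left(51\,\text{Def} - \text{Def}^2/11\right)/10}{730}
\]

Written out this way, part of the scaling problem is easier to see: both defense terms are quadratics centered near Def ≈ 280, so once Def climbs past roughly 280–300 the mitigation curve bends back on itself, which may explain the strange behavior reported above for Defense values over 300.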

    Read the article

  • Vitality of Product Information Management Showcased at OpenWorld 2012

    - by Mala Narasimharajan
     By Sachin Patel Can you hear the countdown clock ticking!! OpenWorld 2012 is almost here and as I write this Oracle is buzzing with fresh new ideas and solutions that will be showcased this year. What an exciting time for all of us to be in the midst of a digital revolution. Whether it is Apple fans clamoring to find every new feature that has been added to the iPhone 5 or a startup launching a new digital thermostat (has anyone looked at the new one from Nest?), product information is vital for companies to grow and compete in this cut-throat market. Customers today struggle to aggregate and enrich this product data from the myriad of systems they have in place to run their businesses and operations. Having a product information strategy is paramount to aligning your sales channels and operations with the most accurate and up-to-date product data. We have a number of sessions this year at OpenWorld where you can gain more insight into how Oracle’s next generation of Fusion Applications, in this case Fusion Product Hub, can provide you with a solution to streamline and get control of your product master data. Enabling Trusted Enterprise Product Data with Oracle Fusion Product Hub – Tuesday, October 2nd, 11:45 am, Moscone West 2022. Join me, Sachin Patel, Director of Product Strategy, and Milan Bhatia, VP of Development, as we discuss how you can enable trusted product master data in your enterprise. In this session we plan to cover the challenges companies face today in mastering product data. The discussion will also include how Fusion Product Hub brings new and innovative features to empower your product data owners to create a holistic and rich product definition that can be leveraged across your enterprise. We will also be joined by Pawel Fidelus from Fideltronik, an Early Adopter for Fusion Product Hub, who will showcase their plans to implement Fusion Product Hub and the value it will bring to Fideltronik. Multichannel Fulfillment Excellence in the Direct-to-Consumer Market – Thursday, October 4th, 12:45 am, Moscone West 2024. Do you have multiple order capture systems? Do you have difficulty in fulfilling orders for your customers across various channels and suppliers? Mark Carson, Director, Fusion DOO, and Brad Kerr, Director, AGSS, will be showcasing the Fusion Distributed Order Orchestration solution and how companies can orchestrate orders from multiple order capture systems and route them to the appropriate fulfillment system. Sachin Patel, Director of Product Strategy for Product MDM, will highlight the business pain points in consolidating and commercializing data from a multichannel commerce point of view, and how Fusion Product Hub helps you provide a single source of truth to drive a singular and rich customer experience. Oracle Fusion Supply Chain Management: Customer Adoption and Experiences – Wednesday, October 3rd, 10:15 am, Moscone West 2003. This is a great session to attend to learn about how Fusion Supply Chain Management and Fusion Product Hub Early Adopters, including Boeing and Fideltronik, are leveraging Fusion Applications to improve their supply chain operations. Have a great OpenWorld and see you soon!!

    Read the article

  • Entity Framework 4, WCF &amp; Lazy Loading Tip

    - by Dane Morgridge
    If you are doing any work with Entity Framework and custom WCF services in EFv1, everything works great.  As soon as you jump to EFv4, you may find yourself getting odd errors that you can’t seem to catch.  The problem almost always has something to do with the new lazy loading feature in Entity Framework 4.  With Entity Framework 1, you didn’t have lazy loading so this problem didn’t surface.  Assume I have a Person entity and an Address entity where there is a one-to-many relationship between Person and Address (Person has many Addresses). In Entity Framework 1 (or in EFv4 with lazy loading turned off), I would have to load the Address data by hand by either using the Include or Load method: var people = context.People.Include("Addresses"); or people.Addresses.Load(); Lazy loading kicks in the first time the Person.Addresses collection is accessed: var people = context.People.ToList(); // only person data is currently in memory foreach(var person in people) { // EF determines that no Address data has been loaded and lazy loads int count = person.Addresses.Count(); } Lazy loading has the useful (and sometimes not useful) feature of fetching data when requested.  It can make your life easier or it can make it a big pain.  So what does this have to do with WCF?  One word: serialization. When you need to pass data over the wire with WCF, the data contract is serialized into either XML or binary depending on the binding you are using.  Well, if I am using lazy loading, the Person entity gets serialized and during that process, the Addresses collection is accessed.  When that happens, the Address data is lazy loaded.  Then the Address is serialized, and the Person property is accessed, and then also serialized, and then the Addresses collection is accessed.  Now the second time through, lazy loading doesn’t kick in, but you can see the infinite loop caused by this process.  This is a problem with any serialization, but I personally ran into it trying to use WCF. The fix for this is to simply turn off lazy loading.  This can be done at each call by using context options: context.ContextOptions.LazyLoadingEnabled = false; Turning lazy loading off will now allow your classes to be serialized properly.  Note, this is if you are using the standard Entity Framework classes.  If you are using POCO, you will have to do something slightly different.  With POCO, the Entity Framework will create proxy classes by default that allow things like lazy loading to work with POCO.  This basically creates a proxy object that is a full Entity Framework object sitting between the context and the POCO object.  When using POCO with WCF (or any serialization), just turning off lazy loading doesn’t cut it.  You have to turn off proxy creation to ensure that your classes will serialize properly: context.ContextOptions.ProxyCreationEnabled = false; The nice thing is that you can do this on a call-by-call basis.  If you use a new context for each set of operations (which you should), then you can turn either lazy loading or proxy creation on and off as needed.

    Read the article

  • Get to Know a Candidate (9 of 25): Gary Johnson – Libertarian Party

    - by Brian Lanham
    DISCLAIMER: This is not a post about “Romney” or “Obama”. This is not a post for whom I am voting. Information sourced for Wikipedia. Johnson served as the 29th Governor of New Mexico from 1995 to 2003, as a member of the Republican Party, and is known for his low-tax libertarian views and his strong emphasis on personal health and fitness. While a student at the University of New Mexico in 1974, Johnson sustained himself financially by working as a door-to-door handyman. In 1976 he founded Big J Enterprises, which grew from this one-person venture to become one of New Mexico's largest construction companies. He entered politics for the first time by running for Governor of New Mexico in 1994 on a fiscally conservative, low-tax, anti-crime platform. Johnson won the Republican Party of New Mexico's gubernatorial nomination, and defeated incumbent Democratic governor Bruce King by 50% to 40%. He cut the 10% annual growth in the budget: in part, due to his use of the gubernatorial veto 200 times during his first six months in office, which gained him the nickname "Governor Veto". Johnson sought re-election in 1998, winning by 55% to 45%. In his second term, he concentrated on the issue of school voucher reforms, as well as campaigning for marijuana decriminalization and opposition to the War on Drugs. During his tenure as governor, Johnson adhered to a stringent anti-tax and anti-bureaucracy policy driven by a cost–benefit analysis rationale, setting state and national records for his use of veto powers: more than the other 49 contemporary governors put together. Term-limited, Johnson could not run for re-election at the end of his second term. As a fitness enthusiast, Johnson has taken part in several Ironman Triathlons, and he climbed Mount Everest in May 2003. After leaving office, Johnson founded the non-profit Our America Initiative in 2009, a political advocacy committee seeking to promote policies such as free enterprise, foreign non-interventionism, limited government and privatization. The Libertarian Party is the third largest political party in the United States. It is also identified by many as the fastest growing political party in the United States. The political platform of the Libertarian Party reflects the ideas of libertarianism, favoring minimally regulated markets, a less powerful state, strong civil liberties (including support for Same-sex marriage and other LGBT rights), cannabis legalization and regulation, separation of church and state, open immigration, non-interventionism and neutrality in diplomatic relations (i.e., avoiding foreign military or economic entanglements with other nations), freedom of trade and travel to all foreign countries, and a more responsive and direct democracy. Members of the Libertarian Party have also supported the repeal of NAFTA, CAFTA, and similar trade agreements, as well as the United States' exit from the United Nations, WTO, and NATO. Although there is not an officially labeled political position of the party, it is considered by many to be more right-wing than the Democratic Party but more left-wing than the Republican Party when comparing the parties' positions to each other, placing it at or above the center. In the 30 states where voters can register by party, there are over 282,000 voters registered as Libertarians. Hundreds of Libertarian candidates have been elected or appointed to public office, and thousands have run for office under the Libertarian banner. 
The Libertarian Party has many firsts to its credit, such as being the first party to get an electoral vote for a woman in a United States presidential election. Learn more about Gary Johnson and the Libertarian Party on Wikipedia.

    Read the article

  • HSSFS Part 2.1 - Parsing @@VERSION

    - by Most Valuable Yak (Rob Volk)
    For Part 2 of the Handy SQL Server Function Series I decided to tackle parsing useful information from the @@VERSION function, because I am an idiot.  It turns out I was confused about CHARINDEX() vs. PATINDEX() and it pretty much invalidated my original solution.  All is not lost though, this mistake turned out to be informative for me, and hopefully for you. Referring back to the "Version" view in the prelude I started with the following query to extract the version number: SELECT DISTINCT SQLVersion, SUBSTRING(VersionString,PATINDEX('%-%',VersionString)+2, 12) VerNum FROM VERSION I used PATINDEX() to find the first hyphen "-" character in the string, since the version number appears 2 positions after it, and got these results: SQLVersion VerNum ----------- ------------ 2000 8.00.2055 (I 2005 9.00.3080.00 2005 9.00.4053.00 2008 10.50.1600.1 As you can see it was good enough for most of the values, but not for the SQL 2000 @@VERSION.  You'll notice it has only 3 version sections/octets where the others have 4, and the SUBSTRING() grabbed the non-numeric characters after.  To properly parse the version number will require a non-fixed value for the 3rd parameter of SUBSTRING(), which is the number of characters to extract. The best value is the position of the first space to occur after the version number (VN), the trick is to figure out how to find it.  Here's where my confusion about PATINDEX() came about.  The CHARINDEX() function has a handy optional 3rd parameter: CHARINDEX (expression1 ,expression2 [ ,start_location ] ) While PATINDEX(): PATINDEX ('%pattern%',expression ) Does not.  I had expected to use PATINDEX() to start searching for a space AFTER the position of the VN, but it doesn't work that way.  Since there are plenty of spaces before the VN, I thought I'd try PATINDEX() on another character that doesn't appear before, and tried "(": SELECT SQLVersion, SUBSTRING(VersionString,PATINDEX('%-%',VersionString)+2, PATINDEX('%(%',VersionString)) FROM VERSION Unfortunately this messes up the length calculation and yields: SQLVersion VerNum ----------- --------------------------- 2000 8.00.2055 (Intel X86) Dec 16 2008 19:4 2005 9.00.3080.00 (Intel X86) Sep 6 2009 01: 2005 9.00.4053.00 (Intel X86) May 26 2009 14: 2008 10.50.1600.1 (Intel X86) Apr 2008 10.50.1600.1 (X64) Apr 2 20 Yuck.  The problem is that PATINDEX() returns position, and SUBSTRING() needs length, so I have to subtract the VN starting position: SELECT SQLVersion, SUBSTRING(VersionString,PATINDEX('%-%',VersionString)+2, PATINDEX('%(%',VersionString)-PATINDEX('%-%',VersionString)) VerNum FROM VERSION And the results are: SQLVersion VerNum ----------- -------------------------------------------------------- 2000 8.00.2055 (I 2005 9.00.4053.00 (I Msg 537, Level 16, State 2, Line 1 Invalid length parameter passed to the LEFT or SUBSTRING function. Ummmm, whoops.  Turns out SQL Server 2008 R2 includes "(RTM)" before the VN, and that causes the length to turn negative. So now that that blew up, I started to think about matching digit and dot (.) patterns.  Sadly, a quick look at the first set of results will quickly scuttle that idea, since different versions have different digit patterns and lengths. At this point (which took far longer than I wanted) I decided to cut my losses and redo the query using CHARINDEX(), which I'll cover in Part 2.2.  
So, to do a little post-mortem on this technique: PATINDEX() doesn't have the flexibility to match the digit pattern of the version number; PATINDEX() doesn't have a "start" parameter like CHARINDEX(), which would allow us to skip over parts of the string; and the SUBSTRING() expression is getting pretty complicated for this relatively simple task! This doesn't mean that PATINDEX() isn't useful; it's just not a good fit for this particular problem.  I'll include a version in the next post that extracts the version number properly. UPDATE: Sorry if you saw the unformatted version of this earlier; I'm on a quest to find blog software that ACTUALLY WORKS.
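Until Part 2.2 arrives, here is a rough sketch of what the CHARINDEX() version can look like (my own illustration against the same Version view, assuming the version number always follows the first "- " and ends at the next space; not necessarily the final published solution):

-- CHARINDEX()'s optional third parameter (start_location) lets us search for the first
-- space that occurs AFTER the start of the version number, which PATINDEX() cannot do.
SELECT DISTINCT SQLVersion,
       SUBSTRING(VersionString,
                 CHARINDEX('-', VersionString) + 2,
                 CHARINDEX(' ', VersionString, CHARINDEX('-', VersionString) + 2)
                   - (CHARINDEX('-', VersionString) + 2)) AS VerNum
FROM VERSION;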

    Read the article

  • The enterprise vendor con - connecting SSD's using SATA 2 (3Gbits) thus limiting their performance

    - by tonyrogerson
    When comparing SSD against Hard drive performance it really makes me cross when folk think comparing an array of SSD running on 3GBits/sec to hard drives running on 6GBits/second is somehow valid. In a paper from DELL (http://www.dell.com/downloads/global/products/pvaul/en/PowerEdge-PowerVaultH800-CacheCade-final.pdf) on increasing database performance using the DELL PERC H800 with Solid State Drives they compare four SSD drives connected at 3Gbits/sec against ten 10Krpm drives connected at 6Gbits [Tony slaps forehead while shouting DOH!]. It is true in the case of hard drives it probably doesn’t make much difference 3Gbit or 6Gbit because SAS and SATA are both end to end protocols rather than shared bus architecture like SCSI, so the hard drive doesn’t share bandwidth and probably can’t get near the 600MiBytes/second throughput that 6Gbit gives unless you are doing contiguous reads, in my own tests on a single 15Krpm SAS disk using IOMeter (8 worker threads, queue depth of 16 with a stripe size of 64KiB, an 8KiB transfer size on a drive formatted with an allocation size of 8KiB for a 100% sequential read test) I only get 347MiBytes per second sustained throughput at an average latency of 2.87ms per IO equating to 44.5K IOps, ok, if that was 3GBits it would be less – around 280MiBytes per second, oh, but wait a minute [...fingers tap desk] You’ll struggle to find in the commodity space an SSD that doesn’t have the SATA 3 (6GBits) interface, SSD’s are fast not only low latency and high IOps but they also offer a very large sustained transfer rate, consider the OCZ Agility 3 it so happens that in my masters dissertation I did the same test but on a difference box, I got 374MiBytes per second at an average latency of 2.67ms per IO equating to 47.9K IOps – cost of an 240GB Agility 3 is £174.24 (http://www.scan.co.uk/products/240gb-ocz-agility-3-ssd-25-sata-6gb-s-sandforce-2281-read-525mb-s-write-500mb-s-85k-iops), but that same drive set in a box connected with SATA 2 (3Gbits) would only yield around 280MiBytes per second thus losing almost 100MiBytes per second throughput and a ton of IOps too. So why the hell are “enterprise” vendors still only connecting SSD’s at 3GBits? Well, my conspiracy states that they have no interest in you moving to SSD because they’ll lose so much money, the argument that they use SATA 2 doesn’t wash, SATA 3 has been out for some time now and all the commodity stuff you buy uses it now. Consider the cost, not in terms of price per GB but price per IOps, SSD absolutely thrash Hard Drives on that, it was true that the opposite was also true that Hard Drives thrashed SSD’s on price per GB, but is that true now, I’m not so sure – a 300GByte 2.5” 15Krpm SAS drive costs £329.76 ex VAT (http://www.scan.co.uk/products/300gb-seagate-st9300653ss-savvio-15k3-25-hdd-sas-6gb-s-15000rpm-64mb-cache-27ms) which equates to £1.09 per GB compared to a 480GB OCZ Agility 3 costing £422.10 ex VAT (http://www.scan.co.uk/products/480gb-ocz-agility-3-ssd-25-sata-6gb-s-sandforce-2281-read-525mb-s-write-410mb-s-30k-iops) which equates to £0.88 per GB. Ok, I compared an “enterprise” hard drive with a “commodity” SSD, ok, so things get a little more complicated here, most “enterprise” SSD’s are SLC and most commodity are MLC, SLC gives more performance and wear, I’ll talk about that another day. For now though, don’t get sucked in by vendor marketing, SATA 2 (3Gbit) just doesn’t cut it, SSD need 6Gbit to breath and even that SSD’s are pushing. 
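    For reference, the raw line-rate arithmetic behind that 3Gbit ceiling (my own back-of-the-envelope figures, assuming SATA's usual 8b/10b encoding and ignoring protocol overhead):

    \[
    3\,\mathrm{Gbit/s}\times\tfrac{8}{10}=2.4\,\mathrm{Gbit/s}=300\,\mathrm{MB/s}\approx 286\,\mathrm{MiB/s},
    \qquad
    6\,\mathrm{Gbit/s}\times\tfrac{8}{10}=4.8\,\mathrm{Gbit/s}=600\,\mathrm{MB/s}\approx 572\,\mathrm{MiB/s}
    \]

    So a drive like the Agility 3, quoted at around 525MB/s sequential reads, is already pressing against what a 6Gbit link can deliver, and is hopelessly throttled on a 3Gbit one.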
Alas, SSD’s are connected using SATA so all the controllers I’ve seen thus far from HP and DELL only do SATA 2 – deliberate? Well, I’ll let you decide on that one.

    Read the article

  • WebCenter Customer Spotlight: Texas Industries, Inc.

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter. Solution Summary: Texas Industries, Inc. (TXI) is a leading supplier of cement, aggregate, and consumer product building materials for residential, commercial, and public works projects. TXI is based in Dallas and employs around 2,000 people. The customer had the challenge of decentralized and manual processes for entering 180,000 vendor invoices annually. Invoice entry was a time- and resource-intensive process that entailed significant personnel requirements. TXI implemented a centralized solution leveraging Oracle WebCenter Imaging, a smart routing solution that enables users to capture invoices electronically with Oracle WebCenter Capture and Oracle WebCenter Forms Recognition and send the invoices through to Oracle Financials for approvals and processing. TXI significantly lowered resource needs for payables processing, increased productivity by 80%, and reduced invoice processing cycle times by 84%—from 20 to 30 days to just 3 to 5 days, on average. Company Overview: Texas Industries, Inc. (TXI) is a leading supplier of cement, aggregate, and consumer product building materials for residential, commercial, and public works projects. With operating subsidiaries in six states, TXI is the largest producer of cement in Texas and a major producer in California. TXI is a major supplier of stone, sand, gravel, and expanded shale and clay products, and one of the largest producers of bagged cement and concrete products in the Southwest. Business Challenges: TXI had the challenge of decentralized and manual processes for entering 180,000 vendor invoices annually. Invoice entry was a time- and resource-intensive process that entailed significant personnel requirements. Their business objectives were to: increase the efficiency of core business processes, such as invoice processing, to support the organization’s desire to maintain its role as the Southwest’s leader in delivering high-quality, low-cost products to the construction industry; meet the audit and regulatory requirements for achieving Sarbanes-Oxley (SOX) compliance; and streamline entry of 180,000 invoices annually to accelerate processing, reduce errors, cut invoice storage and routing costs, and increase visibility into payables liabilities. Solution Deployed: TXI replaced a resource-intensive, paper-based, decentralized process for invoice entry with a centralized solution leveraging Oracle WebCenter Imaging 11g. They worked with the Oracle Partner Keste LLC to develop a smart routing solution that enables users to capture invoices electronically with Oracle WebCenter Capture and then uses Oracle WebCenter Forms Recognition and the Oracle WebCenter Imaging workflow to send the invoices through to Oracle Financials for approvals and processing.
Business Results: Significantly lowered resource needs for payables processing through centralization and improved efficiency; enabled the company to process invoices faster and pay bills earlier, allowing it to take advantage of additional vendor discounts; tracked to increase productivity by 80% and reduce invoice processing cycle times by 84%—from 20 to 30 days to just 3 to 5 days, on average; achieved a 25% reduction in paper invoice storage costs now that invoices are captured digitally; and enabled a 50% reduction in shipping costs, as the company no longer has to send paper invoices between headquarters and production facilities for approvals. “Entering and manually processing more than 180,000 vendor invoices annually was time and labor intensive. With Oracle Imaging and Process Management, we have automated and centralized invoice entry and processing at our corporate office, improving productivity by 80% and reducing invoice processing cycle times by 84%—a very important efficiency gain.” – Terry Marshall, Vice President of Information Services, Texas Industries, Inc. Additional Information: TXI Customer Snapshot, Oracle WebCenter Content, Oracle WebCenter Capture, Oracle WebCenter Forms Recognition

    Read the article
