Search Results

Search found 12224 results on 489 pages for 'map editor'.

  • Play Thousands of Online Radio Stations with Shoutcast in VLC

    - by DigitalGeekery
    Are you looking for more variety from your radio stations? Today we’ll take a look at how to easily stream thousands of radio stations to your desktop with VLC media player.

    Getting Started
    Select Media from the menu, go to Services Discovery, and click Shoutcast radio listings. Next, select View from the menu and click Playlist, or click on the Show Playlist button. In the Playlist window, click on Shoutcast radio listings in the left pane. You should then see a very long list of Titles displayed on the right.

    Scroll through the list to find a music genre or topic that interests you. Double-click to expand the list of station options. Select one of the channel listings from the list and double-click to begin playing. Looking for a specific station? Type a search term into the search filter box to see if it is available.

    That’s it. Sit back and enjoy listening to your favorite Internet radio programming. If you are a music or talk radio fan, you aren’t likely to run out of listening options in VLC. Want to find some more uses for VLC? Check out our articles on how to copy a DVD, convert video files to MP3, and how to set a video as your desktop wallpaper.

    Download VLC

    Read the article

  • Making the WPeFfort

    - by Laila
    Microsoft Visual Studio 2010 will be launched on April 12th. The basic layout looks pretty much as it did, so it is not immediately obvious on first inspection that it was completely rewritten in the Windows Presentation Foundation (WPF). The current VS 2008 codebase had reached the end of its life; it was getting slow to initialize and sluggish to run, and was never going to allow for multi-monitor support or easier extensibility. It can't have been an easy decision to rewrite Visual Studio, but the gamble seems to have paid off. Although certain bugs in the betas caused some anxiety about performance, these seem to have been fixed, and the new Visual Studio is definitely faster.

    In rewriting the codebase, it has been possible to make obvious improvements, such as being able to run different windows on different monitors, and only being presented with the Toolbox controls and References that are appropriate to your target .NET version. There is also an IntelliTrace debugger, and Intellisense has been improved by separating a 'Suggestion Mode' and a 'Completion Mode' (with its 'Generate From…', 'Highlight References…', and 'Navigate To…' features). At the same time, there has been quite a clearout; certain features that had been tucked away in previous versions, such as Brief or Emacs emulation support, have been dropped. (Yes, they were being used!)

    There are a lot of features that didn't require the rewrite, but are welcome. It is now easier to develop WPF applications (e.g. drag-and-drop databinding), and there is support for Azure. There are more, and better, templates, and the design tools are greatly improved (e.g. Expression Web, Expression Blend, WPF Sketchflow, the Silverlight designer, Document Map Margin and Inline Call Hierarchy). SharePoint is better supported, and Office apps will benefit from C#'s support for optional and named arguments, and from allowing several Office solutions within a deployment package.

    Most importantly, it is a vote of confidence in WPF. VS 2010 is the essential missing component that has been impeding the faster adoption of WPF. The fact that it is actually now written in WPF should reassure the doubters, and convince more developers to make the move from WinForms to WPF. In using WPF, the developers of Visual Studio have had the clout to fix some issues which have been bothering WPF developers for some time (such as blurred text). Do you see a brighter future as a result of transferring from WinForms to WPF? I'd love to know what you think. Cheers, Laila

    Read the article

  • Use of Business Parameters in BPM12c

    - by Abhishek Mittal-Oracle
    With the release of BPM12c, a new feature called Business Parameters is introduced, through which we can define a business parameter that behaves like a global variable available within a BPM project. The Business Administrator can be the one responsible for modifying business parameter values dynamically at run-time, which may change the flow of the BPM processes in which they are used. This feature was part of the BPM10g product and was extensively used; it is not currently present in BPM11g.

    Business Parameters can be defined in 2 ways:
    1. Using JDeveloper to define business parameters, and
    2. Using BPM workspace to define business parameters.

    It is important to note that business parameters need to be mapped to a valid organization unit defined in the BPM project. If that is not handled, exceptions like 'BPM-70702' will be thrown by the BPM Engine. This is because business parameters work along with the organization defined in a BPM project. At the same time, we can use the same business parameter across different organization units with different values. Business Parameters in BPM12c can hold multiple values for the different organization units defined in a single BPM project. This enables the business to re-use the same business parameters defined in a BPM project across different organizations.

    Business parameters can be defined using the following data types:
    1. int
    2. string
    3. boolean
    4. double

    While defining a business parameter, it is mandatory to provide a default value.

    Below are the steps to define a business parameter in JDeveloper:
    Step 1: Open 'Organization' and click on the 'Business Parameters' tab.
    Step 2: Click on the '+' button.
    Step 3: Add the business parameter name and type, and provide a default value (mandatory).
    Step 4: Click on the 'OK' button.
    Step 5: The business parameter is defined.

    Below are the steps to define a business parameter in BPM workspace:
    Step 1: Log in to BPM workspace using the admin username and password.
    Step 2: Click on 'Administration' at the top right side of the workspace.
    Step 3: Click on 'Business Parameters' in the left navigation panel under 'Organization'.
    Step 4: Click on the '+' button.
    Step 5: Add the business parameter name and type, and provide a default value (mandatory).
    Step 6: Click on the 'OK' button.
    Step 7: The business parameter is defined.

    Note: As mentioned earlier in the blog, it is necessary to define and map a valid organization ID to the predefined variable 'organizationalUnit' under data associations in a BPM process before the business parameter is used. I have created a sample PoC demonstrating the use of Business Parameters in BPM12c and it can be found here.

    Read the article

  • Enhance Your Browsing with Hyperwords in Firefox

    - by Asian Angel
    While browsing it is easy to find information that you would like to know more about, convert, or translate. The Hyperwords extension provides access to these types of resources and more in Firefox.

    Using Hyperwords
    Once the extension has finished installing you will be presented with a demo video that will let you learn more about how Hyperwords works. For our first example we chose to look for more information concerning “WASP-12b” using Wikipedia. Notice the small bluish circle on the lower right of the highlighted term…it is the default access point for the Hyperwords menu (access it by hovering your mouse over it). If you hover over the Wikipedia (or other) link you can access the information in a scrollable popup window. Or, if you prefer, click on the link to view the information in a new tab. Choose the style that best suits your needs.

    Hyperwords is extremely useful for quick unit conversions. Suppose you want to share a news story that you have found while browsing. Highlight the title, access Hyperwords, and choose your preferred sharing source. You may need to authorize access for Hyperwords to post to your account. Once you have authorized access you can start sharing those links very easily. This is just a small sampling of Hyperwords’ many useful features.

    Preferences
    Hyperwords has a nice set of preferences available to help you customize it. Alter the menu popup style, add or remove menu entries, and modify other functions for Hyperwords.

    Conclusion
    Hyperwords makes a nice addition to Firefox for anyone needing quick access to search, reference, translation, and other services while browsing.

    Links
    Download the Hyperwords extension (Mozilla Add-ons)
    Download the Hyperwords extension (Extension Homepage)

    Read the article

  • Building a Data Mart with Pentaho Data Integration Video Review by Diethard Steiner, Packt Publishing

    - by Compudicted
    Originally posted on: http://geekswithblogs.net/Compudicted/archive/2014/06/01/building-a-data-mart-with-pentaho-data-integration-video-review.aspx

    The Building a Data Mart with Pentaho Data Integration video by Diethard Steiner from Packt Publishing is more than just a course on how to use Pentaho Data Integration; it also implements and uses the principles of Data Warehousing (I even heard the name of Ralph Kimball in the video). Indeed, a viewer should be familiar with concepts such as the Star Schema, Slowly Changing Dimension types, etc., so before watching this course I suggest skimming through the Data Warehouse concepts (if unfamiliar) or, even better, reading Ralph's excellent The Data Warehouse Toolkit. By the way, the author expands beyond using Pentaho alone to MySQL and MonetDB, which is real icing on the cake! Indeed, I would even suggest the course should be named 'Building a Data Warehouse with Pentaho'.

    To successfully complete the course one needs to know some Linux (Ubuntu is used in the course), the vi editor and the Bash command shell, but it seems that similar requirements would also apply on the Windows OS. Additionally, knowing some basic SQL would not hurt. As I said, MonetDB is used in this course several times; it seems to be no more complex than, say, MySQL, but based on what I read it is very well suited to fast querying of big volumes of data thanks to its column store (vertical data storage). I don't see what else could be a barrier; the material is very digestible. On this note, I must add that the author does not cover how to acquire the software, so here is what I found may help:
    - Pentaho: the free Community Edition should be more than anyone needs to learn it, or even to go into a POC.
    - MonetDB can be downloaded (it exists for both Linux and Windows) from http://goo.gl/FYxMy0 (just see the appropriate link on the left).
    - The author seems to be using Eclipse to run SQL code; one can get it from http://goo.gl/5CcuN.
    - To create or edit database entities and/or schema otherwise, one can use a universal tool called SQuirreL; get it from http://squirrel-sql.sourceforge.net.

    Next, I must confess Diethard is very knowledgeable in what he does and beyond. There is some accent to be heard, especially if one's mother tongue is English, but I got over it within a few chapters. I liked the rate at which the material is presented; it made me feel I paid for every second.

    Eventually, my impressions are:
    - Pentaho is an awesome ETL offering, and it is very much worth learning (I am an ETL fan and a heavy user of SSIS)
    - MonetDB is nice; it tickles my fancy to know it more
    - Data Warehousing, despite all the Big Data tool offerings (Hive, Sqoop, Pig on Hadoop), still rocks using the traditional tools
    - Chapters 2 to 6 were the most fun to me, with chapter 8 being the most difficult.

    In closing, I highly recommend this video to anyone who needs to grasp Pentaho concepts quickly; likewise, the course is very well suited for any developer on a "supposed to be done yesterday" type of project. It is for a beginner to intermediate level ETL/DW developer, but one would need to learn more on Data Warehousing and Pentaho; for that I recommend the 5-star Pentaho Data Integration 4 Cookbook. Enjoy it!

    Disclaimer: I received this video from the publisher for the purpose of a public review.

    Read the article

  • Excel tables creation upon MySQL data import (new feature in MySQL for Excel 1.2.x)

    - by Javier Treviño
    In this blog post we are going to talk about one of the features included since MySQL for Excel 1.2.0. You can install the latest GA or maintenance version using the MySQL Installer, or optionally you can download any GA or non-GA version directly from the MySQL Developer Zone.

    Remember how easy it is to dump data from a MySQL table, view or stored procedure to an Excel worksheet? (If you don't, you can check out this other post: How To - Guide to Importing Data from a MySQL Database to Excel using MySQL for Excel.) In version 1.2.0 we introduced some advanced options for the Import MySQL Data operation regarding Excel tables. The Advanced Options dialog shown above is accessible from any Import Data dialog.

    When the Create an Excel table for the imported MySQL table data option is checked (which it is by default), MySQL for Excel will create an Excel table (also known in Excel jargon as a ListObject) from the Excel range containing the imported MySQL data. This "little feature" enables immediate use of the Excel table in data analysis, such as including it for summarization in a PivotTable, adding a summarization row at the end of the table's data, or sorting and filtering the table's data by clicking the drop-down button next to each column's header, among other actions.

    The Excel tables that are created automatically from imported MySQL data will have a name like [UserPrefix].<SchemaName>.<DbObjectName> for tables and views, and <Prefix>.<SchemaName>.<ProcedureName>.<ResultSetName> for stored procedures. Notice the first piece of the name is an optional [UserPrefix]; the prefix is only used if the Prefix Excel tables with the following text option is checked. Notice that the suggested prefix is "MySQL", but it can be changed to whatever text is suitable for you.

    Excel tables must have a table style so they are easily identified. There are a lot of predefined Excel table styles; by default the MySqlDefault style is applied, which is the style you have seen applied to imported data for Edit Sessions, and which adds simple and elegant formatting to the table. If you wish to change it to any of the predefined Excel table styles you can do so through the drop-down list on the Use style [[styles drop-down]] for the new Excel table option.

    Excel tables are the basic building blocks for data analysis or self-service Business Intelligence using other, more advanced Excel tools like Power Pivot, Power View or Power Map. This feature empowers imported MySQL data to be used in more advanced ways.

    We hope you give this and the other new features in the 1.2.x version family a try! Remember that your feedback is very important for us, so drop us a message and follow us:
    - MySQL on Windows (this) Blog: https://blogs.oracle.com/MySqlOnWindows/
    - MySQL for Excel forum: http://forums.mysql.com/list.php?172
    - Facebook: http://www.facebook.com/mysql
    - YouTube channel: https://www.youtube.com/user/MySQLChannel

    Cheers!

    Read the article

  • Why SQL Developer Rocks for the Advanced User Too

    - by thatjeffsmith
    While SQL Developer may be ‘perfect for Oracle beginners,’ that doesn’t preclude advanced and intermediate users from getting their fair share of toys! I’ve been working with Oracle since the 7.3.4 days, and I think it’s pretty safe to say that the WAY an ‘old timer’ uses a tool like SQL Developer is radically different than the ‘beginner.’ If you’ve been reluctant to use SQL Developer because it’s a GUI, give me a few minutes to try to convince you it’s worth a second (or third) look.

    1. Help when you want it, and only when you want it
    One of the biggest gripes any user has with a piece of software is when said software can’t get out of its own way. When you’re typing in a word processor, sometimes you can do without the grammar and spelling checks, the offer to auto-complete your words, and all of the additional mark-up. This drives folks to programs like Notepad++ and vi. You can disable the code insight feature so you can type unmolested by SQL Developer’s attempt to auto-complete your object names. Now, if you happen to come across a long or hard to spell object name, you can still invoke the feature on demand using Ctrl+Spacebar.
    Code Editor – Completion Insight – Enable Completion Auto-Popup (keyword being Auto)

    2. Automatic File Tracking
    SQL*Minus is nice. Vi is cool. Notepad++ has a lot of features I like. But not too many editors offer automatic logging of changes to your files without having to set up a source control system. I was doing some work on my login.sql. I’m not doing anything crazy, but seeing what I had done in previous iterations was helpful. Now imagine how nice it would be to have this available for your 1,000+ line scripts! Track your scripts as they change, no setup required!

    3. Extend the Functionality
    Know SQL and XML? Wish SQL Developer did JUST a little bit more? Build your own extensions. You can have custom context menus and object pages in just a few minutes. This is an example of lazy developers writing code that writes code.

    4. Get Your Money’s Worth
    You’ve licensed Enterprise Edition. You got your Diagnostic and Tuning packs. Now start using them! Not everyone has access to Enterprise Manager, especially developers. But that doesn’t mean they don’t need help with troubleshooting and optimizing poorly performing SQL statements. ASH, AWR, Real-Time SQL Monitoring and the SQL Tuning Advisor are built into the Reports and Worksheet. Yes, you could make the package calls, but that’s a whole lot of typing, and I’d rather just get to the results.

    5. Profile, Debug, & Unit Testing PLSQL
    An Interactive Development Environment (IDE) built by the same folks that own the programming language (Hello – Oracle PLSQL!) should be complete. It should ‘hug’ the developer and empower them to churn out programs that work, run fast, and are easy to maintain. Write it, test it, debug it, and tune it. When you’re running your programs and you just want to see the data that’s returned, that shouldn’t require any special settings or workarounds to make it happen either. Magic!

    And a whole lot more…
    I could go on and talk about the support for things like DataPump, RMAN, and DBMS_SCHEDULER, but you’re experts and you’re plenty busy. If you think SQL Developer is falling short somewhere, I want you to let us know about it.

    Read the article

  • No HDMI audio in 13.04

    - by King84
    I have just upgraded from 12.10 to 13.04 and now everything works perfectly, except the fact that I have no audio via HDMI. I am using a Samsung TV-monitor connected via HDMI to my video card, an Asus EAH4670/DI/1GD3 (which has a Radeon HD 4670 GPU on it), installed physically into my motherboard, which is an MSI 770-C45. I am running kernel 3.9; I just have no sound. I tried downloading and installing https://code.launchpad.net/~ubuntu-audio-dev/+archive/alsa-daily/+files/oem-audio-hda-daily-dkms_0.201304261252~raring1_all.deb , but without any good result. Please help, I need my audio back. In the end, this is my lspci command output.

    ale@beast:~$ lspci
    00:00.0 Host bridge: Advanced Micro Devices [AMD] nee ATI RX780/RX790 Host Bridge
    00:02.0 PCI bridge: Advanced Micro Devices [AMD] nee ATI RD790 PCI to PCI bridge (external gfx0 port A)
    00:06.0 PCI bridge: Advanced Micro Devices [AMD] nee ATI RD790 PCI to PCI bridge (PCI express gpp port C)
    00:11.0 SATA controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 SATA Controller [IDE mode]
    00:12.0 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 USB OHCI0 Controller
    00:12.1 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0 USB OHCI1 Controller
    00:12.2 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 USB EHCI Controller
    00:13.0 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 USB OHCI0 Controller
    00:13.1 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0 USB OHCI1 Controller
    00:13.2 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 USB EHCI Controller
    00:14.0 SMBus: Advanced Micro Devices [AMD] nee ATI SBx00 SMBus Controller (rev 3c)
    00:14.1 IDE interface: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 IDE Controller
    00:14.2 Audio device: Advanced Micro Devices [AMD] nee ATI SBx00 Azalia (Intel HDA)
    00:14.3 ISA bridge: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 LPC host controller
    00:14.4 PCI bridge: Advanced Micro Devices [AMD] nee ATI SBx00 PCI to PCI Bridge
    00:14.5 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 USB OHCI2 Controller
    00:18.0 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor HyperTransport Configuration
    00:18.1 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor Address Map
    00:18.2 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor DRAM Controller
    00:18.3 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor Miscellaneous Control
    00:18.4 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor Link Control
    01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI RV730 XT [Radeon HD 4670]
    01:00.1 Audio device: Advanced Micro Devices [AMD] nee ATI RV710/730 HDMI Audio [Radeon HD 4000 series]
    02:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168 PCI Express Gigabit Ethernet controller (rev 03)
    ale@beast:~$

    Read the article

  • So long Oracle ...

    - by arungupta
    ... and thanks for all the fish! This Friday (October 18, 2013) is my last day at Oracle. After
    - Publishing almost 1400 blog entries with 5500+ comments on them
    - Working in the Java EE team since inception
    - Visiting 35+ countries and several cities around the world
    - Speaking at all major Java conferences and lots of Java User Groups
    - Being a 15-year alumnus of JavaOne as staff
    - Meeting and working with the best of the best in the Java community
    - Most importantly, having lots of fun
    it's time for me to move on!

    No new blog entries will be posted on this blog. Feel free to subscribe to The Aquarium for the latest updates on Java EE and GlassFish. I'll continue to publish all the excellent content that you've been used to at blog.arungupta.me from now onwards. Read my new blog to learn about my new adventures!

    Here are some of the conference badges collected over the past years ... And the cities visited ...

    The comments on this blog are disabled as I'll not be able to respond to them. Feel free to leave comments on the new blog and I'd love to follow up with you there. Thank you very much for all the support that has been shown on this blog.

    I'd like to conclude with a Hindi song that I've been humming for the past few days now ...
    Abhi alvida mat kaho doston ...
    Na jaane kahan phir mulaqaat ho ...
    Kyonki ...
    Beete huye lamhon ki kasak saath to hogi ...
    Khawabon mein hi ho chahe mulaqaat to hogi ...

    For my non-Hindi readers, here is my paraphrased meaning ...
    Don't say goodbye yet my friends ...
    We'll likely meet somewhere else ...
    Because ...
    We'll always have the memories of the wonderful time spent together ...
    May be in dreams but we will meet again ...

    With that, over and out, and see you at blog.arungupta.me!

    Read the article

  • Smooth Sailing or Rough Waters: Navigating Policy Administration Modernization

    - by helen.pitts(at)oracle.com
    Life insurance and annuity carriers continue to recognize the need to modernize their aging policy administration systems, but may be hesitant to move forward because of the inherent risk involved. To help carriers better prepare for what lies ahead, LOMA's Resource Magazine asked Karen Furtado, partner of Strategy Meets Action, to help them chart a course in Navigating Policy Administration Selection, the cover story of this month’s issue.

    The industry analyst and research firm recently asked insurance carriers to name the business drivers for replacing legacy policy administration systems. The top five cited, according to Furtado, centered on:
    - Supporting growth in current lines
    - Improving competitive position
    - Containing and reducing costs
    - Supporting growth in new lines
    - Supporting agent demands and interaction

    It’s no surprise that fueling growth, both now and in the future, continues to be a key driver for modernization. Why? Inflexible, hard-coded, legacy systems require customization by IT every time a change is required. This in turn impedes a carrier’s ability to be agile, constraining their ability to quickly adapt to changing regulatory requirements and evolving market demands. It also stymies their ability to quickly bring new products to market or rapidly configure changes to existing ones, and can inhibit how carriers service customers and distribution channels.

    In the article, Furtado advised carriers to ensure that the policy administration system they are considering is current and modern, with an adaptable user interface and flexible service-oriented architecture. She said carriers should ask themselves, “How much do you need flexibility and agility now and in the future? Does it support the business processes and rules that are needed for you to be able to create that adaptable environment?” Furtado went on to advise that carriers “Connect your strategy to your business and technical capabilities before you make investment choices…You want to enable your organization to transform for the future, not just automate the past.”

    Unlocking High Performance with Policy Administration Transformation also was the topic of a recent LOMA webcast moderated by Ron Clark, editor of LOMA's Resource Magazine. The webcast, which featured speakers from Oracle Insurance and Capgemini, focused on how insurers can competitively drive high performance by:
    - Replacing a legacy policy administration system with a modern, flexible platform
    - Optimizing IT and operations costs, creating consistent processes and eliminating resource redundancies
    - Selecting the right partner with the best blend of technology, operational, and consulting capabilities to achieve market leadership
    - Understanding the value of outsourcing closed block operations

    Learn more by clicking here to access this free, one-hour recorded webcast. Helen Pitts is senior product marketing manager for Oracle Insurance's life and annuities solutions.

    Read the article

  • Travelling MVP #3: Community event in Varna, Bulgaria

    - by DigiMortal
    Second stop on my DevReach 2012 trip was Varna. We did not have much time to hang around there, but this problem will get fixed next year, if not before. Still, we had sessions there with Dimitar Georgijev, and I also had the chance to meet local techies. Next time we will have more tech and beers for sure!

    We started in the morning from Bucharest and travelled through Ruse, Razgrad and Shumen to Varna. It’s about 275 km. We used a cab, a local bus and Dimitar's father's car. We had one food stop in Ruse and after that we went directly to Varna. Here is our route on the map.

    Varna is a Bulgarian city located on the western coast of the Black Sea. I have been there once before this trip and it’s a good place to have a vacation under the sun. Also, autumn there is milder than here in Estonia (we are on our third day of snow). Bulgaria has some good beers, my favorite mankind-killer called rakia, and very good national cuisine. The food is made from fresh ingredients and it is a damn good experience. Here are some arbitrarily selected images (you can click on these to view at original size):
    Old bus “monument” in Razgrad
    Stuffed peppers, Bulgarian national cuisine
    Infra-red community having a good time and beers

    We held our sessions in a study classroom at Varna Technical University. It’s a little bit of an old-style university, but everything we needed was there and we had no problems with the machinery. The sessions were the same as in Bucharest. The user group in Varna is brand new and hopefully it will be something bigger one good day. At least I will try to do my part so they get on their feet quicker. As we did not have much time to announce the event, there were about 15 guys listening to us, and I’m happy that it was not an over-hyped event because I was still getting my first experiences with foreign audiences.

    After the sessions we took our stuff to the hotel and went to hang around with the local techies. We had a good time there and made some new friends. Next time I go to Varna I will go back as a more experienced speaker, and I plan to do one tougher and highly challenging session there. Maybe somebody from the Estonian community will join me, and then it will be a well-planned surprise attack on Varna :)

    Read the article

  • Application Performance: The Best of the Web

    - by Michaela Murray
    Wisdom: "A deep understanding and realization […] resulting in the ability to apply perceptions, judgements and actions. It is also the comprehension of what is true coupled with optimum judgment as to action." - Wikipedia

    We’re writing a book for ASP.NET developers, and we want you to be a part of it. We know that there’s a huge amount of web developer wisdom that never gets shared, and we want to find those golden nuggets of knowledge and experience, and make sure everyone can learn from them. Right now, we want to find out about your top tips, hard-won lessons, and sage advice for avoiding, finding, and fixing application performance problems. If you work with .NET and SQL, even better – a lot of application performance relies on the interaction with the database, so we want to hear from you!

    “How Do You Want Me To Be Involved?”
    Right! Details! We want you, our most excellent readers, to email us with the Best Advice you would give to other developers for getting the best performance out of their applications. It doesn’t matter if your advice is for newbies or veterans, .NET or SQL – so long as it’s about application performance, we want to hear from you. (And if you think that there’s developer wisdom out there that “everyone knows”, a) I’m willing to bet you could find someone who doesn’t know about it, and b) it probably bears repeating anyway!)

    “I’m Interested. What Can You Do For Me?”
    Excellent question. For starters, there’s a chance to win a Microsoft Surface (the tablet, not the table-top). Once all the ASP.NET Wisdom has been collected, tallied, and labelled, it will then be weighed and measured by a team of expert judges (whose identities are still a closely-guarded secret). The top tip in both SQL & .NET categories will each win their author their very own MS Surface. But that’s not all! We can also give you… immortality! More details? Ok. We’ll be collecting all of the tips sent in by our readers (and we can’t wait to learn from you all,) and with the help of our Simple-Talk editors, we will publish and distribute your combined and documented knowledge as a free, community-created, professionally typeset eBook. You will naturally be credited by name / pseudonym / twitter handle / GitHub username / StackOverflow profile / Whatever, as the clearly ingenious author of hot performance tips.

    The Not-Very-Fine Print
    Here’s the breakdown:
    - We want to bring together the best application performance knowledge from ASP.NET developers.
    - Closing date for submissions will be 9am GMT, December 4th.
    - Submissions should be made by email – [email protected]
    - Submissions will be judged by a panel of expert judges (who will be revealed soon).
    - The top submission in both the SQL & .NET categories will each win a Microsoft Surface.
    - ALL the tips which make it through the judging process will be polished by Simple-Talk editors, and turned into a professionally typeset eBook, which will be freely available, and promoted alongside the ANTS Performance Profiler tool.
    - Anyone whose entry makes it into the book will be clearly and profusely credited in the method of their choice (or can remain anonymous.)

    The really REALLY short version
    Share what you know about ASP.NET application performance for a chance to win a Microsoft Surface, and then get your name credited in a slick eBook with top-notch production values. For more details, see above. We can’t wait to learn from you!

    Read the article

  • Ubuntu server spontaneous reboot

    - by user1941407
    I have got two Ubuntu 12.04 servers (Xeon E3). Sometimes (once every several days) each server spontaneously reboots. HDDs and other hardware are OK. Which logfile can help me find the reason for the problem?

    UPDATED. Hardware: Xeon E3 processor, Intel server motherboard, 32 GB DDR3 ECC, an mdadm mirror HDD RAID for the system, and an mdadm SSD RAID for the database (Postgres). Both servers have similar (not identical) components. SMART is OK. It seems that the problem is in the software. A Python process and the database are running on these servers.

    Syslog (time of reboot):
    Aug 23 13:42:23 xeon hddtemp[1411]: /dev/sdc: WDC WD15NPVT-00Z2TT0: 34 C
    Aug 23 13:42:23 xeon hddtemp[1411]: /dev/sdd: WDC WD15NPVT-00Z2TT0: 34 C
    Aug 23 13:43:24 xeon hddtemp[1411]: /dev/sdc: WDC WD15NPVT-00Z2TT0: 34 C
    Aug 23 13:43:24 xeon hddtemp[1411]: /dev/sdd: WDC WD15NPVT-00Z2TT0: 34 C
    Aug 23 13:44:14 xeon sensord: Chip: acpitz-virtual-0
    Aug 23 13:44:14 xeon sensord: Adapter: Virtual device
    Aug 23 13:44:14 xeon sensord: temp1: 27.8 C
    Aug 23 13:44:14 xeon sensord: temp2: 29.8 C
    Aug 23 13:44:14 xeon sensord: Chip: coretemp-isa-0000
    Aug 23 13:44:14 xeon sensord: Adapter: ISA adapter
    Aug 23 13:44:14 xeon sensord: Physical id 0: 37.0 C
    Aug 23 13:44:14 xeon sensord: Core 0: 37.0 C
    Aug 23 13:44:14 xeon sensord: Core 1: 37.0 C
    Aug 23 13:44:14 xeon sensord: Core 2: 37.0 C
    Aug 23 13:44:14 xeon sensord: Core 3: 37.0 C
    Aug 23 13:44:24 xeon hddtemp[1411]: /dev/sdc: WDC WD15NPVT-00Z2TT0: 34 C
    Aug 23 13:44:24 xeon hddtemp[1411]: /dev/sdd: WDC WD15NPVT-00Z2TT0: 34 C
    Aug 23 13:47:01 xeon kernel: imklog 5.8.6, log source = /proc/kmsg started.
    Aug 23 13:47:01 xeon rsyslogd: [origin software="rsyslogd" swVersion="5.8.6" x-pid="582" x-info="http://www.rsyslog.com"] start
    Aug 23 13:47:01 xeon rsyslogd: rsyslogd's groupid changed to 103
    Aug 23 13:47:01 xeon rsyslogd: rsyslogd's userid changed to 101
    Aug 23 13:47:00 xeon rsyslogd-2039: Could not open output pipe '/dev/xconsole' [try http://www.rsyslog.com/e/2039 ]
    Aug 23 13:47:01 xeon kernel: [ 0.000000] Initializing cgroup subsys cpuset
    Aug 23 13:47:01 xeon kernel: [ 0.000000] Initializing cgroup subsys cpu
    Aug 23 13:47:01 xeon kernel: [ 0.000000] Initializing cgroup subsys cpuacct
    Aug 23 13:47:01 xeon kernel: [ 0.000000] Linux version 3.11.0-26-generic (buildd@komainu) (gcc version 4.6.3 (Ubuntu/Linaro 4.6.3-1ubuntu5) ) #45~precise1-Ubuntu SMP Tue Jul 15 04:02:35 UTC 2014 (Ubuntu 3.11.0-26.45~precise1-generic 3.11.10.12)
    Aug 23 13:47:01 xeon kernel: [ 0.000000] Command line: BOOT_IMAGE=/boot/vmlinuz-3.11.0-26-generic root=UUID=0daa7f53-6c74-47d2-873e-ebd339cd39b0 ro splash quiet vt.handoff=7
    Aug 23 13:47:01 xeon kernel: [ 0.000000] KERNEL supported cpus:
    Aug 23 13:47:01 xeon kernel: [ 0.000000] Intel GenuineIntel
    Aug 23 13:47:01 xeon kernel: [ 0.000000] AMD AuthenticAMD
    Aug 23 13:47:01 xeon kernel: [ 0.000000] Centaur CentaurHauls
    Aug 23 13:47:01 xeon kernel: [ 0.000000] e820: BIOS-provided physical RAM map:
    Aug 23 13:47:01 xeon kernel: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009bbff] usable
    Aug 23 13:47:01 xeon kernel: [ 0.000000] BIOS-e820: [mem 0x000000000009bc00-0x000000000009ffff] reserved

    Dmesg - nothing strange.

    Read the article

  • Live Debugging

    - by Daniel Moth
    Based on my classification of diagnostics, you should know what live debugging is NOT about - at least according to me :-) and in this post I'll share how I think of live debugging.

    These are the (outer) steps to live debugging:
    1. Get the debugger in the picture.
    2. Control program execution.
    3. Inspect state.
    4. Iterate between 2 and 3 as necessary.
    5. Stop debugging (and potentially start a new iteration going back to step 1).

    Step 1 has two options: start with the debugger attached, or execute your binary separately and attach the debugger later. You might say there is a 3rd option, where the app notifies you that there is an issue, referred to as JIT debugging. However, that is just a variation of the attach, because that is when you start the debugging session: when you attach. I'll be covering in future posts how this step works in Visual Studio.

    Step 2 is about pausing (or breaking) your app so that it makes no progress and remains "frozen". A sub-variation is to pause only parts of its execution, or in other words to freeze individual threads. I'll be covering in future posts the various ways you can perform this step in Visual Studio.

    Step 3 is about seeing what the state of your program is when you have paused it. Typically it involves comparing the state you are finding with a mental picture of what you thought the state would be, or simply checking invariants about the intended state of the app against the actual state of the app. I'll be covering in future posts the various ways you can perform this step in Visual Studio.

    Step 4 is necessary if you need to inspect more state - rinse and repeat. Self-explanatory, and will be covered as part of steps 2 & 3.

    Step 5 is the most straightforward, with 3 options: detach the debugger; terminate your binary through the normal way that it terminates (e.g. close the main window); or terminate the debugging session through your debugger, with the result that it terminates the execution of your program too. In a future post I'll cover the ways you can detach or terminate the debugger in Visual Studio.

    I found an old picture I used to use to map the steps above onto Visual Studio 2010. It is basically the Debug menu with colored rectangles around each menu item, mapping the menu to one of the first 3 steps (step 5 was merged with step 1 for that slide). Here it is in case it helps:

    Stay tuned for more... Comments about this post by Daniel Moth welcome at the original blog.
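
    As a concrete (and hedged) illustration of steps 1 and 2, a managed program can also pull the debugger into the picture itself via the System.Diagnostics.Debugger class. The sketch below is generic .NET, not a quote from the post, and the inspected variable is made up.

        using System;
        using System.Diagnostics;

        class LiveDebuggingSketch
        {
            static void Main()
            {
                // Step 1: get the debugger in the picture. If none is attached yet,
                // ask the OS/IDE to offer one (the JIT-debugging variation of attach).
                if (!Debugger.IsAttached)
                {
                    Debugger.Launch();
                }

                // Step 2: control program execution - break here as if a breakpoint
                // had been hit on this line.
                Debugger.Break();

                // Step 3: with execution paused, inspect state in the debugger
                // (locals, call stack, threads) before letting the program continue.
                int answer = 42; // hypothetical state worth inspecting
                Console.WriteLine(answer);
            }
        }

    Attaching from the debugger's own UI (e.g. the Debug menu) achieves the same thing without any code; the calls above are simply the programmatic ends of the same steps.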

    Read the article

  • Best pathfinding for a 2D world made by CPU Perlin noise, with random start and destination points?

    - by Mathias Lykkegaard Lorenzen
    I have a world made by Perlin Noise. It's created on the CPU for consistency between several devices (yes, I know it takes time - I have my techniques that make it fast enough). Now, in my game you play as a fighter-ship-thingy-blob or whatever it's going to be. What matters is that this "thing" that you play as is placed in the middle of the screen and moves along with the camera. The white stuff in my world are walls. The black stuff is freely movable space.

    Now, as the player moves around, he will constantly see "monsters" spawning around him in a circle (a circle that's larger than the screen, though). These monsters move inwards and try to collide with the player. This is the part that's tricky. I want these monsters to constantly spawn, move towards the player, and avoid walls entirely. I've added a screenshot below that makes it easier to understand (excuse me for my bad drawing - I was using Paint for this).

    In the image above, the following rules apply:
    - The red dot in the middle is the player itself.
    - The light-green rectangle is the boundary of the screen (in other words, what the player sees). This boundary moves with the player.
    - The blue circle is the spawning circle. At the circumference of this circle, monsters will spawn constantly. This spawn circle moves with the player and the boundary of the screen.
    - Each monster spawned (shown as yellow triangles) wants to collide with the player.
    - The pink lines show the path that I want the monsters to move along (or something similar). What matters is that they reach the player without colliding with the walls.

    The map itself (the one that is Perlin Noise generated on the CPU) is saved in memory as two-dimensional bit-arrays. A 1 means a wall, and a 0 means an open walkable space. The current tile size is pretty small; I could easily make it a lot larger for increased performance. I've done some path algorithms before, such as A*. I don't think that's entirely optimal here, though.
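
    Since the question already describes the map as a two-dimensional bit-array, here is a minimal, hedged sketch of plain breadth-first search over that grid (BFS already yields shortest paths on a uniform-cost grid; A* is the same loop with the frontier ordered by cost plus a heuristic). The bool[,] layout, 4-way movement and XNA's Point type are assumptions for illustration, not part of the original question.

        using System.Collections.Generic;
        using Microsoft.Xna.Framework; // Point

        static class GridPathfinder
        {
            // walls[x, y] == true means the tile is a wall (a 1 in the bit-array).
            public static List<Point> FindPath(bool[,] walls, Point start, Point goal)
            {
                int w = walls.GetLength(0), h = walls.GetLength(1);
                var cameFrom = new Dictionary<Point, Point>();
                var frontier = new Queue<Point>();
                frontier.Enqueue(start);
                cameFrom[start] = start;

                Point[] steps = { new Point(1, 0), new Point(-1, 0), new Point(0, 1), new Point(0, -1) };

                while (frontier.Count > 0)
                {
                    Point current = frontier.Dequeue();
                    if (current == goal) break;

                    foreach (Point d in steps)
                    {
                        var next = new Point(current.X + d.X, current.Y + d.Y);
                        if (next.X < 0 || next.Y < 0 || next.X >= w || next.Y >= h) continue; // off the map
                        if (walls[next.X, next.Y]) continue;                                  // wall tile
                        if (cameFrom.ContainsKey(next)) continue;                             // already visited
                        cameFrom[next] = current;
                        frontier.Enqueue(next);
                    }
                }

                if (!cameFrom.ContainsKey(goal)) return null; // no route exists

                // Walk the breadcrumbs back from the goal to rebuild the path.
                var path = new List<Point>();
                for (Point p = goal; p != start; p = cameFrom[p]) path.Add(p);
                path.Add(start);
                path.Reverse();
                return path;
            }
        }

    Because every monster shares the same destination (the player), one search per frame from the player's tile outwards - a so-called flow field or Dijkstra map - can serve all monsters at once; each monster then just steps to the neighbouring tile with the lowest recorded distance, which scales far better than one A* per monster.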

    Read the article

  • Creating a Synchronous BPEL composite using File Adapter

    - by [email protected]
    By default, the JDeveloper wizard generates asynchronous WSDLs when you use technology adapters. Typically, a user follows these steps when creating an adapter scenario in 11g:
    1) Create a SOA Application with either "Composite with BPEL" or an "Empty Composite". Furthermore, if the user chooses "Empty Composite", then he or she is required to drop the "BPEL Process" from the "Service Components" pane onto the SOA Composite Editor. Either way, the user comes to the screen below where he/she fills in the process details. Please note that the user is required to choose "Define Service Later" as the template.
    2) Create the inbound service and outbound references and wire them with the BPEL component.
    3) Finally, create the BPEL process with the initiating <receive> activity to retrieve the payload and an <invoke> activity to write the payload.

    This is how most BPEL processes that use adapters are modeled. And, if we scrutinize the generated WSDL, we can clearly see that it is one-way, and that makes the BPEL process asynchronous (see below).

    In other words, the inbound FileAdapter would poll for files in the directory and, for every file that it finds there, it would translate the content into XML and publish it to BPEL. But, since the BPEL process is asynchronous, the adapter would return immediately after the publish and perform the required post-processing, e.g. deletion/archival and so on. The disadvantage with such asynchronous BPEL processes is that it becomes difficult to throttle the inbound adapter. In other words, the inbound adapter would keep sending messages to BPEL without waiting for the downstream business processes to complete. This might lead to several issues, including higher memory usage, CPU usage and so on.

    In order to alleviate these problems, we will manually tweak the WSDL and BPEL artifacts into synchronous processes. Once we have synchronous BPEL processes, the inbound adapter would automatically throttle itself, since the adapter would be forced to wait for the downstream process to complete with a <reply> before processing the next file or message, and so on. Please see the tweaked WSDL below and please note that we have converted the one-way WSDL to a two-way WSDL, thereby making it synchronous:

    Add a <reply> activity to the inbound adapter partnerlink at the end of your BPEL process, e.g.:

    Finally, your process will look like this:

    You are done.

    Please remember that such an exercise is NOT required for Mediator, since the Mediator routing rules are sequential by default. In other words, the Mediator uses the caller thread (the inbound file adapter thread) for processing the routing rules. This is the case even if the WSDL for the Mediator is one-way.

    Read the article

  • Thinktecture.IdentityServer RC

    - by Your DisplayName here!
    I just uploaded the RC of IdentityServer to Codeplex. This release is feature complete, and if I don't get any bug reports this is also pretty much the final V1.

    Changes from B1
    - The configuration data access is now based on EF 4.1 code first. This makes it much easier to use different data stores. For RTM I will also provide a SQL script for SQL Server so you can move the configuration to a separate machine (e.g. for load balancing scenarios).
    - I included the ASP.NET Universal Providers in the download. This adds official support for SQL Azure, SQL Server and SQL Compact for the membership, roles and profile features. Unfortunately the Universal Providers use a different schema than the original ASP.NET providers (that sucks btw!) – so I made them optional. If you want to use them, go to web.config and uncomment the new providers.
    - The relying party registration entries now have added fields for extra data that you want to couple with the RP. One use case could be to give the UI a hint of how the login experience should look per RP. This allows a different look and feel for different relying parties. I also included a small helper API that you can use to retrieve the RP record based on the incoming WS-Federation query string.
    - WS-Federation single sign-out is now conforming to the spec.
    - Certificate-based endpoint identities for SSL endpoints are optional now.
    - Added an initial configuration "wizard". This sets up the signing certificate, issuer URI and site title on the first run.

    Installation
    This is still a "developer" release – that means it ships with source code that you have to build, etc. But from that point it should be a little more straightforward than it used to be:
    - Make sure SSL is configured correctly for IIS
    - Map the WebSite directory to a vdir in IIS
    - Run the web site. This should bring up the initial configuration
    - Make sure the worker process account has access to the signing certificate private key
    - Make sure all your users are in the "IdentityServerUsers" role in your role store. Administrators need the "IdentityServerAdministrators" role

    That should be it. Proper documentation will hopefully be available soon (any volunteers?). Please provide feedback! thanks!
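
    The post doesn't show the new configuration classes, but as a rough, hedged illustration of what "EF 4.1 code first" data access means in general: you describe entities as plain classes plus a DbContext, and EF generates the schema for whichever store you point it at. The entity and context names below are hypothetical stand-ins for illustration only, not IdentityServer's actual model.

        using System.Data.Entity; // EF 4.1 code-first
        using System.Linq;

        // Hypothetical configuration entity - stands in for something like a
        // relying party registration; field names are made up for illustration.
        public class RelyingParty
        {
            public int Id { get; set; }
            public string Realm { get; set; }
            public string Name { get; set; }
            public string ExtraData { get; set; } // e.g. per-RP UI hints
        }

        // Code-first context: no mapping files, the classes are the model,
        // which is what makes swapping data stores straightforward.
        public class ConfigurationContext : DbContext
        {
            public DbSet<RelyingParty> RelyingParties { get; set; }
        }

        public static class ConfigurationStoreSample
        {
            public static RelyingParty FindByRealm(string realm)
            {
                using (var db = new ConfigurationContext())
                {
                    return db.RelyingParties.FirstOrDefault(r => r.Realm == realm);
                }
            }
        }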

    Read the article

  • F1 Pit Pragmatics

    - by mikef
    "I hate computers. No, really, I hate them. I love the communications they facilitate, I love the conveniences they provide to my life. but I actually hate the computers themselves." - Scott Merrill, 'I hate computers: confessions of a Sysadmin' If Scott's goal was to polarize opinion and trigger raging arguments over the 'real reasons why computers suck', then he certainly succeeded. Impassioned vitriol sits side-by-side with rational debate. Yet Scott's fundamental point is absolutely on the money - Computers are a means to an end. The IT industry is finally starting to put weight behind the notion that good User Experience is an absolutely crucial goal, a cause championed by the likes of Microsoft's Bill Buxton, and which Apple's increasingly ubiquitous touch screen interface exemplifies. However, that doesn't change the fact that, occasionally, you just have to man up and deal with complex systems. In fact, sometimes you just need to sacrifice everything else in the name of performance. You'll find a perfect example of this Faustian bargain in Trevor Clarke's fascinating look into the (diabolical) IT infrastructure of modern F1 racing - high performance, high availability. high everything. To paraphrase, each car has up to 100 sensors, transmitting around 30Gb of data over the course of a race (70% in real-time). This data is then processed by no less than 3 servers (per car) so that the engineers in the pit have access to telemetry, strategy information, timing feeds, a connection back to the operations room in the team's home base - the list goes on. All of this while the servers are exposed "to carbon dust, oil, vibration, rain, heat, [and] variable power". Now, this is admittedly an extreme context where there's no real choice but to use complex systems where ease-of-use is, at best, a secondary concern. The flip-side is seen in small-scale personal computing such as that seen in Apple's iDevices, which are incredibly intuitive but limited in their scope. In terms of what kinds of systems they prefer to use, I suspect that most SysAdmins find themselves somewhere along this axis of Power vs. Usability, and which end of this axis you resonate with also hints at where you think the IT industry should focus its energy. Do you see yourself in the F1 pit, making split-second decisions, wrestling with information flows and reticent hardware to bend them to your will? If so, I imagine you feel that computers are subtle tools which need to be tuned and honed, using the advanced knowledge possessed only by responsible SysAdmins (If you have an iPhone, I suspect it's jail-broken). If the machines throw enigmatic errors, it's the price of flexibility and raw power. Alternatively, would you prefer to have your role more accessible, with users empowered by knowledge, spreading the load of managing IT environments? In that case, then you want hardware and software to have User Experience as their primary focus, and are of the "means to an end" school of thought (you're probably also fed up with users not listening to you when you try and help). At its heart, the dichotomy is between raw power (which might be difficult to use) and ease-of-use (which might have some limitations, but you can be up and running immediately). Of course, the ultimate goal is a fusion of flexibility, power and usability all in one system. It's achievable in specific software environments, and Red Gate considers it a target worth aiming for, but in other cases it's a goal right up there with cold fusion. 
I think it'll be a long time before we see it become ubiquitous. In the meantime, are you Power-Hungry or a Champion of Usability? Cheers, Michael Francis Simple Talk SysAdmin Editor

    Read the article

  • Using CTAS & Exchange Partition to Replace IAS for Copying a Partition on Exadata

    - by Bandari Huang
    Usage Scenario: Copy data & indexes from one partition to another partition in a partitioned table.

    Solution:
    - Create a partition definition
    - Copy data from one partition to another partition by 'Insert as select (IAS)'
    - Create a nonpartitioned table by 'Create table as select (CTAS)'
    - Convert the nonpartitioned table into a partition of the partitioned table by exchanging their data segments
    - Rebuild any unusable indexes

    Exchange Partition Conversions
    - Mutual conversion between a partition (or subpartition) and a nonpartitioned table
    - Mutual conversion between a hash-partitioned table and a partition of a composite *-hash partitioned table
    - Mutual conversion between a [range | list]-partitioned table and a partition of a composite *-[range | list] partitioned table

    Exchange Partition Usage Scenarios
    - High-speed loading of new, incremental data into an existing partitioned table in a DW environment
    - Exchanging old data partitions out of a partitioned table: the data is purged from the partitioned table without actually being deleted and can be archived separately

    Exchange Partition Syntax
        ALTER TABLE schema.table
          EXCHANGE [PARTITION | SUBPARTITION] [partition | subpartition]
          WITH TABLE schema.table
          [INCLUDING | EXCLUDING] INDEXES
          [WITH | WITHOUT] VALIDATION
          UPDATE [INDEXES | GLOBAL INDEXES]

    INCLUDING | EXCLUDING INDEXES
    Specify INCLUDING INDEXES if you want local index partitions or subpartitions to be exchanged with the corresponding table index (for a nonpartitioned table) or local indexes (for a hash-partitioned table). Specify EXCLUDING INDEXES if you want all index partitions or subpartitions corresponding to the partition and all the regular indexes and index partitions on the exchanged table to be marked UNUSABLE. If you omit this clause, then the default is EXCLUDING INDEXES.

    WITH | WITHOUT VALIDATION
    Specify WITH VALIDATION if you want Oracle Database to return an error if any rows in the exchanged table do not map into partitions or subpartitions being exchanged. Specify WITHOUT VALIDATION if you do not want Oracle Database to check the proper mapping of rows in the exchanged table. If you omit this clause, then the default is WITH VALIDATION.

    UPDATE INDEXES | GLOBAL INDEXES
    Unless you specify UPDATE INDEXES, the database marks UNUSABLE the global indexes or all global index partitions on the table whose partition is being exchanged. Global indexes or global index partitions on the table being exchanged remain invalidated. (You cannot use UPDATE INDEXES for index-organized tables. Use UPDATE GLOBAL INDEXES instead.)

    Notes on Exchanging Partitions & Subpartitions
    - Both tables involved in the exchange must have the same primary key, and no validated foreign keys can be referencing either of the tables unless the referenced table is empty.
    - When exchanging partitioned index-organized tables:
      - The source and target table or partition must have their primary key set on the same columns, in the same order.
      - If key compression is enabled, then it must be enabled for both the source and the target, and with the same prefix length.
      - Both the source and target must be index organized.
      - Both the source and target must have overflow segments, or neither can have overflow segments. Also, both the source and target must have mapping tables, or neither can have a mapping table.
      - Both the source and target must have identical storage attributes for any LOB columns.

    Read the article

  • XNA 2D line-of-sight check

    - by bionicOnion
I'm working on a top-down shooter in XNA, and I need to implement line-of-sight checking. I've come up with a solution that seems to work, but I get the nagging feeling that it won't be efficient enough to do every frame for multiple calls (the game already hiccups slightly at about 10 calls per frame). The code is below, but my general plan was to create a series of rectangles with a width and height of zero to act as points along the sight line, and then check to see if any of these rectangles intersects a ClutterObject (an interface I defined for things like walls or other obstacles) after first screening for any that can't possibly be in the line of sight (i.e. behind the viewer) or are too far away (a concession I made for efficiency).

    public static bool LOSCheck(Vector2 pos1, Vector2 pos2)
    {
        Vector2 currentPos = pos1;
        Vector2 perMove = (pos2 - pos1);
        perMove.Normalize();

        HashSet<ClutterObject> clutter = new HashSet<ClutterObject>();

        foreach (Room r in map.GetRooms())
        {
            if (r != null)
            {
                foreach (ClutterObject c in r.GetClutter())
                {
                    if (c != null && !(c.GetRectangle().X * perMove.X < 0)
                                  && !(c.GetRectangle().Y * perMove.Y < 0))
                    {
                        Vector2 cVector = new Vector2(c.GetRectangle().X, c.GetRectangle().Y);
                        if ((cVector - pos1).Length() < 1500)
                            clutter.Add(c);
                    }
                }
            }
        }

        while (currentPos != pos2 && ((currentPos - pos1).Length() < 1500))
        {
            Rectangle position = new Rectangle((int)currentPos.X, (int)currentPos.Y, 0, 0);
            foreach (ClutterObject c in clutter)
            {
                if (position.Intersects(c.GetRectangle()))
                    return false;
            }
            currentPos += perMove;
        }
        return true;
    }

I'm sure that there's a better way to do this (or at least a way to make this method more efficient), but I'm not too used to XNA yet, so I figured it couldn't hurt to bring it here. At the very least, is there an efficient way to determine which objects may be in front of the viewer with greater precision than the rather broad 90 degree window I've given myself?
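One common alternative to stepping zero-size rectangles along the sight line is to test the segment against each nearby obstacle rectangle directly with a slab (segment vs. axis-aligned box) test, so the cost per obstacle is constant instead of proportional to the sight distance. The sketch below is illustrative only; it assumes XNA's Vector2 and Rectangle types (using System; using Microsoft.Xna.Framework;) and is not the poster's code.

    // Sketch: returns true if the segment p1->p2 overlaps the rectangle r.
    // Slab method: clip the segment's parameter t to [0, 1] against each axis.
    public static bool SegmentIntersectsRect(Vector2 p1, Vector2 p2, Rectangle r)
    {
        Vector2 d = p2 - p1;          // segment direction (deliberately not normalized)
        float tMin = 0f, tMax = 1f;   // parametric range still inside the box

        // X-axis slab
        if (Math.Abs(d.X) < float.Epsilon)
        {
            if (p1.X < r.Left || p1.X > r.Right) return false;  // parallel and outside
        }
        else
        {
            float t1 = (r.Left - p1.X) / d.X;
            float t2 = (r.Right - p1.X) / d.X;
            if (t1 > t2) { float tmp = t1; t1 = t2; t2 = tmp; }
            tMin = Math.Max(tMin, t1);
            tMax = Math.Min(tMax, t2);
            if (tMin > tMax) return false;
        }

        // Y-axis slab
        if (Math.Abs(d.Y) < float.Epsilon)
        {
            if (p1.Y < r.Top || p1.Y > r.Bottom) return false;  // parallel and outside
        }
        else
        {
            float t1 = (r.Top - p1.Y) / d.Y;
            float t2 = (r.Bottom - p1.Y) / d.Y;
            if (t1 > t2) { float tmp = t1; t1 = t2; t2 = tmp; }
            tMin = Math.Max(tMin, t1);
            tMax = Math.Min(tMax, t2);
            if (tMin > tMax) return false;
        }

        return true;  // some part of [tMin, tMax] survived both slabs
    }

With a helper like this, the check reduces to gathering the nearby ClutterObject rectangles once and returning false on the first rectangle the segment hits, which avoids both the per-step Rectangle allocations and the dependence on segment length.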

    Read the article

  • Top 5 Sites and Activities in San Francisco to Experience During Oracle OpenWorld

    - by kgee
While Oracle OpenWorld may provide solutions and information on topics like how to simplify your IT, the importance of cloud, and what types of storage may satisfy your enterprise needs, who is going to tell you more about San Francisco? Here are some suggested sites and activities to experience after OpenWorld that aren’t too far from the Moscone Center. It is recommended to take a cab for the sake of time, but the roughly seven-by-seven miles that make up San Francisco will make for a quick trek to any of the following destinations:
The Golden Gate Bridge: An image often associated with San Francisco, this bridge is one of the most impressive in the world. Take a walk across it, or view it from nearby Crissy Field; it is a sight that floors even the most veteran of San Franciscans.
The Ferry Building: Located at the end of Market Street in the Embarcadero, the Ferry Building once served as a hub of water transport and trade. The building has a bay front view and an array of food choices and restaurants. It is easily accessible via the Muni, BART, trolley, or cab. It is a must-see in San Francisco, and not too far from the Moscone Center.
Ride the Trolley to the Castro: For only $2, you can go back in history for a moment on the Trolley. Take the F-line from the Embarcadero and ride it all the way to the Castro district. During the ride, you will get an overview of the landscape and cultures that are prevalent in San Francisco, but be aware that some areas may beg for an open mind more than others.
Golden Gate Park: When you tire of the concrete jungle, the lucky part of being in San Francisco is that you can escape to a natural refuge, and this park is one of the favorites. It is known for its hiking trails, cultural attractions, monuments, lakes, and gardens. It is one good reason to bring your sneakers to San Francisco, and is also a great place to picnic. Be aware that it is easy to get lost, and it is advisable to bring a map (just in case) if you go.
Haight Ashbury: For a complete change of scenery, Haight Ashbury is known as one of the places hippies used to live and the location of "The Summer of Love." It is now a more affluent neighborhood with boutique shops and the occasional drum circle. While it may be perceived as grungy in certain spots, it is one of the most photographed places in San Francisco and an integral part of San Franciscan history.

    Read the article

  • Improve Playback Using Enhancements in Windows Media Player 12

    - by DigitalGeekery
Are you looking for ways to improve the playback of your media in Windows Media Player 12? We’ll show you how by using the enhancements in WMP 12. If you are in Library mode, you’ll need to click the icon at the lower right to switch to Now Playing mode. Right-click anywhere in Media Player while in Now Playing mode, select Enhancements, and select any of the available options. You can switch between the individual enhancements by clicking the right and left buttons at the top left.
Crossfading and Auto Volume Leveling
The Auto Volume Leveling setting is a simple on/off toggle; it takes effect only if your MP3 or WMA files contain volume leveling information values. You can automatically add volume leveling information values to all files you add to your library by switching to Library view, going to Tools > Options, and selecting Add volume leveling information values for new files on the Library tab. Click OK when finished. Crossfading will gradually decrease the volume of the song that is ending (fade out) and increase the volume of the song that is beginning. Click Turn on Crossfading, then click and drag the slider left or right to change the amount of overlap between tracks.
Graphic Equalizer
The graphic equalizer is toggled on and off by clicking Turn on / Turn off at the top left. You can select pre-defined equalizer settings by music genre by clicking the Default list. The radio buttons on the left allow you to move the sliders individually, in a loose group, or in a tight group. You can always return to the default settings by clicking Reset.
Play Speed Settings
Choose a pre-defined setting by clicking Slow, Normal, or Fast. Uncheck Snap slider to common speeds, then move the slider right or left to your desired speed. If nothing else, these settings provide a little fun and amusement.
Quiet Mode
Quiet Mode will level out any sharp volume highs and lows within a single track. Simply toggle the setting on or off and select whether you prefer Medium difference or Little difference by selecting one of the radio buttons.
SRS WOW effects
SRS WOW effects enhance low-frequency and stereo sound performance. Click Turn on to enable the TruBass and WOW Effect sliders. You can also optimize for your speaker type; click to switch between Regular, Large, and Headphones.
Video Settings
Video Settings allow you to adjust the Hue, Brightness, Saturation, and Contrast. You can also adjust the zoom settings by clicking Select video zoom settings.
Dolby Digital Settings
Choose between Normal, Night, and Theater settings to adjust the audio for Dolby Digital content. This setting will only affect media with Dolby Digital sound.
Looking for more ways to improve your media experience in WMP 12? Check out how to update metadata and cover art and how to share media with other Windows 7 computers on your home network.

    Read the article

  • Use Your Favorite Wallpapers in Windows 7 Starter Edition

    - by Asian Angel
If you have Windows 7 Starter Edition installed on your netbook, the default wallpaper can get old. If you are tired of looking at the default wallpaper, then join us today as we look at changing it with Oceanis Change Background Windows 7.
Special Notes
This information is quoted directly from the website and needs to be kept in mind when using Oceanis Change Background Windows 7: If the Oceanis Change Background Windows 7 program no longer works properly after installing some Windows Updates, then uninstall and reinstall the Oceanis Change Background Windows 7 program to have it run properly again. If you ever do an in-place upgrade to another higher level edition of Windows 7 in the future, then be sure to uninstall this Oceanis Change Background Windows 7 program first to avoid incompatibility issues with it in the new edition of Windows 7. It was designed to only work in Windows 7 Starter edition.
Before
There it is…the default wallpaper everyone with the Starter Edition gets stuck with. Some people may not mind it, but if you are one of the people who really wants something different, then get ready to rejoice.
After
The install file for Oceanis is contained in a zip file, so you will need to unzip it to get started. The install process is quick and simple, but you will need to do a system restart afterwards. Once you have restarted your computer, this is what your screen will look like…do not panic and think that this is all there is to it. This is just the Starter Screen and can be easily changed… Note: Oceanis will auto-start with Windows each time. Using either the Desktop Icon or the Start Menu Entry, open up the Oceanis Main Window. You will see the set of four default wallpapers shown here. At this point the best thing to do is browse for the appropriate folder where you have all of those wonderful new wallpapers just waiting to be used. Note: We found Stretch to be the best Picture Position setting on our system. For our example we had three ready and waiting. We decided to try out the Wallpaper Slideshow feature first. We chose a time frame and saved our changes. Here are our three wallpapers as they switched through. This can be much more interesting than the default wallpaper. There was only one quirk that we encountered while using the Slideshow setting: on occasion, if we minimized a non-maximized window, there would be a leftover partial image in place of the window. Our suggestion? Go with one wallpaper at a time and the settings shown below. These are the settings that we had terrific luck with…only one picture selected, Picture Position = Stretch, & Change Picture Every = Every Day. Using these settings, the Starter Edition acted just like any of the other editions with regard to wallpaper management.
Conclusion
If you have grown tired of looking at the default wallpaper in Windows 7 Starter Edition, then you will certainly appreciate what Oceanis Change Background Windows 7 can do to fix that problem. For more ways to customize your Windows 7 Starter Edition, be sure to check out how to personalize Windows 7 Starter.
Links
Download Oceanis Change Background Windows 7

    Read the article

  • ArchBeat Link-o-Rama Top 10 for November 4-10, 2012

    - by Bob Rhubart
The Top 10 most popular items shared via the OTN ArchBeat Facebook Page for the week of November 4-10, 2012. OAM/OVD JVM Tuning | @FusionSecExpert Vinay from the Oracle Fusion Middleware Architecture Group (the very prolific A-Team) shares a process for analyzing and improving performance in Oracle Virtual Directory and Oracle Access Manager. Exploring Lambda Expressions for the Java Language and the JVM | Java Magazine In the latest //Java/Architect column in Java Magazine, Ben Evans, Martijn Verburg, and Trisha Gee explain how, "although Lambda expressions might seem unfamiliar to begin with, they're quite easy to pick up, and mastering them will be vital for writing applications that can take full advantage of modern multicore CPUs." SOA Galore: New Books for Technical Eyes Only Shake up your technical skills with this trio of new technical books from community members covering SOA and BPM. Oracle Solaris 11.1 update focuses on database integration, cloud | Mark Fontecchio TechTarget editor Mark Fontecchio reports on the recent Oracle Solaris 11.1 release, with comments from IDC's Al Gillen. Solving Big Problems in Our 21st Century Information Society | Irving Wladawsky-Berger "I believe that the kind of extensive collaboration between the private sector, academia and government represented by the Internet revolution will be the way we will generally tackle big problems in the 21st century. Just as with the Internet, governments have a major role to play as the catalyst for many of the big projects that the private sector will then take forward and exploit. The need for high bandwidth, robust national broadband infrastructures is but one such example." — Irving Wladawsky-Berger ADF Mobile Custom Javascript – iFrame Injection | John Brunswick The ADF Mobile Framework provides a range of out of the box components to add within your AMX pages, according to John Brunswick. But what happens when "an out of the box component does not directly fulfill your development need? What options are available to extend your application interface?" John has an answer. Architects Matter: Making sense of the people who make sense of enterprise IT Why do architects matter? Oracle Enterprise Architect Eric Stephens suggests that you ask yourself this question the next time you take the elevator to the Oracle offices on the 45th floor of the Willis Tower in Chicago, Illinois (or any other skyscraper, for that matter). If you had to take the stairs to get to those offices, who would you blame? "You get the picture," he says. "Architecture is essential for any necessarily complex structure, be it a building or an enterprise." (Read the article...) Converting SSL certificate generated by a 3rd party to an Oracle Wallet | Paulo Albuquerque Oracle Fusion Middleware A-Team member Paulo Albuquerque shares "a workaround to get your private key, certificate and CA trusted certificates chain into Oracle Wallet." How Data and BPM are married to get the right information to the right people at the right time | Leon Smiers "Business Process Management…supports a large group of stakeholders within an organization, all with different needs," says Oracle ACE Leon Smiers. "End-to-end processes typically run across departments, stakeholders and applications, and can often have a long life-span. So how do organizations provide all stakeholders with the information they need?" Leon provides answers in this post.
Updated Business Activity Monitoring (BAM) Class | Gary Barg Oracle SOA Team blogger Gary Barg has news for those interested in a skills upgrade. This updated Oracle University course "explains how to use Oracle BAM to monitor enterprise business activities across an enterprise in real time. You can measure your key performance indicators (KPIs), determine whether you are meeting service-level agreements (SLAs), and take corrective action in real time." Thought for the Day "For every complex problem, there is a solution that is simple, neat, and wrong." — H. L. Mencken (September 12, 1880 – January 29, 1956) Source: SoftwareQuotes.com

    Read the article

< Previous Page | 396 397 398 399 400 401 402 403 404 405 406 407  | Next Page >