Search Results

Search found 7834 results on 314 pages for 'jinal desai live'.


  • Limiting Audit Exposure and Managing Risk – Q&A and Follow-Up Conversation

    - by Tanu Sood
    Thanks to all who attended the live ISACA webcast on Limiting Audit Exposure and Managing Risk with Metrics-Driven Identity Analytics. We were really fortunate to have Don Sparks from ISACA moderate the webcast featuring Stuart Lincoln, Vice President, IT P&L Client Services, BNP Paribas, North America, and Neil Gandhi, Principal Product Manager, Oracle Identity Analytics. Stuart's insights, given his team's role in providing IT for P&L Client Services and his tremendous experience in identity management and establishing sustainable compliance programs, were a true value-add at yesterday's webcast.

    And if you are a healthcare organization looking to solve your compliance and security challenges, we recommend you join us for a live webcast on Tuesday, November 29 at 10 am PT. The webcast will feature experts from Kaiser Permanente, PricewaterhouseCoopers and Oracle, and the discussion will focus on the compliance challenges a healthcare organization faces and best practices for tackling them. Here are the details:

    Healthcare IT News Webcast: Managing Risk and Enforcing Compliance in Healthcare with Identity Analytics
    Tuesday, November 29, 2011, 10:00 a.m. PT / 1:00 p.m. ET
    Register Today

    The ISACA webcast replay is now available on-demand and the slides are also available for download. Since we didn't have time to address all the questions we received during the live Q&A portion of the webcast, we have captured responses to the remaining questions here. Please continue to provide us your feedback and insights from your experience in deploying identity compliance solutions.

    Q. Can you please clarify the mechanism utilized to populate the Identity Warehouse from each individual application's access management function / files?
    A. Oracle Identity Analytics (OIA) supports direct imports from applications. Data collection is based on Extract, Transform and Load (ETL), which eliminates the need to write connectors to different applications. Oracle Identity Analytics' import engine supports complex entitlement feeds saved as either text files or XML. The imports can be scheduled on a periodic basis or triggered as needed. If the applications are synchronized with a user provisioning solution like Oracle Identity Manager, Oracle Identity Analytics has a seamless integration to pull in data from Oracle Identity Manager.

    Q. Can you provide a short summary of the new features in your latest release of Oracle Identity Analytics?
    A. Oracle recently announced availability of enhanced Oracle Identity Analytics. This release focused on easing the certification process by offering risk-analytics-driven certification, advanced certification screens, business-centric views and significant improvements in performance, including 3X faster data imports, 3X faster certification campaign generation and advanced auto-certification features that will allow organizations to improve user productivity by up to 80%. Closed-loop risk feedback and IT policy monitoring with Oracle Identity Manager, a leading user provisioning solution, allow for more accurate certification reviews. And OIA's improved performance enables customers to scale compliance initiatives supporting millions of user entitlements across thousands of applications, whether on premises or in the cloud, without compromising speed or integrity.

    Q. Will ISACA grant a CPE credit for attending this ISACA-sponsored webinar today?
    A. From ISACA: Hello and thank you for your interest in the 2011 ISACA Webinar Program! Unfortunately, there are no CPEs offered for this program, archived or live. We will be looking into the feasibility of offering them in the future.

    Q. Would you be able to use this to help manage licenses for software? That is to say, could it track software that is not used by a user, thus eliminating the software license?
    A. OIA's integration with Oracle Identity Manager, a leading user provisioning solution, allows organizations to detect ghost accounts or unused accounts via account reconciliation. Based on the company's policies, this could trigger an automated workflow for account deletion or a request for further investigation. Closed-loop feedback between the two solutions would then allow visibility into the complete audit trail of when the account was detected, the action taken, by whom, when, and the current status.

    Q. We have quarterly attestations and .xls mechanisms are not working. Once the identity data is correlated in Identity Analytics, do you then automate access certification?
    A. OIA's identity warehouse analyzes and correlates identity data across various resources, which allows OIA to determine a user's risk profile and who the access review request should go to, along with all the relevant access details of the user. The access certification manager gets a notification on what to review and when, and the relevant data is presented in a business-friendly screen. Based on the result of the access certification process, actions are triggered and results recorded and archived. Access review managers have visual risk indicators that also allow them to prioritize access certification tasks and efforts.

    Q. How does Oracle Identity Analytics work with Cloud Security?
    A. For enterprises looking to build their own cloud(s), Oracle offers a set of security services that cloud developers can leverage, including Oracle Identity Analytics. For enterprises looking to manage their compliance requirements without hosting those in-house, and instead having a hosting provider offer managed Identity Management services to the organization, Oracle Identity Analytics can be leveraged much the same way as you would in an on-premises (within the enterprise) environment. In fact, organizations today are leveraging Oracle Identity Analytics to manage identity compliance in both these ways.

    Q. Would you recommend this as a cost-effective solution for a smaller organization with approximately 2,500 users?
    A. The key return on investment (ROI) from Oracle Identity Analytics is derived from automating compliance processes, thereby eliminating administrative overhead, minimizing errors, maintaining cost- and time-effective sustainable compliance processes and minimizing audit exposures and penalties. Of course, there are other tangible benefits that are derived from an Oracle Identity Analytics implementation, as outlined in the webcast. For a quantitative analysis of your requirements and a potential ROI calculation, we recommend you refer to the Forrester study on the Total Economic Impact of Oracle Identity Analytics. For an in-person discussion, please email Richard Caldwell.

    Read the article

  • Our Geek Trivia App for Windows 8 is Now Available Everywhere

    - by The Geek
    When we first released our Geek Trivia app, it was sadly only available in the US store for Windows 8, but now you can get it no matter where you live. It's completely free, so get your copy right now! For those who aren't familiar with this app, it's really quite simple: if you want your daily dose of Geek Trivia to show up on a Live Tile on your Windows 8 Start Screen, get this app. We've also got quizzes, loads of previous trivia, and we'll be adding even more features in the near future.

    Read the article

  • SQL SERVER – Script to Update a Specific Column in Entire Database

    - by Pinal Dave
    Last week, I received a very interesting question in email, and I really liked it because I had to play around with a SQL script for a while to come up with the answer the reader was looking for. Please read the question; I believe all of us face this kind of situation.

    "Pinal, in our database we have recently introduced a ModifiedDate column in all of the tables. From now on, whenever any update happens in a row, we update that field with the current date and time. Now here is the issue: when we added that field we did not give it a default value, because we were not sure when we would go live with the system, so we let it be NULL. The modification to the application went live yesterday and we are now updating this field. Here is where I need your help. We need to update all the tables in our database that have a ModifiedDate column, setting it to the current datetime. As our system has been live since yesterday, there are several thousands of rows which are already updated with real-world values, so we do not want to touch those. Essentially, wherever there is a ModifiedDate column in our entire database and it is NULL, we want to update it with the current date and time. Do you have a script for it?"

    Honestly, I did not have such a script. This is a very specific requirement, but I was able to come up with two different methods to accomplish it.

    Method 1: Using INFORMATION_SCHEMA

    SELECT 'UPDATE ' + T.TABLE_SCHEMA + '.' + T.TABLE_NAME +
           ' SET ModifiedDate = GETDATE() WHERE ModifiedDate IS NULL;'
    FROM INFORMATION_SCHEMA.TABLES T
    INNER JOIN INFORMATION_SCHEMA.COLUMNS C
        ON T.TABLE_NAME = C.TABLE_NAME AND C.COLUMN_NAME = 'ModifiedDate'
    WHERE T.TABLE_TYPE = 'BASE TABLE'
    ORDER BY T.TABLE_SCHEMA, T.TABLE_NAME;

    Method 2: Using DMVs

    SELECT 'UPDATE ' + SCHEMA_NAME(t.schema_id) + '.' + t.name +
           ' SET ModifiedDate = GETDATE() WHERE ModifiedDate IS NULL;'
    FROM sys.tables AS t
    INNER JOIN sys.columns c ON t.object_id = c.object_id
    WHERE c.name = 'ModifiedDate'
    ORDER BY SCHEMA_NAME(t.schema_id), t.name;

    The scripts above generate an UPDATE script that performs the requested task. You can adapt the same pattern against these metadata views to generate any other statement and retrieve other data as well. Click to Download Scripts

    Reference: Pinal Dave (http://blog.sqlauthority.com)  Filed under: PostADay, SQL, SQL Authority, SQL Joins, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
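    If you would rather run the generated statements in one pass instead of copying the output into a new query window, here is a minimal sketch along the same lines as Method 2: it concatenates the generated UPDATE statements into a variable and runs them with sp_executesql. The QUOTENAME wrapping is an extra safeguard that is not in the scripts above; as always, review the printed script and test it on a non-production copy first.

    -- Build every generated UPDATE statement into one batch and execute it
    DECLARE @sql NVARCHAR(MAX) = N'';

    SELECT @sql = @sql
        + N'UPDATE ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name)
        + N' SET ModifiedDate = GETDATE() WHERE ModifiedDate IS NULL;' + CHAR(13) + CHAR(10)
    FROM sys.tables AS t
    INNER JOIN sys.columns AS c ON t.object_id = c.object_id
    WHERE c.name = 'ModifiedDate';

    PRINT @sql;                  -- review the generated script before running it
    EXEC sys.sp_executesql @sql;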

    Read the article

  • Lower Your Application Infrastructure Costs w/Oracle Database 11g

    - by john.brust
    Oracle Database 11g is designed to support enterprise applications, including Oracle E-Business Suite, Oracle PeopleSoft, and Oracle Siebel. And every Oracle customer can benefit from the performance, reliability, and security that Oracle Database 11g brings to these applications. Plus, Oracle Database 11g helps you drive down your IT infrastructure costs. Join us next Friday for a webcast conversation with database expert Mark Townsend, Vice President of Oracle's Server Technology Division, to learn how you can benefit from running your applications on Oracle Database 11g. At the end of the presentation, we'll open up for live Q&A for approximately 30 minutes. Register now for our live webcast on Friday, April 23rd, 2010, at 9:30am PT | 12:30pm ET.

    Read the article

  • SQLMidlands & SQLLunch

    - by Dave Ballantyne
    Many thanks to all those who turned out to see my presentation of "Cursors are Evil" at SQLMidlands on Thursday (16th of Feb). The scripts I used are here: https://skydrive.live.com/?cid=4004b6a3bc887e2c&id=4004B6A3BC887E2C%21216 You will need the AdventureWorks2008R2 release to run these; feel free to mail me ([email protected]) with any questions. They are based upon a series of articles I wrote for SQLServerCentral, which can be found here and here. Also, I am starting, or at least having an attempt at, a new user group in London. This is SQLLunch, meeting downstairs at The Golden Fleece, EC4N 1SP, which is 2 minutes from Bank tube station. We will have a twice-monthly meeting (2nd and 4th Tuesdays) for an 'All Stuff, No Fluff' event. Put plainly: a quick hello followed by a 45-minute presentation, which will, optimistically, have you there and back to your desk within a lunch hour. Registrations for the first series of dates are at sqlserverfaq.com. If you would like to speak, then please get in touch. Hope to see you there.

    Read the article

  • XNA Notes 001

    - by George Clingerman
    Just a quick recap of things I noticed going on in or around the XNA community this past week. I'm sure there's a lot I missed (it's a pretty big community with lots of different parts to it), but these were the things I caught that I thought were pretty cool.

    The XNA Team
    Michael Klucher gave a list of books every gamer should read. http://twitter.com/#!/mklucher/status/22313041135673344
    Shawn Hargreaves posted about Nelxon Studio's cheat sheet for converting 3.1 to 4.0. http://blogs.msdn.com/b/shawnhar/archive/2011/01/04/xna-3-1-to-4-0-cheat-sheet.aspx?utm_source=twitterfeed&utm_medium=twitter
    XNA Game Studio won the Frontline award for Programming Tool by GameDev magazine! Congrats to the XNA team! http://www.gdmag.com/homepage.htm

    XNA MVPs
    In January several MVPs were up for re-election; Jim Perry, Andy 'The ZMan' Dunn, Glenn Wilson and myself were all re-awarded a Microsoft MVP award for contributions to the XNA/DirectX communities. https://mvp.support.microsoft.com/communities/mvp.aspx?product=1&competency=XNA%2fDirectX
    A movement to get Michael McLaughlin an MVP award has started and you can join in too! http://twitter.com/#!/theBigDaddio/status/22744458621620224 http://www.xnadevelopment.com/MVP/MichaelMcLaughlinMVP.txt
    Don't forget you can nominate ANYONE for an MVP award; that's how they work. https://mvp.support.microsoft.com/gp/mvpbecoming

    XNA Developers
    James Silva of Ska Studios hit 9,200 sales of ZP2KX and recommends you listen to Infected Mushroom. http://twitter.com/#!/Jamezila/status/22538865357094912 http://en.wikipedia.org/wiki/Infected_Mushroom
    Noogy, creator of the upcoming XBLA title Dust: An Elysian Tail, posts some details of his art creation. http://noogy.com/image/statue/statue.html

    Xbox LIVE Indie Game News
    Microsoft posted an acknowledgement that there was an issue with the sales data, which has been addressed, and apologized for not posting about it sooner. http://forums.create.msdn.com/forums/p/71347/436154.aspx#436154
    Winter Uprising sales are still chugging along and being updated by Xalterax (by those developers willing to actually share sales numbers; thanks for sharing guys, much appreciated!) http://forums.create.msdn.com/forums/t/70147.aspx
    Don't forget about Dream Build Play coming up in February! http://www.dreambuildplay.com/Main/Home.aspx
    The Best Xbox LIVE Indie Games December Edition comes out on NeoGaf http://www.neogaf.com/forum/showthread.php?t=414485
    The Greatest XBox LIVE Indie Games of 2010 on DealSpwn – congrats to DrMistry and MStarGames for the #1 spot with his massive XBLIG Space Pirates From Tomorrow! http://www.dealspwn.com/xbligoty-2010/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Dealspwn+%28Dealspwn%29

    XNA Game Development
    The future of XACT and WP7 has finally been confirmed, and we finally know what our options are for looping audio seamlessly on WP7. http://forums.create.msdn.com/forums/p/61826/436639.aspx#436639
    Super Mario 3 Design Notes is an interesting read for XBLIG developers, giving some insight into the training that naturally occurs for players as they start playing the game. Good things for XBLIG developers to think about. http://www.significant-bits.com/super-mario-bros-3-level-design-lessons

    Read the article

  • ING: Scaling Role Management and Access Certification to Thousands of Applications

    - by Tanu Sood
    Organizations deal with employee and user access certifications in different ways. There's the collation of multiple spreadsheets, an intense two-week exercise by managers, or the use of access certification tools across a handful of applications. But for most organizations, compliance is about certifying user access for thousands of employees across hundreds of systems. Managing and auditing millions of entitlement combinations on a periodic basis poses a huge scale challenge. ING solved the compliance scale challenge using an Identity Platform approach. Join the live webcast featuring ING's enterprise architect, Mark Robison, as he discusses how a platform approach offers value that is greater than the sum of its parts and enables ING to successfully meet their security and compliance goals. Mark will also share his implementation experiences and discuss the key requirements to manage the complexity and scale of access certification efforts at ING. Mark will be joined by Neil Gandhi, Principal Product Manager for Oracle Identity Analytics.

    Live Webcast
    ING: Scaling Role Management and Access Certification to Thousands of Applications
    Wednesday, April 11th at 10 am Pacific / 1 pm Eastern
    Register Today

    Read the article

  • set installer background / not able to run early_command in custom preseed file (precise)

    - by user73093
    I have a custom preseed file for a Precise Live CD (which is loaded correctly on boot; I checked syslog for that). My initial problem is that when booting in install mode (the default behavior for a Live CD), ubiquity runs X with a default wallpaper which is hardcoded to /usr/share/backgrounds/warty-final-ubuntu.png in the Ubiquity code. So my idea was to run early_command (https://help.ubuntu.com/12.04/installation-guide/i386/preseed-advanced.html) to copy my custom wallpaper over /usr/share/backgrounds/warty-final-ubuntu.png, assuming my custom wallpaper already resides on the rootfs in /usr/share/backgrounds. But... it seems the early_command never runs (and I'm sure the preseed file is taken into account). Here is what I have added to my preseed file:

    d-i preseed/early_command string cp /usr/share/backgrounds/mywallpaper-defaults.jpg /usr/share/backgrounds/warty-final-ubuntu.png

    Even this one is never run:

    d-i preseed/early_command string /usr/bin/touch /tmp/testearly

    Thanks for helping!!

    Read the article

  • Availability Best Practices on Oracle VM Server for SPARC

    - by jsavit
    This is the first of a series of blog posts on configuring Oracle VM Server for SPARC (also called Logical Domains) for availability. This series will show how to plan for availability, improve serviceability, avoid single points of failure, and provide resiliency against hardware and software failures. Availability is a broad topic that has filled entire books, so these posts will focus on aspects specifically related to Oracle VM Server for SPARC. The goal is to improve Reliability, Availability and Serviceability (RAS): an article defining RAS can be found here.

    Oracle VM Server for SPARC Principles for Availability

    Let's state some guiding principles for availability that apply to Oracle VM Server for SPARC:

    Avoid Single Points Of Failure (SPOFs). Systems should be configured so a component failure does not result in a loss of application service. The general method to avoid SPOFs is to provide redundancy so service can continue without interruption if a component fails. For a critical application there may be multiple levels of redundancy so multiple failures can be tolerated. Oracle VM Server for SPARC makes it possible to configure systems that avoid SPOFs.

    Configure for availability at a level of resource and effort consistent with business needs. Effort and resource should be consistent with business requirements. Production has different availability requirements than test/development, so it's worth expending resources to provide higher availability. Even within the category of production there may be different levels of criticality, outage tolerances, and recovery and repair time requirements. Keep in mind that a simple design may be more understandable and effective than a complex design that attempts to "do everything".

    Design for availability at the appropriate tier or level of the platform stack. Availability can be provided in the application, in the database, or in the virtualization, hardware and network layers they depend on - or using a combination of all of them. It may not be necessary to engineer resilient virtualization for stateless web applications where availability is provided by a network load balancer, or for enterprise applications like Oracle Real Application Clusters (RAC) and WebLogic that provide their own resiliency.

    It's (often) the same architecture whether virtual or not: for example, providing resiliency against a lost device path or failing disk media is done for the same reasons and may use the same design whether in a domain or not.

    It's (often) the same technique whether using domains or not: many configuration steps are the same. For example, configuring IPMP or creating a redundant ZFS pool is pretty much the same whether you're in a guest domain or not. There are configuration steps and choices for provisioning the guest with the virtual network and disk devices, which we will discuss.

    Sometimes it is different using domains: there are new resources to configure. Most notable is the use of alternate service domains, which provides resiliency in case of a domain failure and also permits improved serviceability via "rolling upgrades". This is an important differentiator between Oracle VM Server for SPARC and traditional virtual machine environments where all virtual I/O is provided by a monolithic infrastructure that is itself a SPOF. Alternate service domains are widely used to provide resiliency in production logical domains environments.

    Some things are done via logical domains commands, and some are done in the guest: for example, with Oracle VM Server for SPARC we provide multiple network connections to the guest, and then configure network resiliency in the guest via IP Multipathing (IPMP) - essentially the same as for non-virtual systems. On the other hand, we configure virtual disk availability in the virtualization layer, and the guest sees an already-resilient disk without being aware of the details. These blogs will discuss configuration details like this.

    Live migration is not "high availability" in the sense of "continuous availability": if the server is down, then you don't live migrate from it! (A cluster or VM restart elsewhere would be used.) However, live migration can be part of the RAS (Reliability, Availability, Serviceability) picture by improving Serviceability - you can move running domains off of a box before planned service or maintenance. The blog Best Practices - Live Migration on Oracle VM Server for SPARC discusses this.

    Topics

    Here are some of the topics that will be covered:

    Network availability using IP Multipathing and aggregates
    Disk path availability using virtual disks defined with multipath groups ("mpgroup")
    Disk media resiliency, configuring for redundant disks that can tolerate media loss
    Multiple service domains - this is probably the most significant item and the one most specific to Oracle VM Server for SPARC. It is very widely deployed in production environments as the means to provide network and disk availability, but it can be confusing. Subsequent articles will describe why and how to configure multiple service domains.

    Note, for the sake of precision: an I/O domain is any domain that has a physical I/O resource (such as a PCIe bus root complex). A service domain is a domain providing virtual device services to other domains; it is almost always an I/O domain too (so it can have something to serve).

    Resources

    Here are some important links; we'll be drawing on their content in the next several articles:

    Oracle VM Server for SPARC Documentation
    Maximizing Application Reliability and Availability with SPARC T5 Servers, whitepaper by Gary Combs
    Maximizing Application Reliability and Availability with the SPARC M5-32 Server, whitepaper by Gary Combs

    Summary

    Oracle VM Server for SPARC offers features that can be used to provide highly available environments. This and the following blog entries will describe how to plan and deploy them.

    Read the article

  • Virtual Developer Day: Oracle Fusion Development

    - by kellsey.ruppel
    Virtual Developer Day: Oracle Fusion Development

    Register now for this FREE hands-on online workshop. Get up to date and learn everything you wanted to know about Oracle ADF & Fusion Development, plus live Q&A chats with Oracle technical staff.

    Oracle Application Development Framework (ADF) is the standards-based, strategic framework for Oracle Fusion Applications and Oracle Fusion Middleware. Oracle ADF's integration with the Oracle SOA Suite, Oracle WebCenter and Oracle BI creates a complete, productive development platform for your custom applications. Join us at this FREE virtual event and learn the latest in Fusion Development, including:

    Is Oracle ADF development faster and simpler than Forms, Apex or .Net?
    Mobile Application Development with ADF Mobile
    Oracle ADF development with Eclipse
    Oracle WebCenter Portal and ADF Development
    Application Lifecycle Management with ADF
    Building Process Centric Applications with ADF and BPM
    Oracle Business Intelligence and ADF Integration
    Live Q&A chats with Oracle technical staff

    Developer lead, manager or architect – this event has something for everyone. Don't miss this opportunity.

    Tuesday, July 10, 2012
    9:00 a.m. PT – 1:00 p.m. PT
    11:00 a.m. CT – 3:00 p.m. CT
    12:00 p.m. ET – 4:00 p.m. ET
    1:00 p.m. BRT – 5:00 p.m. BRT

    Register online now! for this FREE event

    Agenda
    9:00 a.m. Opening
    9:30 a.m. Keynote: Oracle Fusion Development
    Track 1: Introduction to Fusion Development | Track 2: What's New in Fusion Development | Track 3: Fusion Development in the Enterprise
    10:00 a.m. Is Oracle ADF Development Faster and Simpler than Oracle Forms, APEX or .Net? | Mobile Application Development with ADF Mobile | Oracle WebCenter Portal and ADF Development
    11:00 a.m. Rich Web UI made simple – an ADF Faces Overview | Oracle Enterprise Pack for Eclipse - ADF Development | Building Process Centric Applications with ADF and BPM
    12:00 noon Next Generation Controller for JSF | Application Lifecycle Management for ADF | Oracle Business Intelligence and ADF Integration

    *Hands On Lab – WebCenter and ADF Lab w/ JDeveloper – Lab materials will be provided ahead of the event to give you ample time to work through the lab and increase the productivity of the live chat sessions the day of the event.

    Sessions abstracts
    Register online now! for this FREE event

    Read the article

  • June Edition - Oracle Database Insider

    - by jgelhaus
    Now available. The June edition of the Oracle Database Insider includes:

    NEWS
    June 10: Oracle CEO Larry Ellison Live on the Future of Database Performance. At a live webcast on June 10 at Oracle's headquarters, Oracle CEO Larry Ellison is expected to announce the upcoming availability of Oracle Database In-Memory, which dramatically accelerates business decision-making by processing analytical queries in memory without requiring any changes to existing applications. Read More

    New Study Confirms Capital Expenditure Savings with Oracle Multitenant. A new study finds that Oracle Multitenant, an option of Oracle Database 12c, drives significant savings in capital expenditures by enabling the consolidation of a large number of databases on the same number or fewer hardware resources. Read More

    VIDEO
    Oracle Database 12c: Multitenant Environment with Tom Kyte. Tom Kyte discusses Oracle Multitenant, followed by a demo of the multitenant architecture that includes moving a pluggable database (PDB) from one multitenant container database to another, cloning a PDB, and creating a new PDB.

    ...and much more.

    Read the article

  • Webcast: Introducing the New Oracle VM Blade Cluster Reference Configuration

    - by Ferhat Hatay
    The Fastest Way to Virtualize Your Datacenter

    Join our webcast "Best Practices for Speeding Virtual Infrastructure Deployment with Oracle VM"
    Tues., January 25, 2011, 9 a.m. PT / 12 p.m. ET
    Presented by: Marc Shelley, Senior Manager, Oracle Blades Product Management, and Tom Lisjac, Senior Member, Oracle Technical Staff
    Register now for our live webcast!

    The Oracle VM blade cluster reference configuration addresses the key challenges associated with deploying a virtualization infrastructure. It eliminates or significantly reduces, by up to 98%, the assembly and integration of the following components: servers, storage, network connections, virtualization software, and operating systems. Attend this fact-filled, how-to webcast with Oracle experts to learn the best practices for deploying the reference configuration for Oracle VM Server for x86 and Sun Blade and Sun Fire x86 rack mount servers. Virtualization is easier than ever with this new configuration. Register now for our live webcast!

    For more information, see:
    Oracle white paper: Accelerating deployment of virtualized infrastructures with the Oracle VM blade cluster reference configuration
    Oracle technical white paper: Best Practices and Guidelines for Deploying the Oracle VM Blade Cluster Reference Configuration

    Read the article

  • Customer Webcast: Alcatel-Lucent Creates a Modern User Experience

    - by [email protected]
    Today, customer satisfaction is critical to a company's long-term success. With customers searching the internet to find new solutions and offerings, it's more important than ever to deliver a modern and engaging user experience that's both interactive and community-based. Join us on June 30th for this exclusive LIVE Webcast with Saeed Hosseiniyar, CIO of Alcatel-Lucent's Enterprise Products Group, and Andy MacMillan, Vice President of Product Management for Oracle's Enterprise 2.0 Solutions. You'll learn how a modern customer service portal with integrated Web 2.0 and social media features can:

    Improve customer satisfaction by delivering rich, personalized and interactive content
    Speed product development by facilitating participation and feedback from customers through online communities
    Improve ROI with a unified platform that delivers content to employees, partners and customers

    You'll walk away with concrete strategies, best practices and real-world insights on how to transform your company's brand with a next-generation customer service and support site. Register today for this complimentary live Webcast!

    Read the article

  • Grub bootloader goes fail to boot in dual boot system after installing Ubuntu 12.10

    - by K.K Patel
    I had 12.04 on my dual-boot system. Yesterday I downloaded Ubuntu 12.10, made a bootable USB and chose the Upgrade option in the installer. After installation, GRUB failed to boot my machine. I tried the following to fix the GRUB bootloader. I fixed the same problem with Ubuntu 12.04 using a live USB, but this solution does not work for Ubuntu 12.10. Here is exactly where this solution fails. I followed these steps after booting the live USB and opening a terminal:

    1) sudo fdisk -l to see where Linux is installed
    2) sudo mount /dev/sda9 /mnt where sda9 is my Linux partition
    3) sudo mount /dev/sda9 /mnt/boot
    4) sudo mount --bind /dev /mnt/dev
    5) sudo chroot /mnt (no problems, these steps completed perfectly)
    6) grub-install /dev/sda

    When I type this last command I get the error: source_dir doesn't exist. Please specify --target or --directory. How can I solve this?

    Read the article

  • How to publish paid Android apps if you're not from US/UK

    - by Sheikh Aman
    I was pretty excited while creating one of my apps, but as it turns out you can't actually sign up for Google Checkout if you don't live either in the USA or in the UK. And since Google Checkout is the only way Android Market will pay you, all my efforts seem to be in vain. So because I live in India, I can't sell my apps. I tried contacting Google by various means on this, but haven't got any response so far. I tried searching the web as well, just to find out that one can't be paid in any other way. I am pretty sure that many people here might have gone through the same problem. How did you solve it? I have a PayPal account and an AdSense account as well. Can they help in any way? And if nothing works out, how am I supposed to sell my app?

    Read the article

  • BizTalk 2009 - SQL Server Job Configuration

    - by StuartBrierley
    Following the installation of BizTalk Server 2009 on my development laptop I used the BizTalk Server Best Practice Analyser, which highlighted the fact that two of the SQL Server Agent jobs that BizTalk relies on were not running successfully. Upon investigation it turned out that these jobs needed to be configured before they would run successfully. To configure these jobs open SQL Server Management Studio, expand SQL Server Agent > Jobs and double click on the appropriate job. Select Steps and then edit the appropriate entries.

    Backup BizTalk Server (BizTalkMgmtDb)

    This job is comprised of three steps: BackupFull, MarkAndBackupLog and ClearBackupHistory.

    BackupFull

    exec [dbo].[sp_BackupAllFull_Schedule] 'd' /* Frequency */, 'BTS' /* Name */, '<destination path>' /* location of backup files */

    The frequency here is set/left as daily, and the name is left as BTS. You must provide a full destination path for the backup files to be stored. There are also two optional parameters: a flag that controls whether the job forces a full backup if a partial backup fails, and a parameter to control the time of day to run the full backup (the default is midnight UTC). For example:

    exec [dbo].[sp_BackupAllFull_Schedule] 'd' /* Frequency */, 'BTS' /* Name */, '<destination path>' /* location of backup files */, 0, 22

    MarkAndBackupLog

    exec [dbo].[sp_MarkAll] 'BTS' /* Log mark name */, '<destination path>' /* location of backup files */

    You must provide a destination path for the log backups. Optionally you can also add an extra parameter that tells the procedure to use local time:

    exec [dbo].[sp_MarkAll] 'BTS' /* Log mark name */, '<destination path>' /* location of backup files */, 1

    ClearBackupHistory

    exec [dbo].[sp_DeleteBackupHistory] @DaysToKeep=7

    This will clear out the instances in the MarkLog table older than 7 days.

    DTA Purge and Archive (BizTalkDTADb)

    This job is comprised of a single step, Archive and Purge.

    exec dtasp_BackupAndPurgeTrackingDatabase
        0,    --@nLiveHours tinyint
        1,    --@nLiveDays tinyint = 0
        30,   --@nHardDeleteDays tinyint = 0
        null, --@nvcFolder nvarchar(1024) = null
        null, --@nvcValidatingServer sysname = null
        0     --@fForceBackup int = 0

    Any completed instance that is older than the live days plus live hours will be deleted, as will any associated data. Any data older than the HardDeleteDays will be deleted - this means that those long-running orchestration instances that would otherwise never be purged will at some point have their data cleared down while allowing the instance to continue, thus preventing the DTA database from growing indefinitely. This should always be greater than the soft purge window. The @nvcFolder parameter is the path for the backup files; if this is null the job will not run, failing with the error (visible in SQL Server Management Studio under job activity monitor > view history, against the Archive and Purge step):

    DTA Purge and Archive (BizTalkDTADb) Job failed. The @nvcFolder parameter cannot be null.

    How long you choose to keep instances in the Tracking Database is really up to you. For development I have set this up as:

    exec dtasp_BackupAndPurgeTrackingDatabase 0, 1, 30, '<destination path>', null, 0

    On a live server you may want to adjust these figures:

    exec dtasp_BackupAndPurgeTrackingDatabase 0, 15, 20, '<destination path>', null, 0
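    As a rough illustration, here is how the three Backup BizTalk Server step commands might look once configured; the E:\BizTalkBackups paths below are just placeholders for whatever backup location you use:

    -- BackupFull: daily full backups named BTS, with the optional flag and hour-of-day (UTC) parameters supplied
    exec [dbo].[sp_BackupAllFull_Schedule] 'd', 'BTS', 'E:\BizTalkBackups\Full\', 0, 22

    -- MarkAndBackupLog: log backups to the same location, using local time
    exec [dbo].[sp_MarkAll] 'BTS', 'E:\BizTalkBackups\Logs\', 1

    -- ClearBackupHistory: keep 7 days of history in the MarkLog table
    exec [dbo].[sp_DeleteBackupHistory] @DaysToKeep=7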

    Read the article

  • netbook alternate installation/update

    - by user11847
    I have an Aspire One D255E netbook. I installed 9.10 successfully, but have no internet connection to upgrade to 10.04 or 10.10. I have the 10.10 alternate image (couldn't get 10.04). However, it says that no CD-ROM is present (the netbook boots via live USB), and I directed it to sdb1 but that did not work. Could someone guide me through the steps for installation via the alternate USB only (and no internet)? The internet connections on the 10.04 and 10.10 live USBs worked, but installation hung (non-alternate). Thank you greatly in advance.

    Read the article

  • Daylight saving time: Annoying and pointless [closed]

    - by polemon
    Daylight saving time is a big annoyance for me, and not just because I never know when we set our clocks an hour ahead or an hour back. Setting the clock ahead or back disturbs my time organization and is responsible for my bad mood around that day. From the standpoint of a programmer, it's no less annoying: you always have to check whether it isn't "that date" in the year when you have to work with local time. I hear other people share the views I have on this. Also, I don't see any benefits from it. The supposedly added "extra hour" of sunlight? I don't feel it. In case you live in a region where daylight saving is observed (like Germany, where I live), please tell me how you manage the annoyances that come with it, and (if possible) how to get rid of them, once and for all...

    Read the article

  • Impossible to install Ubuntu 10.10 dual boot with Windows 7 on new Acer desktop computer

    - by Don Myers
    My brother has a brand new Acer desktop with Windows 7. I have done many installs (40+) of Ubuntu starting with 8.10, and have never run into this. I've spent three hours trying to do a dual-boot install of 10.10. When you get to the place where you normally would choose to install as a dual boot or overwrite the existing information on the hard drive, that block is just blank. Nothing. No choices, not even to do a manual partition setup. If you try to go on you get the message "No root file system is defined. Please correct this from the partitioning menu.", but there is nothing in the partitioning menu. I tried a good 10.04 disc also; the same thing happens with it. I ran a GParted live CD, and it shows the hard drive as sda with 3 partitions on the original. sda1 is a small partition called PQService. sda2 is another small partition called System Reserved, and GParted says it is the boot partition. sda3 is the main partition with the operating system (Windows 7) and all of the empty space. There is a little unallocated space at the very beginning and very end of the hard drive. If I go to Places in the live CD, it shows a 640 GB hard disk called Acer, but it also shows a 640 GB hard disk called System Reserved. They are the same disk; there is just one hard drive. If you click Properties on the 640 GB System Reserved disk, it shows all information as unknown. I had to change the boot order in the BIOS in order to run the live CD. The hard drive, instead of being listed as such, is listed as Raid:Raid Ready. Something about the way this computer is set up is preventing Ubuntu from being able to identify the hard drive partitions at all to do an install, even if you were not doing a dual boot and just wanted to overwrite Windows. Is this a bug that needs to be reported? This is a major problem for me and my brother, but also for Ubuntu if new users want to try Ubuntu and find they cannot install it.

    Read the article

  • Questions and Answers from Today's Upgrade Webcast with Roy Swonger

    - by margaret hamburger
    Thanks for attending today's live webcast, 3 Compelling Reasons to Upgrade to Oracle Database 11g with Oracle Database Upgrade Expert Roy Swonger. We had an amazing turnout this morning and responded to more than 40 of your upgrade questions that I just couldn't wait to share with you. Don't worry if you missed today's live webcast. You can register for the On-Demand version to learn about Oracle Database 11g upgrade best practices with real customer examples, plus loads of great upgrade resources for making database upgrades faster and easier. You can also download a copy of our webcast presentation.

    Read the article

  • How Mature is Your Database Change Management Process?

    - by Ben Rees
    How do you get your database schema changes live, onto your production system? As your team of developers and DBAs are working on the changes to the database to support your business-critical applications, how do these updates wend their way through from dev environments, possibly to QA, hopefully through pre-production and eventually to production in a controlled, reliable and repeatable way?

    In this article, I describe a model we use to try and understand the different stages that customers go through as their database change management processes mature, from the very basic and manual, through to advanced continuous delivery practices. I also provide a simple chart that will help you determine "How mature is our database change management process?"

    This process of managing changes to the database – which all of us who have worked in application/database development have had to deal with in one form or another – is sometimes known as Database Change Management (even if we've never used the term ourselves). And it's a difficult process, often painfully so. Some developers take the approach of "I've no idea how my changes get live – I just write the stored procedures and add columns to the tables. It's someone else's problem to get this stuff live. I think we've got a DBA somewhere who deals with it – I don't know, I've never met him/her". I know I used to work that way. I worked that way because I assumed that making the updates to production was a trivial task – how hard can it be? Pause the application for half an hour in the middle of the night, copy over the changes to the app and the database, and switch it back on again? Voila!

    But somehow it never seemed that easy. And it certainly was never that easy for database changes. Why? Because you can't just overwrite the old database with the new version. Databases have a state – more specifically 4TB of critical data built up over the last 12 years of running your business, and if your quick hotfix happened to accidentally delete that 4TB of data, then you're "looking for a new role" pretty quickly after the failed release.

    There are a lot of other reasons why a managed database change management process is important for organisations, besides job security, not least:

    Frequency of releases. Many business managers are feeling the pressure to get functionality out to their users sooner, quicker and more reliably. The new book (which I highly recommend) Lean Enterprise by Jez Humble, Barry O'Reilly and Joanne Molesky provides a great discussion on how many enterprises are having to move towards a leaner, more frequent release cycle to maintain their competitive advantage. It's no longer acceptable to release once per year, leaving your customers waiting all year for changes they desperately need (and expect).

    Auditing and compliance. SOX, HIPAA and other compliance frameworks have demanded that companies implement proper processes for managing changes to their databases, whether managing schema changes, making sure that the data itself is being looked after correctly, or other mechanisms that provide an audit trail of changes.
    We've found, at Red Gate, that we have a very wide range of customers using every possible form of database change management imaginable. Everything from "Nothing – I just fix the schema on production from my laptop when things go wrong, and write it down in my notebook" to "A full Continuous Delivery process – any change made by a dev gets checked in and recorded, fully tested (including performance tests) before a (tested) release is made available to our Release Management system, ready for live deployment!". And everything in between, of course.

    Because of the vast number of customers using so many different approaches we found ourselves struggling to keep on top of what everyone was doing – struggling to identify patterns in customers' behavior. This is useful for us, because we want to try and fit the products we have to different needs – different products are relevant to different customers and we waste everyone's time (most notably, our customers') if we're suggesting products that aren't appropriate for them. If someone visited a sports store, looking to embark on a new fitness program, and the store assistant suggested the latest $10,000 multi-gym, complete with multiple weights mechanisms, dumb-bells, pull-up bars and so on, then he's likely to lose that customer. All he needed was a pair of running shoes!

    To solve this issue – in an attempt to simplify how we understand our customers and our offerings – we built a model. This is an attempt to classify our customers into some sort of model or "Customer Maturity Framework", as we rather grandly term it, which somehow simplifies our understanding of what our customers are doing. The great statistician George Box (amongst other things, the "Box" in the Box-Jenkins time series model) gave us the famous quote: "Essentially all models are wrong, but some are useful". We've taken this quote to heart – we know it's a gross over-simplification of the real world of how users work with complex legacy and new database developments. Almost nobody precisely fits into one of our categories. But we hope it's useful and interesting.

    There are actually a number of similar models that exist for more general application delivery. We've found these from ThoughtWorks/Forrester, from InfoQ and others, and initially we tried just taking these models and replacing the word "application" with "database". However, we hit a problem. From talking to our customers we know that users are far less advanced along the road of mature database change management than they are for application development. As a simple example, no application developer who wants to keep his/her job would develop an application for an organisation without source controlling that code. Sure, he/she might not be using an advanced Gitflow branching methodology, but they'll certainly be making sure their code gets managed in a repo somewhere with all the benefits of history, auditing and so on. But this certainly isn't the case (yet) for the database – a very large segment of the people we speak to have no source control set up for their databases whatsoever, even at the most basic level (for example, keeping change scripts in a source control system somewhere). By the way, if this is you, Red Gate has a great whitepaper here, on the barriers people face getting a source control process implemented at their organisations.
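    To make that "most basic level" concrete, here is a sketch of the kind of versioned change script a team might keep in source control; the table and column names are purely illustrative, and tooling can generate something equivalent for you:

    -- 0042_add_modifieddate_to_orders.sql
    -- Idempotent upgrade script: safe to re-run against an environment that already has the column
    IF NOT EXISTS (
        SELECT 1 FROM sys.columns
        WHERE object_id = OBJECT_ID(N'dbo.Orders') AND name = N'ModifiedDate'
    )
    BEGIN
        ALTER TABLE dbo.Orders ADD ModifiedDate DATETIME2 NULL;
    END;

    Even getting simple scripts like this into a repository is the step that, as noted above, a large segment of database teams have not yet taken.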
    This difference in maturity is the same as you move into areas such as continuous integration (common amongst app developers, relatively rare for database developers) and automated release management (growing amongst app developers, very rare for the database). So, when we created the model we started from scratch and biased the levels of maturity towards what we actually see amongst our customers.

    But what are these stages? And what level are you? The table below describes our definitions for four levels of maturity – Baseline, Beginner, Intermediate and Advanced. As I say, this is a model – you won't fit any of these categories perfectly, but hopefully one will ring true more than others. We've also created a PDF with a flow chart to help you find which of these groups most closely matches your team: Download the Database Delivery Maturity Framework PDF here

    Level D1 – Baseline
    Work directly on live databases
    Sometimes work directly in production
    Generate manual scripts for releases; sometimes use a product like SQL Compare or similar to do this
    Any tests that we might have are run manually

    Level D2 – Beginner
    Have some ad-hoc DB version control, such as manually adding upgrade scripts to a version control system
    An attempt is made to keep production in sync with development environments
    There is some documentation and planning of manual deployments
    Some basic automated DB testing is in process

    Level D3 – Intermediate
    The database is fully version-controlled with a product like Red Gate SQL Source Control or SSDT
    Database environments are managed
    The production environment schema is reproducible from the source control system
    There are some automated tests
    Have looked at using migration scripts for difficult database refactoring cases

    Level D4 – Advanced
    Using continuous integration for database changes
    Build, testing and deployment of DB changes carried out through a proper database release process
    Fully automated tests
    The production system is monitored for fast feedback to developers

    Does this model reflect your team at all? Where are you on this journey? We'd be very interested in knowing how you get on. We're doing a lot of work at the moment, at Red Gate, trying to help people progress through these stages. For example, if you're currently not source controlling your database, then this is a natural next step. If you are already source controlling your database, what about the next stage – continuous integration and automated release management? To help understand these issues, there's a summary of the Red Gate Database Delivery learning program on our site, alongside a Patterns and Practices library here on Simple-Talk and a Training Academy section on our documentation site to help you get up and running with the tools you need to progress. All feedback is welcome and it would be great to hear where you find yourself on this journey!

    This article is part of our database delivery patterns & practices series on Simple Talk. Find more articles for version control, automated testing, continuous integration & deployment.

    Read the article

  • Problem with all fn Keys for Thinkpad W520

    - by Ludwig
    I have a problem with the Fn keys. I used a live version of Ubuntu 11.10 on a USB stick to install Ubuntu on my Thinkpad W520. The Fn keys worked perfectly in the live version, but do not work in the installed version. I would like to know what I have messed up. I can't seem to find the fix on Google, except for inserting the shortcuts by hand, which seems a little brutal, since I know they can work without prior configuration. Thanks in advance.

    Edit: The keys just started working out of the blue. I don't really know what my system did, or what maybe I did. However, all the keys work fine now. Sorry for leaving this now-useless question here.

    Read the article

  • SOA & BPM Specialized Partners Only! New Service to Promote Your SOA & BPM Events at oracle.com/events

    - by JuergenKress
    The Partner Event Publisher has just been made available to all SOA & BPM specialized partners in EMEA. Partners now have the opportunity to publish their events to the Oracle.com/events site and spread the word on their upcoming live in-person and/or live webcast events. See the demo below and click here for more information.

    SOA & BPM Partner Community: For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Technorati Tags: Specialization, marketing services, oracle events, SOA Community, Oracle SOA, Oracle BPM, BPM Community, OPN, Jürgen Kress

    Read the article

  • Dual booting Ubuntu 12.04: UEFI and Legacy

    - by cmhughes
    I'm trying to dual boot Ubuntu 12.04 (or 12.10) with Windows 8 on a new Sony Vaio, but have run into some problems :) Specifically, my problems seem to come from choosing UEFI or Legacy as the Bootmode in the BIOS. Here is what I have found so far: Windows 8 needs to boot using UEFI, and doesn't work in Legacy mode Ubuntu (both 12.04 and 12.10) needs to boot using Legacy, and won't boot (at least from the live disk) in UEFI mode I have been able to boot Ubuntu using a live USB disc, provided that I change the Bootmode to Legacy. I haven't committed to installing it yet, because I don't really understand the consequences. My main concerns are that instead of simply selecting Windows or Ubuntu in Grub, I would also have to change my Bootmode every single time, which seems like a lot more trouble than it should be. So, the question: how can I install Ubuntu 12.04 or 12.10 in UEFI boot mode?

    Read the article

  • Why is it necessary to install EFI/rEFInd/UEFI/... on a SD Card since the Macbook Pro seems to already have it?

    - by user170794
    Dear askubuntu members, I own a Macbook Pro (late 2009), and when I boot the laptop while holding the alt key, there is an EFI screen, so EFI is installed on... the firmware? I had a few troubles with my hard disk, so I had to change it, but I haven't installed OS X; I have only installed Ubuntu, and the EFI screen is still there, which is surely a good thing. As the new hard disk is making trouble again, I am using Puppy Linux, booting from a CD each time, which is uncomfortable. So I am trying to have Ubuntu installed on an SD card.

    After having spent many months on the internet grabbing information anywhere I can and trying several things, I applied this method: http://www.weihermueller.de/mac/ I succeeded in making one SD card recognizable by the EFI of my laptop (holding the alt key at boot), but I have installed nothing on it yet, as I fear losing the recognizable-by-EFI part. I haven't succeeded in producing the same result on another SD card.

    I have a bootable USB key of Ubuntu (yipee) which works like a live CD, made with the help of Universal Linux UDF Creator, found here: http://www.pendrivelinux.com/universal-usb-installer-easy-as-1-2-3/ on which I have put Ubuntu 13.04 64-bit, retrieved from the official repositories. Even though I have to add the "nouveau.noaccel=1" option to the grub command line launching Linux, it works (yipee again) properly as a live CD. When installing Ubuntu I come across the "where do I wanna put Ubuntu" window, and I partition another SD card into the EFI part (40MB) and the Linux part (between 15GB and 16GB). The installation works fine and finishes with no problem. But on reboot, the SD card where Linux is installed is not recognized by the EFI; the icons are: the CD (Puppy Linux), the USB stick (from Linux UDF Creator), the hard drive (the formerly-working Ubuntu 12), but no fourth icon for the SD card whatsoever.

    As the title of this thread suggests, I am wondering:
    why is there a need for EFI to be installed on the SD card, since EFI seems to be on my laptop anyway?
    why does EFI have to be on a different partition than the Linux one? How do both parts communicate?
    why isn't the EFI part on the SD card, made with the help of the live-USB key, recognized?
    on the EFI partition, there is a folder named "EFI" which contains another folder named "ubuntu" which contains a file named "grubx64.efi"; why is there a thing called grub? Is it the Linux grub where one can choose either to boot, to boot in safe mode, etc.?

    Thank you for your patience, looking forward to any kind of answer, Julien

    Read the article
