Search Results

Search found 61239 results on 2450 pages for 'application management suite for siebel'.

Page 94 of 2450

  • SQL Server Management Studio not scripting all objects

    - by Ian Boyd
    I've been attempting to script a database using SQL Server 2005 Management Studio, and I cannot get it to script some objects. It scripts others but skips some. I can provide detailed screenshots of:
      - the options being selected, including all tables
      - the folder where the script files will go
      - the folder being empty before scripting
      - the scripting process saying Success when scripting a table
      - the destination folder no longer empty, with a hundred or so script files
      - the scripts of some tables not being in the folder
    Earlier, SSMS would also not script some views. Is it a known issue that the Generate Scripts task does not generate scripts?

    Update: Known issue on Microsoft Connect, but Microsoft couldn't repro the steps, so they closed the ticket. Fails on SQL Server 2005 and also on SQL Server 2008.

    Update Two: Some basic questions:
      1. What version of SQL Server?
         Microsoft SQL Server 2000 - 8.00.194 (Intel X86)
         Microsoft SQL Server 2005 - 9.00.3042.00 (Intel X86)
         Microsoft SQL Server 2008 - 10.0.2531.0 (Intel X86)
         Microsoft SQL Server 2005 Management Studio: 9.00.4035.00
         Microsoft SQL Server 2008 Management Studio: 10.0.1600.22
      2. What O/S are you running on?
         Windows Server 2000, Windows Server 2003, Windows Server 2008
      3. How are you logging in to SQL Server?
         sa/password and trusted authentication
      4. Have you verified your account has full access to all objects?
         Yes, I have access to all objects.
      5. Can you use the objects that fail to script (e.g. SELECT TOP(10) * FROM nonScriptingTable)?
         Yes, all objects work fine. SQL Server Enterprise Manager can script the objects fine.

    Update Three: They fail no matter what version of SQL Server you script against. It wasn't a problem in Enterprise Manager:

      Client Tools   SQL Server 2000   SQL Server 2005   SQL Server 2008
      ============   ===============   ===============   ===============
      2000           Yes               n/a               n/a
      2005           No                No                No
      2008           No                No                No

    Update Four: No errors found in the database using:

      DBCC CHECKDB
      GO
      DBCC CHECKCONSTRAINTS
      GO
      DBCC CHECKFILEGROUP
      GO
      DBCC CHECKIDENT
      GO
      DBCC CHECKCATALOG
      GO
      EXECUTE sp_msforeachtable 'DBCC CHECKTABLE (''?'')'

    Honk if you hate SSMS.
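
    One way to pin down exactly which objects the Generate Scripts task skipped is to compare the tables the database itself reports against the script files that actually landed in the output folder. Below is a minimal sketch of that check, assuming Python with the pyodbc package, trusted authentication, and that Generate Scripts produces one .sql file per object whose file name contains "schema.Table"; the server, database, and folder names are placeholders.

      # Sketch: list tables with no matching file in the Generate Scripts output.
      # Assumes pyodbc, trusted authentication, and one .sql file per object.
      import os
      import pyodbc

      OUTPUT_DIR = r"C:\Scripts\MyDatabase"              # placeholder output folder
      CONN_STR = ("DRIVER={SQL Server};SERVER=localhost;"
                  "DATABASE=MyDatabase;Trusted_Connection=yes")  # placeholder

      def expected_tables(conn_str):
          """Return 'schema.table' names as reported by the database."""
          with pyodbc.connect(conn_str) as conn:
              rows = conn.cursor().execute(
                  "SELECT s.name, t.name "
                  "FROM sys.tables t JOIN sys.schemas s ON s.schema_id = t.schema_id"
              ).fetchall()
          return set("%s.%s" % (schema, table) for schema, table in rows)

      def missing_scripts(conn_str, output_dir):
          files = [f.lower() for f in os.listdir(output_dir) if f.lower().endswith(".sql")]
          # Assumption: each generated file name contains the 'schema.Table' it scripts.
          return sorted(name for name in expected_tables(conn_str)
                        if not any(name.lower() in f for f in files))

      if __name__ == "__main__":
          for name in missing_scripts(CONN_STR, OUTPUT_DIR):
              print("No script generated for", name)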

    Read the article

  • Datacenter IP Addressing and DNS Management

    - by user65248
    Hello everyone. We are setting up a small datacenter: about 300 amps of power and at most 50 racks. I mention this so you can picture the size and requirements. I have studied networking, mostly Microsoft and Windows-based systems, but I don't understand how IP addressing and DNS management and configuration work in a datacenter, and unfortunately I have to set everything up myself, although we will have some staff to help with the work. Now my questions:

    Datacenter IP addressing
      - Suppose we have been given a block of 200 IP addresses by our ISP. How do I manage this block? Is there any software out there to simplify it?
      - I have heard that using a DHCP server in a datacenter is not recommended; if that's not the case, what would you say about the Microsoft DHCP server, considering that we need backup servers in case of failure?
      - How do I assign a block of IPs to a specific rack? I know it differs between software and management tools, but how is it normally done?
      - IP addresses are exposed to the whole network. What if a customer tries to use an IP address that is not assigned to their server or rack? How can I prevent this, or at least track IP usage?

    DNS management
      - I am going to set up at least two servers as our DNS servers. I know nothing about datacenter DNS, but I have configured DNS servers on normal networks and for web servers. What exactly needs to be done for DNS in a datacenter that is not done on a normal network?
      - How do I configure PTR records? Why can't I configure PTR records on my web server's DNS server, so that it has to be done on the datacenter's DNS servers? What is different about the datacenter DNS servers that allows this? I know the question is very simple, but I'm confused.
      - Is there any software out there that handles the whole thing, i.e. automatically adds records to DNS and also manages IP addresses?

    Thanks in advance
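
    For the first question - tracking an assigned block and carving it up per rack - even a short script built on Python's standard ipaddress module can serve as a starting point before adopting a full IPAM product. The block, prefix length, and rack names below are made up for illustration.

      # Sketch of per-rack IP allocation using Python's ipaddress module.
      # The allocation (203.0.113.0/24) and rack names are placeholders.
      import ipaddress

      allocation = ipaddress.ip_network("203.0.113.0/24")      # block from the ISP
      rack_subnets = list(allocation.subnets(new_prefix=28))   # 16 addresses per rack
      assignments = {}                                         # rack name -> subnet

      def assign_rack(rack_name):
          """Hand the next free /28 to a rack and remember the mapping."""
          for subnet in rack_subnets:
              if subnet not in assignments.values():
                  assignments[rack_name] = subnet
                  return subnet
          raise RuntimeError("No free subnets left in the allocation")

      def owner_of(address):
          """Answer: which rack is this IP address supposed to belong to?"""
          ip = ipaddress.ip_address(address)
          for rack, subnet in assignments.items():
              if ip in subnet:
                  return rack
          return None

      if __name__ == "__main__":
          assign_rack("rack-01")
          assign_rack("rack-02")
          print(owner_of("203.0.113.18"))   # -> rack-02

    Preventing a customer from simply configuring an address that was never assigned to them (the fourth point) is usually enforced at the network layer - for example with per-port ACLs or DHCP snooping and IP source guard on the access switches - rather than in tracking software like this.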

    Read the article

  • iis 7.5 - WFF and ARR farm management

    - by smackaysmith
    We have two test web farms (IIS 7.5). The Florida web farm has two ARR servers and two content servers. The ARR servers have WFF and NLB installed. The ARR setup uses a shared config located on a file share. The content servers do not have WFF installed. There is one web farm, and it's managed on an ARR server.

    The Illinois web farm also has two ARR servers and two content servers. The ARR servers have WFF and NLB installed, and they use a shared config located on a share. One of the content servers has WFF installed, which makes it the controller; it's also the primary content server. Apparently, Illinois isn't properly configured. From what we've pieced together from various IIS.net articles and this post (http://ruslany.net/2010/07/web-farm-framework-2-0-overview/), the controller should be one of the ARR servers (like our Florida setup).

    The thing is, Florida's controller doesn't have a Primary server, nor can you set one of the content servers as Primary. It doesn't have the management piece showing the Trace messages when you click the Servers node (from the IIS console, Server Farms/FLFarm/Servers; see http://ruslany.net/wp-content/uploads/2010/07/WebFarm8.png). That management piece does exist in the Illinois farm, but that's a bad configuration. What are we missing, given that our Florida configuration has neither the Primary and Secondary content servers nor the management piece? I have looked for IIS role differences, but there are none.

    Read the article

  • Application that will identify percentage of your system disk bandwidth used on a user-application by user-application basis?

    - by Warren P
    I always (subjectively) feel my computer is far too slow (however fast it is), so I'm always looking for ways to measure and understand what my computer is actually doing that makes it seem "slow" to me. It has been my observation that my software-developer workload is most often disk-bound (I am waiting on disk I/O) rather than CPU-bound. What has made it worse is that I am using a corporate PC with in-memory active-scanning antivirus software that I have no control over, plus some IT-department-mandated services that seem to soak up a lot of the available hard-disk bandwidth.

    The best tool I have seen (in Windows 7) is the Resource Monitor, which I usually access from the button in Task Manager. Its disk I/O page, however, seems to label disk activity at a very low level (for example, showing the Volume Shadow Storage flushing information obviously written by something ELSE other than VSS itself, and writes to pagefile.sys that are obviously due to virtual-memory faults in some application).

    What I would like to know is whether a utility exists that can add up all direct disk input and output by user-level process, or find the process or service that caused the VM or VSS activity. That way, I hope, you could establish a real idea of how much of your computer's precious disk-subsystem bandwidth is attributable to a particular application. Here's a scenario:
      - MyApp.exe writes 100k/s and reads 100k/s directly.
      - VSS ends up writing another 100k/s.
      - Page faults caused inside MyApp.exe cause another 100k/s of writes.
    So the total "cost" of MyApp.exe running, over a period of time (say, 1 second), is 400k/s, whereas you can only directly observe half of that in Resource Monitor. Is there a smarter disk-I/O-watching piece of software I can use?
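
    For the directly attributable part of that scenario (MyApp.exe's own reads and writes), a small script on top of the psutil package can at least total per-process I/O between two samples; it cannot attribute the indirect VSS or pagefile traffic the question is really about. A minimal sketch, using the one-second interval from the scenario:

      # Sketch: per-process disk I/O deltas from two psutil samples.
      # Only counts I/O issued directly by each process; pagefile and
      # Volume Shadow Copy traffic is not attributed back to the app.
      import time
      import psutil

      def sample():
          counters = {}
          for proc in psutil.process_iter(["name"]):
              try:
                  io = proc.io_counters()
              except (psutil.AccessDenied, psutil.NoSuchProcess):
                  continue
              counters[proc.pid] = (proc.info["name"], io.read_bytes, io.write_bytes)
          return counters

      def top_disk_users(interval=1.0, count=10):
          before = sample()
          time.sleep(interval)
          after = sample()
          deltas = []
          for pid, (name, reads, writes) in after.items():
              if pid in before:
                  _, r0, w0 = before[pid]
                  deltas.append(((reads - r0) + (writes - w0), name, pid))
          return sorted(deltas, reverse=True)[:count]

      if __name__ == "__main__":
          for delta_bytes, name, pid in top_disk_users():
              print("%-30s pid=%-6d %10d bytes in the last second" % (name, pid, delta_bytes))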

    Read the article

  • Deploy EAR with Websphere Application Server wsadmin.bat without losing security role-mapping?

    - by Tommy
    We're running CI towards our WAS with wsadmin.bat. The applications are updated with this command:

      $AdminApp update ${projectName}EAR app {-operation update -update.ignore.new -contents {${artifactsDir}/${projectName}-${buildVersion}.ear}}

    This causes all of the "Security role to user/group mapping" settings to reset, even though all the other settings are preserved by -update.ignore.new. Does anyone know how to fix this?
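
    One possible direction, not confirmed by the original post, is to re-supply the role bindings explicitly during the update via the -MapRolesToUsers option so they are not reset to the EAR's defaults. A rough Jython-flavoured wsadmin sketch follows; the application name, EAR path, role row, and especially the column order of -MapRolesToUsers are assumptions that should be verified against the output of AdminApp.view for your application.

      # wsadmin Jython sketch (wsadmin.bat -lang jython -f updateEar.py).
      # Assumption: passing -MapRolesToUsers along with the update keeps the
      # desired role bindings instead of letting them reset to the EAR defaults.
      appName = 'MyAppEAR'                      # placeholder application name
      earPath = '/builds/MyApp-1.2.3.ear'       # placeholder artifact path

      # Capture the current bindings first; the printed table also shows the
      # exact column layout expected by -MapRolesToUsers on your version.
      print AdminApp.view(appName, ['-MapRolesToUsers'])

      # Assumed row layout: [role, everyone?, all-authenticated?, users, groups]
      AdminApp.update(appName, 'app',
          '[-operation update -update.ignore.new -contents ' + earPath +
          ' -MapRolesToUsers [[MyRole No No ciuser appadmins]]]')

      AdminConfig.save()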

    Read the article

  • How can one select an application to switch to in task switcher with track pad (MacBook Pro, Mac OS

    - by index
    Three-finger swipe left/right: the task switcher shows up (a horizontal bar with app symbols). Two-finger swipe: highlights an icon and shows the app name. ??: switches to the highlighted app.

    Since the highlighting is independent of the mouse cursor, a click doesn't work here without moving the mouse cursor. But I don't want that (locate the mouse cursor, move the extra mile...). Maybe Better Touch Tool allows configuring a click to act as SPACE or RETURN in the task switcher? But how did Apple intend this to be used?

    Read the article

  • IIS 7 Application Pools using a different amount of memory on multiple servers behind a load balancer

    - by Jim March
    We have six servers in a web farm behind an F5. There are approximately 25 app pools on each of these servers. On servers 1-5 the app pools consume approximately 500 MB Private Bytes and 5 GB Virtual Bytes. On server 6 the app pools consume approximately 800 MB Private Bytes and 8 GB Virtual Bytes. I cannot seem to figure out why we have this difference. The code is exactly the same on each box, and we replicate the applicationHost.config between the boxes, so the application configurations are identical. The only difference seems to be that this box consumes more RAM and in turn ends up using a lot more CPU.

    During Black Friday we observed the CPU on server 6 spiking to 100% and noticed that the % Memory Commit was also near 100%, while the rest of the farm was closer to 50% utilization. Pulling the 6th server from the load balancer dropped CPU/memory on it back to normal and did not cause a noticeable strain on the other servers.

    Read the article

  • A process serving application pool 'X' reported a failure. The process id was 'Y'. The data field c

    - by born to hula
    I have a WCF web service which is kept under an application pool on IIS. Lately I've been getting "Service Unavailable" when I try to make calls to this web service. The first thing I tried was restarting the application pool. I did, and after a couple of seconds it crashed and stopped. Looking at the Event Viewer, I found these messages, which so far haven't helped me find where the problem is:

      A process serving application pool 'X' reported a failure. The process id was '11616'. The data field contains the error number. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

    After getting a couple of these, I got this one:

      Application pool 'X' is being automatically disabled due to a series of failures in the process(es) serving that application pool. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

    I've already checked permissions and application pool configuration, but everything seems to be OK. Has anyone been through this? Thanks in advance.

    Read the article

  • SQLAuthority News – Three Posts on Reporting – T-SQL Tuesday #005

    - by pinaldave
    If you are following my blog, you already know that I am more of a "T-SQL and Performance Tuning" type of person. I do have a good understanding of the Business Intelligence suite, and I also deliver training sessions on the subject. When I was writing the blog post for T-SQL Tuesday #005 – Reporting, I realized that I have written posts that clearly explain how to generate reports using SQL Server Management Studio. Here is a quick recap of how one can use SSMS and its out-of-the-box reports, which can help many developers. Please note that they can be resource-intensive as well, so please use SSMS carefully.
      - SQL SERVER – Generate Report for Index Physical Statistics – SSMS
      - SQL SERVER – Out of the Box – Activity and Performance Reports from SSMS
      - SQL SERVER – Configure Management Data Collection in Quick Steps – T-SQL Tuesday #005
    Junior developers and DBAs can use these reports right away and can also start learning and exploring most database performance issues with the help of senior DBAs.
    Reference: Pinal Dave (http://blog.SQLAuthority.com)
    Filed under: Pinal Dave, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology Tagged: SQL Reporting, SQL Reports

    Read the article

  • Amit Jasuja's Session at Gartner IAM with Ranjan Jain of Cisco

    - by Naresh Persaud
    If you did not get a chance to attend Amit Jasuja's session at Gartner IAM this week in Las Vegas, here is a summary of the session and a copy of the slides. The agenda featured an introduction by Ray Wagner, Managing VP at Gartner, followed by Amit discussing the trends in Identity and Access Management shaping Oracle's strategy.

    Today we are seeing the largest re-architecture in a decade. Every business from manufacturing to retail is transforming the way they do business. Manufacturing companies are becoming manufacturing services companies. Retail organizations are embracing social retail. Healthcare is being delivered online around the clock. Identity Management is at the center of the transformation. Whether you are Toyota embracing a social network for cars or launching the next iPhone, the identity of the user provides context to enable the interaction and secure the experience. All of these require greater attention to the context of the user and externalizing applications for customers and employees.

    Ranjan discussed how Cisco is transforming by integrating 1800 applications into a single access management framework and consolidating 3M users across 4 data centers to support internal and external processes. David Lee demonstrated how to use Oracle Access Manager 11g R2 on a mobile application to sign on across multiple applications while connecting mobile applications to a single access control policy.

    Read the article

  • SQL SERVER – Copy Column Headers from Resultset – SQL in Sixty Seconds #027 – Video

    - by pinaldave
    SQL Server Management Studio returns results in Grid View, Text View, and to a file. When we copy results from Grid View to Excel, there is a common complaint that the column header displayed in the resultset is not copied to Excel. I often spend time performance tuning databases, and I run many DMVs in SSMS to get a quick view of the server. In my case it is almost certain that I always need the column headers when I copy my data to Excel or anywhere else. SQL Server Management Studio has two different ways to do this.

    Method 1: Ad hoc. When the result is rendered, you can right-click on the resultset and click on Copy Header. This will copy the headers along with the resultset. Additionally, you can use the shortcut key CTRL+SHIFT+C for copying column headers along with the resultset.

    Method 2: Option setting at the SSMS level. This is an SSMS-level setting, and I keep this option always selected, as I often need the column headers when I select a resultset. Go to Tools >> Options >> Query Results >> SQL Server >> Results to Grid and check the box "Include column header when copying or saving the results."

    Both of these methods are discussed in the following SQL in Sixty Seconds video. Here is the code used in the video.

    Related Tips in SQL in Sixty Seconds:
      - Copy Column Headers in Query Analyzers in Result Set
      - Getting Columns Headers without Result Data – SET FMTONLY ON

    If we like your idea we promise to share with you educational material.
    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • CSO Summit @ Executive Edge

    - by Naresh Persaud
    If you are attending the Executive Edge at Open World, be sure to check out the sessions at the Chief Security Officer Summit. Former Senior Counsel for the National Security Agency, Joel Brenner, will be speaking about his new book "America the Vulnerable". In addition, PwC will present a panel discussion on "Crisis Management to Business Advantage: Security Leadership". See below for the complete agenda.

    TUESDAY, October 2, 2012 - Chief Security Officer Summit
      10:00 a.m.–10:15 a.m.  Welcome
                             Dave Profozich, Group Vice President, Oracle
      10:15 a.m.–11:00 a.m.  America the Vulnerable
                             Joel Brenner, former Senior Counsel, National Security Agency
      11:00 a.m.–11:20 a.m.  The Threats are Outside, the Risks are Inside
                             Sonny Singh, Senior Vice President, Oracle
      11:20 a.m.–12:20 p.m.  From Crisis Management to Business Advantage: Security Leadership
                             Moderator: David Burg, Partner, Forensic Technology Solutions, PwC
                             Panelists: Charles Beard, CIO and GM of Cyber Security, SAIC
                                        Jim Doggett, Chief Information Technology Risk Officer, Kaiser Permanente
                                        Chris Gavin, Vice President, Information Security, Oracle
                                        John Woods, Partner, Hunton & Williams
      12:20 p.m.–1:30 p.m.   Lunch, Union Square Tent
      1:30 p.m.–2:00 p.m.    Securing the New Digital Experience
                             Amit Jasuja, Senior Vice President, Identity Management and Security, Oracle
      2:00 p.m.–2:30 p.m.    Securing Data at the Source
                             Vipin Samar, Vice President, Database Security, Oracle
      2:30 p.m.–3:00 p.m.    Security from the Chairman's Perspective
                             Jeff Henley, Chairman of the Board, Oracle
                             Dave Profozich, Group Vice President, Oracle

    Read the article

  • Security Newsletter November Edition is Out

    - by Tanu Sood
    The November edition of the Security Inside Out Newsletter is now out. This month's newsletter captures the highlights from Oracle OpenWorld. The conference registration broke all past records, and so did all the Security-related events and activities at OpenWorld: Security keynotes, conference sessions, hands-on labs, product demonstrations, and the very successful Executive Edge @ OpenWorld: Chief Security Officer Summit. The main feature discusses the key topics and trends compiled from across all the Security-related sessions.

    The newsletter also features an interview with Amit Jasuja, Senior Vice President, Security and Identity Management at Oracle. Amit discusses the key trends in the industry and how these have helped shape innovation in the latest release of the Oracle Identity Management solution set. If you are looking at cloud, social and mobile and are concerned about security, you don't want to miss this feature.

    As always, the newsletter captures both recent and upcoming Security and Identity Management events, conferences, training, news and more. So, if you haven't done so, we recommend you subscribe to the Security Inside Out Newsletter today. We'd love to hear from you. Let us know some topics you'd like to see covered in upcoming editions, or just let us know how we are doing. We look forward to hearing from you.

    Read the article

  • Only One Month to OpenWorld-San Francisco!

    - by Stephen Slade
    From around the world, the city is expecting 50,000+ guests to flock to this annual extravaganza. Over 2,000 sessions will focus on Oracle's latest product offerings, customer case studies, panels of experts and a variety of other hardware, technology, middleware and applications topics. For those interested in the latest capabilities delivered by Oracle's supply chain applications, the 'Focus-On' documents are now available to help guide you in your schedule builder. Schedule builder lets you create a personalized agenda for the sessions you wish to attend, such as:

    Monday, October 1, 2012
      3:15 pm–4:15 pm    General Session: Supply Chain Management—Strategy, Update, and Roadmap
                         Richard Jewell, Senior Vice President, Applications Development, Oracle
                         Moscone West, Level 2, Room 3014

    Tuesday, October 2, 2012
      10:15 am–11:15 am  Oracle Fusion Supply Chain Management: Overview, Strategy, Customer Experiences, and Roadmap
                         Jon Chorley, CSO & VP, Product Strategy, Oracle
                         Moscone West, Level 2, Room 2006

    There is an exciting lineup of about 100 supply chain sessions at OpenWorld. Contact your sales rep or Oracle Partner to obtain a copy of the most current Focus-On document, segmented by pillars such as Manufacturing, Maintenance/EAM, Value Chain Planning, Value Chain Execution, Procurement and Agile/Product Lifecycle Management. They will provide you with a better-informed view to schedule your time in San Francisco.

    Read the article

  • It's The End of Work as We Know It, But I Feel Fine

    - by Naresh Persaud
    If you are attending Open World this year, don't miss Amit Jasuja's session on trends in Identity Management. This session will take place on Monday, October 1st, in Moscone West at 10:45. You can join the conversation on Twitter as Amit Jasuja discusses the trends that are shaping Identity Management as a market and how Oracle is responding to these secular trends; use hashtag OracleIDM. In addition, here's a list of the sessions in the Identity Management track.

    In Amit's session, he will discuss how the workplace is changing. The pace of technology is accelerating, and work is no longer a place but rather an activity. We are behaving socially in our professional lives, and our professional responsibilities are encroaching on our social lives. The net result is that we will need to change the way we work and collaborate. Work is anytime and anywhere. This impacts the dynamics of teams and how they access information and applications. Our teams span multiple organizations, and "the new work order" means enabling the interaction and securing the experience. It is the end of work as we know it, both economically and technologically. Join Amit for this session and you will feel much better about the changing workplace.

    Read the article

  • How to handle growing QA reporting requirements?

    - by Phillip Jackson
    Some Background: Our company is growing very quickly - in 3 years we've tripled in size, and there are no signs of stopping any time soon. Our marketing department has expanded, and our IT requirements have as well. When I first arrived, everything was managed in Dreamweaver and Excel spreadsheets, and we've worked hard to implement bug tracking, version control, continuous integration, and multi-stage deployment. It's been a long, hard road, and now we need to get more organized.

    The Problem at Hand: Management would like to track, per developer, who is generating the most issues at the QA stage (post unit testing, regression, and post-production issues specifically). This becomes a fine balance because many issues can't be reported granularly (e.g. per URL or per "page"), yet that's how Management would like reporting to be broken down. Further, severity has to be taken into account. We have drafted standards for each of these areas specific to our environment. Developers don't want to be nicked for 100+ instances of an issue if it was a problem with an include or inheritance... I had a suggestion to "score" bugs based on severity, but nobody likes that. We can't enter issues for every individual module affected by a global issue.

    [UPDATED] The Actual Questions:
      - How do medium-sized businesses and code shops handle bug tracking, reporting, and providing useful metrics to management?
      - What kinds of KPIs are better metrics for employee performance?
      - What is the most common way to provide per-developer reporting as far as time-to-close, reopens, etc.?
      - Do large enterprises ignore the efforts of individuals and rather focus on the team?

    Some other questions:
      - Is this too granular a level of reporting?
      - Is this considered 'blame culture'?
      - If you were the developer working in this environment, what would you define as a measurable goal for this year to track your progress, with the reward of achieving the goal being a bonus?
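
    The "score bugs based on severity" idea mentioned above can be made concrete in a few lines: weight each defect by severity and aggregate per developer. The record layout, severity names, and weights below are invented for illustration - and whether such a per-developer score is a healthy KPI is exactly what the question is debating.

      # Hypothetical sketch of severity-weighted defect scoring per developer.
      # Field names and weights are invented for illustration only.
      from collections import defaultdict

      SEVERITY_WEIGHTS = {"critical": 8, "major": 3, "minor": 1, "trivial": 0.5}

      def weighted_scores(defects):
          """defects: iterable of dicts with 'developer', 'severity' and 'stage'."""
          scores = defaultdict(float)
          for d in defects:
              # Only count issues found at or after QA, per the question's scope.
              if d["stage"] in ("qa", "regression", "production"):
                  scores[d["developer"]] += SEVERITY_WEIGHTS.get(d["severity"], 1)
          return dict(scores)

      if __name__ == "__main__":
          sample = [
              {"developer": "alice", "severity": "critical", "stage": "production"},
              {"developer": "alice", "severity": "minor", "stage": "qa"},
              {"developer": "bob", "severity": "major", "stage": "regression"},
              {"developer": "bob", "severity": "minor", "stage": "unit"},  # ignored
          ]
          print(weighted_scores(sample))   # {'alice': 9.0, 'bob': 3.0}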

    Read the article

  • A Year of Upheaval for Procurement Professionals-New Report & Webinar

    - by DanAshton
    2013 will see significant changes in priorities and initiatives among procurement professionals as they balance the needs of their enterprises with efforts to add capabilities for long-term procurement success. In response, procurement managers will expand their organization's spend influence via supplier relationship management, sourcing, and category management. These findings are part of the new report, "2013 Procurement Key Issues: Going Deeper and Broader to Deliver Borderless Procurement Services," by The Hackett Group. The authors say that compared to similar studies over the last five years, 2013 is registering the greatest year-over-year changes in priorities for both procurement performance and capability issues.

    Three Important Priorities: The survey found that procurement professionals are focusing their attention in three key areas.
      - Cost reduction. Controlling expenses is always a high priority, but with 90 percent of the respondents now placing this at the top of their performance concerns, the Hackett analysts say this "clearly shows that, for better or worse, cost reduction is king" in 2013.
      - Technology innovation. Innovation has shot up significantly in the priority rankings and is now tied with spend influence for second place among procurement professionals. Sixty-five percent of the survey participants said pursuing game-changing innovation and technology is a top procurement initiative.
      - Managing supply risk. This area registered a sharp rise in importance because of its role in protecting profits, Hackett says. Supplier compliance with performance milestones and regulatory requirements is receiving particular attention, with an emphasis on efficient management of cross-functional workflows. "These processes create headaches for suppliers and buyers alike, and can detract from strategic value creation when participants are bogged down in processing paper and spreadsheets," the report explains.

    For more insights into the current state of the procurement industry, download the full report, "2013 Procurement Key Issues: Going Deeper and Broader to Deliver Borderless Procurement Services," and watch a webcast featuring the Global Procurement Advisory Practice Leader for The Hackett Group, Chris Sawchuk, and the Managing Supervisor of Supply Chain Processes and Systems for Ameren, Chris Nelms.

    Read the article

  • What is the role of traditional issue tracker when Scrum / Kanban board is used?

    - by Borek
    From a very high-level view, it seems to me there are generally two types of project management tools:
      1. Traditional issue trackers like Fogbugz, JIRA, BugZilla, Trac, Redmine, etc.
      2. Virtual card boards / agile project management tools like Pivotal Tracker, GreenHopper, AgileZen, Trello, etc.

    Sure, they overlap in one way or another - e.g. Pivotal Tracker tasks can be imported into JIRA, GreenHopper itself is implemented on top of the JIRA issue base, etc. - but I think one can still see the difference in orientation between the two types of tools. Traditional issue trackers seem to be used even in companies that otherwise do agile project management. My question is, why do they do that? I also feel that we should use an issue tracker in my company, but when I think about it, I'm not actually sure why we would need it. For example, Trello development seems to be managed using Trello itself (see this virtual wall) even though they have access to Fogbugz, one of the best issue trackers around. So maybe we don't need a traditional issue tracker when we'll be doing 100% of our work in an agile manner using one of the agile PM tools?

    Read the article

  • Where is Oracle Utilities Application Framework V3?

    - by Anthony Shorten
    You may have noticed that the latest version of the Oracle Utilities Application Framework is V4.0.1, while the last release of the Framework was V2.2. So what happened to V3? The short answer is that there is no V3 of the framework. The long answer is that the Oracle Utilities Application Framework has long been associated with Oracle Utilities Customer Care And Billing and Oracle Enterprise Taxation Management only. As more and more of the Oracle Tax And Utilities products are migrated onto the framework, the association between the framework and the original products is less appropriate. Therefore it was decided to pick a version number that emphasizes the decoupling of the releases of the Framework from any particular product.

    To illustrate this, the Oracle Mobile Workforce Management (MWM) V2.0.0 product uses Oracle Utilities Application Framework V4.0.1. If we used the old numbering scheme, then MWM would be V4.0.1, which makes no sense given that the last release of MWM was V1.x. The framework has its own development team and product management, and it basically has its own schedule (though it is still influenced by the products that use it, which makes sense). So that's the reasoning behind the version numbering change for the framework.

    Read the article

  • Notification framework for object lifecycle

    - by rlandster
    I am looking for an application, framework, or library that would help us with "object life-cycle management". There are many things that are created for users, departments, and services that, all too often, are left unmanaged. Some examples:
      - user accounts
      - groups
      - SSL certificates
      - access rights
      - databases
      - software license provisionings
      - storage
      - list-serve accounts

    These objects are created and managed by a wide variety of applications and systems. Typically, a user (person) requests (either explicitly or implicitly) one of these objects. A centralized management tool would help us manage such administration chores as:
      - What objects does user X currently own/manage?
      - Move the ownership of object P to user X; move all objects owned by user X (who has just been fired) to user Y.
      - For all objects of type T that have expired, be sure the objects have been disabled or deleted by their provider.
      - How many active (expired, about-to-expire) objects of type P are there?
      - Send periodic notifications to all users who own active objects of type P reminding them of what they own.
      - There is a security alert for objects of type P; send a notification to all users who own these types of objects to take a specific remedial action.
      - Delete or disable a set of objects based on expiration (or some other criteria).

    These objects are directly managed through their own applications (Active Directory, MySQL, file systems, etc.) and may even have their own notification systems, but I want to centralize this into an "object management system". The OMS should allow:
      - association with an external identity provider that defines who the users and groups are (e.g., LDAP, Active Directory)
      - creation of objects
      - association of an object to a specific user and/or group
      - association with an expiration date
      - creation of flexible reporting, including letting users know what objects they currently own and their expiration dates
      - integration with an external object "provider" via a plug-in

    We could write something from scratch, but I am hoping there is something already out there that will help, either an entire application or a set of libraries that provide much of what is needed. Any ideas?
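
    A minimal data model for the kind of "object management system" described above might look like the sketch below: each managed object records its type, owner, provider, and expiration date, and the chores in the wish list (what does user X own, what expires soon, reassign on departure) fall out of a few simple queries. All field and object names are invented for illustration.

      # Hypothetical minimal data model for an object life-cycle manager.
      from dataclasses import dataclass
      from datetime import date, timedelta

      @dataclass
      class ManagedObject:
          name: str          # e.g. an account name or certificate subject
          obj_type: str      # "ssl-cert", "database", "user-account", ...
          owner: str         # identity from the external provider (e.g. LDAP uid)
          provider: str      # system that actually hosts the object
          expires: date

      def owned_by(objects, owner):
          """What objects does user X currently own/manage?"""
          return [o for o in objects if o.owner == owner]

      def expiring_within(objects, days):
          """Objects that soon need renewal, disabling, or deletion."""
          cutoff = date.today() + timedelta(days=days)
          return [o for o in objects if o.expires <= cutoff]

      def reassign(objects, from_owner, to_owner):
          """Move all objects owned by X (who has left) to Y."""
          for o in owned_by(objects, from_owner):
              o.owner = to_owner

      if __name__ == "__main__":
          inventory = [
              ManagedObject("www.example.com", "ssl-cert", "alice", "internal-ca",
                            date.today() + timedelta(days=20)),
              ManagedObject("reporting-db", "database", "bob", "mysql-prod",
                            date.today() + timedelta(days=400)),
          ]
          print([o.name for o in expiring_within(inventory, 30)])  # ['www.example.com']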

    Read the article

  • Oracle OpenWorld 2012 Hands-on Lab: “Leading Your Everyday Application Integration Projects with Enterprise SOA”

    - by Lionel Dubreuil
    Sharpen your Oracle skill sets and master Oracle technology in Oracle OpenWorld Hands-on Labs. In self-paced, practical learning sessions covering everything from business applications to middleware, database, storage, and enterprise management solutions, you'll discover new ways to derive maximum benefit from your Oracle hardware and software solutions. Oracle experts will be available in person to answer questions and guide you through each lab. Hands-on Labs fill up early, and seats are limited, so don't be late.

    HOL10093 - Leading Your Everyday Application Integration Projects with Enterprise SOA is scheduled for:
      Date: Monday, Oct 1
      Time: 10:45 AM - 11:45 AM
      Location: Marriott Marquis - Salon 5/6

    In this Hands-on Lab, experience firsthand how Oracle Enterprise Repository, Oracle Application Integration Architecture (AIA) Foundation Pack, and Oracle SOA Suite work together to help you drive your enterprisewide integration projects. From asset management, discovery, and management in Oracle Enterprise Repository to integration of content in Oracle AIA Foundation Pack operating on the Oracle SOA Suite platform, discover how you can develop integrations to support business agility. Take advantage of Oracle-delivered integration assets and validate your services for compliance, within Oracle JDeveloper. You will get your hands on the tools and talk with Oracle experts in this hands-on lab.

    Objectives for this session are to:
      - Use Oracle Enterprise Repository to manage application interfaces, composite applications, and business processes
      - See how Oracle Enterprise Repository can benefit every service-based application integration project
      - Learn how to govern services through the software lifecycle and validate your services for compliance

    Read the article
