Search Results

Search found 4248 results on 170 pages for 'fidelity investments life insurance'.


  • Certify September Updates

    - by Sadia2
    We have added some release and platform certifications to MOS Certify.
    Applications: Oracle Demantra 12.2.2, 7.3.1.5, 7.3.1.4, 7.3.0.2.0, 7.3.0.0.0
    Collaboration Technologies: Oracle Beehive 2.0.1.8.0
    Database: Oracle Database Client 12.1.0.1.0, Oracle Clusterware 11.2.0.4.0, Oracle Database 11.2.0.4.0, Oracle Real Application Clusters 11.2.0.4.0
    E-Business Suite: Oracle E-Business Suite 12.2.2, 12.1.3, 12.1.2, 12.1.1, 12.0.6, 11.5.10.2
    Edge Applications: Oracle AutoVue 20.2.2, 20.2.1, 20.2.0
    Enterprise Manager: Enterprise Manager Base Platform - OMS 12.1.0.3.0, Oracle Real User Experience Insight 12.1.0.4.0, 12.1.0.3.0, 12.1.0.1, 11.1
    FSGBU Insurance Group: Oracle Health Insurance Claims 2.13.3.0.0
    Fusion Middleware: Oracle Business Intelligence Applications 11.1.1.7.1, 7.9.6.4.0, Oracle Discoverer 11.1.1.6.0, Discoverer Administrator 11.1.1.6.0, Discoverer Desktop 11.1.1.6.0, Oracle JDK 1.7.0_40, 1.7.0_25, Oracle JRE 1.7.0_40, 1.7.0_25, Oracle JRockit 6u45 R28.2.7+, Oracle WebCenter Sites 11.1.1.8.0, Oracle WebCenter Sites: Community-Gadgets 11.1.1.8.0, Oracle WebCenter Sites: CIP for File Systems and MS SharePoint 11.1.1.8.0, Oracle WebCenter Sites: CIP for EMC Documentum 11.1.1.8.0
    JD Edwards EnterpriseOne: JD Edwards EnterpriseOne Business Services Server 9.1.3.0, 9.1.2.0, 9.1.0.0, JD Edwards EnterpriseOne Mobile Applications 9.1.2.0
    Oracle Fusion Applications: Oracle Fusion Applications 11.1.7.0.0
    Primavera GBU: Primavera Unifier 9.13.0.0
    Siebel Enterprise: Siebel Application Server 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.10.0, 8.1.1.9.0, Siebel Database Server 8.2.2.3.0, 8.1.1.10.0, 8.1.1.9.0, Siebel Remote Client 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0, Siebel Tools Client 8.2.2.4.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.9.0, Siebel SSO Integration 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0

    Read the article

  • ARTS Reference Model for Retail

    - by Sanjeev Sharma
    Consider a hypothetical scenario where you have been tasked with setting up retail operations for an electronic goods, daily consumables or luxury brand retailer. It is very likely you will be faced with the following questions:
    What are the essential business capabilities that you must have in place?
    What are the essential business activities under-pinning each of the business capabilities identified in Step 1?
    What are the sets of steps that you need to perform to execute each of the business activities identified in Step 2?
    Answers to the above will drive your investments in software and hardware to enable the core retail operations. More importantly, the choices you make in responding to the above questions will have several implications in the short run and in the long run. In the short term, you will incur the time and cost of defining your technology requirements, procuring the software/hardware components and getting them up and running. In the long term, as you grow in operations organically or through M&A, partnerships and franchise business models, you will invariably need to make more technology investments to manage the greater complexity (scale and scope) of business operations.
    "As new software applications, such as time & attendance, labor scheduling, and POS transactions, just to mention a few, are introduced into the store environment, it takes a disproportionate amount of time and effort to integrate them with existing store applications. These integration projects can add up to 50 percent to the time needed to implement a new software application and contribute significantly to the cost of the overall project, particularly if a systems integrator is called in. This has been the reality that all retailers have had to live with over the last two decades. The effect of the environment has not only been to increase costs, but also to limit retailers' ability to implement change and the speed with which they can do so." (excerpt taken from here)
    Now, one would think a lot of retailers would have already gone through the pain of finding answers to these questions, so why re-invent the wheel? Precisely so: a major effort began almost 17 years ago in the retail industry to make it less expensive and less difficult to deploy new technology in stores and at the retail enterprise level. This effort is called the Association for Retail Technology Standards (ARTS). Without standards such as those defined by ARTS, you would very likely end up experiencing the following:
    Increased time and cost, due to resource wastage arising from re-inventing the wheel (i.e. re-creating vanilla processes from scratch) and incurring otherwise avoidable mistakes and errors by ignoring the experience of others.
    Sub-optimal process efficiency, due to a narrow, isolated view of processes that ignores process inter-dependencies (i.e. optimizing the parts but not the whole), resulting in a lack of transparency and inter-departmental finger-pointing.
    Embracing ARTS standards as a blueprint for establishing, managing or streamlining your retail operations can benefit you in the following ways:
    Improved time-to-market, from parity with industry best-practice processes such as ARTS, thus avoiding “reinventing the wheel” for common retail processes, focusing more on customizing processes for differentiation, and lowering integration complexity and risk with a standardized vocabulary for exchange between internal and external (i.e. partner) systems.
    Lower operating costs, by embracing the ARTS enterprise-wide process reference model for developing and streamlining retail operations holistically instead of through a narrow, silo-ed view, and by procuring IT systems in compliance with ARTS, thus avoiding IT budget marginalization.
    While parity with an industry standard such as the ARTS business process model does not by itself create differentiation, it does provide a higher starting point for bridging the strategy-execution gap in setting up and improving retail operations.

    Read the article

  • Oracle Executive Strategy Brief: Enterprise-Grade Cloud Applications

    - by B Shashikumar
    Cloud Computing has clearly evolved into one of the dominant secular trends in the industry. Organizations are looking to the cloud to change how they buy and consume IT. And it’s no longer just about lower up-front costs. The cloud promises to deliver greater agility and free up resources to focus on innovation versus running and maintaining systems. But are organizations actually realizing these benefits? The full promise of cloud is not being realized by customers who entrust their business to multiple niche cloud providers. While almost 9 out of 10 companies expect more IT agility with cloud, only 47% are actually getting it (Source: 2011 State of Cloud Survey by Symantec). These niche cloud customers have also seen the promises of lower costs, efficiency gains, improved security, and compliance go unfulfilled. Having one cloud provider for customer relationship management (CRM) and another for human capital management (HCM), and then trying to glue these proprietary systems together while integrating with a back-office financial system can add to complexity and long-term costs. Completing a business process or generating an integrated report is cumbersome, and leverages incomplete data. Why can’t niche cloud providers deliver on the full promise of cloud? It’s simple: you still need to complete business processes. You still need reporting that enables you to take action using data from multiple systems. You still have to comply with SOX and other industry regulations. These requirements don’t go away just because you deploy in the cloud. Delivering lower up-front costs by enabling customers to buy software as a service (SaaS) is the easy part. To get real value that lasts longer than your quarterly report, it’s important to realize the benefits of cloud without compromising on functionality and while having the right level of control and flexibility. This is the true promise of cloud. Oracle’s cloud strategy centers around delivering the benefits of cloud—without compromise. We uniquely empower our customers with complete solutions and choice, from the richest functionality to integrated reporting and a great user experience. It’s all available in the cloud. And it works not just with other Oracle cloud applications, but with your existing Oracle and third-party systems as well. This helps protect your current investments and extend their value as you journey to the cloud. We’ve made the necessary investments not only in our applications but also in the underlying technology that makes it all run—from the platform down to the hardware and operating system. We make it all. And we’ve engineered it to work together and be highly optimized for our customers, in the cloud. With Oracle enterprise-grade cloud applications, you get the benefits of cloud plus more power, more choice, and more confidence. Read more about how you can realize the true advantage of Cloud with Oracle enterprise-grade cloud applications in the Oracle Executive Strategy Brief here. You can also attend an Oracle Cloud Conference event at a city near you. Register here.

    Read the article

  • I’m sorry RPGs, it’s not you, it’s me: The birth of my game idea

    - by George Clingerman
    One of the things I’ve had to give up in order to have some development time at night is gaming. It’s something I refused to admit for years but I’ve just had to face the facts. I’m no longer a gamer. I just don’t have hours and hours of free time to pour into gaming, and when I do have hours and hours of free time I want to pour them into game development. That doesn’t mean I don’t game at all! I play games pretty much every day. It just means I’ve moved more into the casual game realm. It’s all I have time for when juggling priorities in my life. That means that games like Gears of War 2 sit shrink-wrapped on my shelf, and although I popped Dragon Age into my Xbox 360 one time, I barely made it through the opening sequence and haven’t had time to sit down and play again. Instead I’m playing short games like Jamestown, Atom Zombie Smasher, Fortix, or if I have time to jump in and play a few rounds, maybe some Monday Night Combat or Team Fortress 2. These are games I can instantly get into and play for just a short period of time and then walk away.
    Breath of Death VII saved my life: Back in the day (way, way back in the day) I used to be a pretty big RPG fan. Not big by a lot of RPG gamers' standards (most of the RPGs RPG fans talk about I’ve never heard of), but I used to LOVE to play them on the NES, SNES and Genesis and considered that my genre. Final Fantasy, Shining in the Darkness, Bard’s Tale, Faxanadu, Shadowrun, Ultima, Dragon Warrior, Chrono Trigger, Phantasy Star, Shining Force, and well, the list could go on but those are the ones I remember off the top of my head. I loved playing RPGs and they were my games of choice. After my first son was born (this was just about 12 years ago), I tried to continue playing RPGs and purchased games like Baldur’s Gate I & II, Neverwinter Nights, Fable, then a few of the Final Fantasy’s, then Kingdom Hearts. I kept buying these games and then only playing for about fifteen minutes and never getting back to them. I still loved RPGs but they just no longer fit into my life (I still haven’t accepted that, since I still purchased Dragon Age II for some reason and convinced myself I’d find the time). Adding three more sons to the mix (that’s 4 total) didn’t help much in finding more RPG time (except for Breath of Death VII and other XBLIG RPG titles, thanks guys!)
    All work and no RPG: A few months ago I was sitting around thinking about the lack of RPGs in my life, talking to my wife about why I wish RPGs were different and easier for a dad like me to get into. She seemed like she was listening, so I started listing all the things that made them impossible for me to play. Here’s a short list I came up with:
    They take 15 billion hours to complete.
    I have a few minutes at a time I can grab to play them if I want to have time to code. At that rate it would take me 9 trillion years to beat just one RPG.
    There are such long spans of time between when I can play them that I forget what I was even doing, so I have to spend most of the playtime I have just figuring that out, and then my play time is over. Repeat.
    I’ll never finish one, and since it takes so long to get to the fun part in an RPG, I’m never having fun.
    RPGs aren’t fun if you don’t have hours to play them at a time.
    As you can see, based on my science and math, RPGs aren’t fun for me any more. From there my brain started toying around with ideas of RPGs that would work for me. They would have to be a short RPG, you know, one you could beat in a single play session. A dad-sized play session. I started thinking, wouldn’t it be awesome if there was a fifteen-minute RPG? That got me laughing, and I took that as a good sign that it sounded fun, and so I thought about it a little more. I immediately discarded the idea of doing a real RPG. I’m sure a short RPG like that could be done but it wasn’t the vibe that I had in my head. No, this was going to be something that just had the core essence of an RPG. In reality what I’d be making would be more of an arcade-style game, one with high scores and lots of crazy action on the screen. And that’s when it hit me. It would be a speed run RPG. That’s the basics of the game I’m working on.
    The Elevator Pitch: It’s a 2D top-down, RPG-themed arcade game focused on speed. It sounds like an RPG, smells like an RPG, but it’s merely emulating an RPG. The game is focused on fun and mayhem in RPG form, with players leveling up in seconds instead of hours and rushing to finish quests as quickly as possible because they’ve only got fifteen minutes before EVIL overtakes the world. If the player takes longer than fifteen minutes, it’s game over, man. One to four player co-operative play to really see just how fast players can level up and beat the game. Gamers will compete on leaderboards for bragging rights for fastest 1, 2, 3, and 4 player speed runs, lowest leveled characters to beat the game, highest leveled characters to beat the game, and so on. Times will be tracked for everything from how long a player sat distributing stats, equipping items and talking to NPCs, to running around the level. These stats will be shown at the end of each quest/level so the players can work on improving their speed run for that part of the game next time around. It’s the perfect RPG for those of us who only have fifteen minutes of game time!
    Where I’m at: I’m still at the prototyping stage, attempting to put all the basic framework pieces in place that will at minimum give me one level to rush through. I’ve been working on this prototype for about a month now though, so I’m going to have to step it up a bit or I’m not going to get finished in time (remember I’ve only got 85 days left!) Lots of the game code is in place (although pretty sloppy), but I still can’t play through that first quest/level just yet. My goal is to have that finished by the end of next Sunday (3/25/2012). You can all hold me to that and cheer me on or heckle me throughout the week. Either way that should help me stay a bit more motivated and focused. In my head this feels like it’s going to be a fun game, so I’m looking forward to seeing how it actually plays!

    Read the article

  • Disaster, or Migration?

    - by Rob Farley
    This post is in two parts – technical and personal. And I should point out that it’s prompted in part by this month’s T-SQL Tuesday, hosted by Allen Kinsel.
    First, the technical: I’ve had a few conversations with people recently about migration – moving a SQL Server database from one box to another (sometimes, but not primarily, involving an upgrade). One question that tends to come up is that of downtime. Obviously there will be some period of time between the old server going offline and the new one becoming available. The way that most people seem to think of migration is this:
    Build a new server.
    Stop people from using the old server.
    Take a backup of the old server.
    Restore it on the new server.
    Reconfigure the client applications (or alternatively, configure the new server to use the same address as the old).
    Bring the new server online.
    There are other things involved, such as testing, of course. But this is essentially the process that people tell me they’re planning to follow. The bit that I want to look at today (as you’ve probably guessed from my title) is the “backup and restore” section. If a SQL database is using the Simple Recovery Model, then the only restore option is the last database backup. This backup could be full or differential. The transaction log never gets backed up in the Simple Recovery Model. Instead, it truncates regularly to stay small. One that’s using the Full Recovery Model (or Bulk-Logged) won’t truncate its log – the log must be backed up regularly. This provides the benefit of having a lot more options available for restores. It’s a requirement for most High Availability systems, because if you’re making sure that a spare box is up-and-running, ready to take over, then you have to be interested in the logs that are happening on the current box, rather than truncating them all the time. A High Availability system such as Mirroring, Replication or Log Shipping will initialise the spare machine by restoring a full database backup (and maybe a differential backup if available), and then any subsequent log backups. Once the secondary copy is close to current, transactions can be applied to keep the two in sync. The main aspect of any High Availability system is to have a redundant system that is ready to take over. So the similarity for migration should be obvious. If you need to move a database from one box to another, then introducing a High Availability mechanism can help. By turning on the Full Recovery Model and then taking a backup (so that the now-interesting logs have some context), logs start being kept, and are therefore available for getting the new box ready (even if it’s an upgraded version). When the migration is ready to occur, a failover can be done, letting the new server take over the responsibility of the old, just as if a disaster had happened. Except that this is a planned failover, not a disaster at all. There’s a fine line between a disaster and a migration. Failovers can be useful in patching, upgrading, maintenance, and more. Hopefully, even an unexpected disaster can be seen as just another failover, and there can be an opportunity there – perhaps to get some work done on the principal server to increase robustness. And if I’ve just set up a High Availability system for even the simplest of databases, it’s not necessarily a bad thing. :) (A short T-SQL sketch of this backup/restore sequence appears at the end of this post.)
    So now the personal: It’s been an interesting time recently... June has been somewhat odd. A court case with which I was involved got resolved (through mediation). I can’t go into details, but my lawyers tell me that I’m allowed to say how I feel about it. The answer is ‘lousy’. I don’t regret pursuing it as long as I did – but in the end I had to make a decision regarding the commerciality of letting it continue, and I’m going to look forward to the days when the kind of money I spent on my lawyers is small change. Mind you, if I had a similar situation with an employer, I’d do the same again, but that doesn’t really stop me feeling frustrated about it. The following day I had to fly to country Victoria to see my grandmother, who wasn’t expected to last the weekend. She’s still around a week later as I write this, but her 92-year-old body has basically given up on her. She’s been a Christian all her life, and is looking forward to eternity. We’ll all miss her though, and it’s hard to see my family grieving. Then on Tuesday, I was driving back to the airport with my family to come home, when something really bizarre happened. We were travelling down the freeway, just pulled out to go past a truck (farm-truck sized, not a semi-trailer), when a car-sized mass of metal fell off it. It was something like an industrial air-conditioner, but from where I was sitting, it was just a mass of spinning metal, like something out of a movie (one friend described it as “holidays by Michael Bay”). Somehow, and I really don’t know how, the part of it nearest us bounced high enough to clear the car, and there wasn’t even a scratch. We pulled over to check, and I was just thanking God that we’d changed lanes when we had, and that we remained unharmed. I had all kinds of thoughts about what could’ve happened if we’d had something that size land on the windscreen... All this has drilled home that while I feel that I haven’t provided as well for the family as I could’ve done (like by pursuing an expensive legal case), I shouldn’t even consider that I have proper control over things. I get to live life, and make decisions based on what I feel is right at the time. But I’m not going to get everything right, and there will be things that feel like disasters, some which could’ve been in my control and some which are very much beyond my control. The case feels like something I could’ve pursued differently, a disaster that could’ve been avoided in some way. Gran dying is lousy, of course. An accident on the freeway would have been awful. I need to recognise that the worst disasters are ones that I can’t affect, and that I need to look at things in context – perhaps seeing everything that happens as a migration instead. Life is never the same from one day to the next. Every event has a before and an after – sometimes it’s clearly positive, sometimes it’s not. I remember good events in my life (such as my wedding), and bad (such as the loss of my father when I was ten, or the back injury I had eight years ago). I’m not suggesting that I know how to view everything from the “God works all things for good” perspective, but I am trying to look at last week as a migration of sorts. Those things are behind me now, and the future is in God’s hands. Hopefully I’ve learned things, and will be able to live accordingly. I’ve come through this time now, and even though I’ll miss Gran, I’ll see her again one day, and the future is bright.
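
    (Returning to the technical half of the post, as promised above: a minimal T-SQL sketch of that backup/restore sequence. The database name MyDB and the backup paths are placeholders, and in practice the repeated log backup/restore steps would be scripted or handled by Log Shipping; the point is simply that the new server stays in the RESTORING state, via WITH NORECOVERY, until the final tail-log backup is applied at cutover.)

        -- On the old server: make sure log backups are meaningful
        ALTER DATABASE MyDB SET RECOVERY FULL;
        BACKUP DATABASE MyDB TO DISK = N'\\backupshare\MyDB_full.bak';

        -- On the new server: restore the full backup, but leave the database restoring
        RESTORE DATABASE MyDB FROM DISK = N'\\backupshare\MyDB_full.bak' WITH NORECOVERY;

        -- Repeat while users keep working on the old server
        BACKUP LOG MyDB TO DISK = N'\\backupshare\MyDB_log_001.trn';                     -- old server
        RESTORE LOG MyDB FROM DISK = N'\\backupshare\MyDB_log_001.trn' WITH NORECOVERY;  -- new server

        -- Cutover: stop new activity, take the tail of the log, apply it, bring the copy online
        BACKUP LOG MyDB TO DISK = N'\\backupshare\MyDB_tail.trn' WITH NORECOVERY;        -- old server
        RESTORE LOG MyDB FROM DISK = N'\\backupshare\MyDB_tail.trn' WITH RECOVERY;       -- new server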

    Read the article

  • How to Post Javascript to a MySQL Database

    - by salientanimal
    Hi guys, I have created a form in HTML and I am trying to post the information in the form to a MySQL database. My form makes use of a dynamic list selection that needs to be captured to the database. However, when submitting the form I get the following error: Error: Unknown column 'coulmn_name' in 'field list'.
    Here is my HTML code for the form:
        <td height="94"><p align="justify">CALL TRACKER - ADMIN</p></td> Customer Name : E-Mail Address : </tr>
        <tr> <td width="29%" align="right" valign="middle"><strong>Case Number :</strong></td> <td> <input type="text" name="case_number" width="70%" align="left" valign="middle"> </td> </tr>
        <tr> <td width="29%" align="right" valign="middle"><strong>MSISDN :</strong></td> <td> <input type="text" name="msisdn" width="70%" align="left" valign="middle"> </td> </tr>
        <tr> <td width="29%" align="right" valign="middle"> <strong>Route Cause :</strong></td> <td width="71%" align="left" valign="middle"> <select name="route_cause" id="category" onChange="javascript: listboxchange1(this.options[this.selectedIndex].value);"> <!-- <select name="route_cause" id="route_cause"> --> <option value="">Select the Call Reason</option> <option value="Billing Admin">Billing Admin</option> <option value="Customer Care">Customer Care</option> <option value="Insurance">Insurance</option> <option value="Repairs">Repairs</option> <option value="SIM Swap">SIM Swap</option> <option value="UTI">UTI</option> </select> </td> </tr>
        <tr> <td align="right" valign="middle"> <strong>Call Type/Indexed To :</strong></td> <td align="left" valign="middle"> <script type="text/javascript" language="javascript" name="calltype_indexedto"> <!-- document.write('<select name="subcategory1" onChange="javascript: listboxchange2(this.options[this.selectedIndex].value);"><option value=""></option></select>') --> </script> </td> </tr>
        <tr> <td align="right" valign="middle"> <strong>Type/TAT :</strong></td> <td align="left" valign="middle"> <script type="text/javascript" language="javascript" name="type_tat"> <!-- document.write('<select name="subcategory2" onChange="javascript: listboxchange3(this.options[this.selectedIndex].value);"><option value=""></option></select>') --> </script> </td> </tr>
        <tr> <td width="29%" align="right" valign="middle"> <strong>Escalated To :</strong></td> <td width="71%" align="left" valign="middle"> <select name="escalatedto" id="escalated_to"> <option value="">Select the Escalation</option> <option value="Billing Ops">Billing Ops</option> <option value="Resolvers">Resolvers</option> <option value="Finance">Finance</option> <option value="Ressolver">Ressolver</option> <option value="Nudebt">Nudebt</option> <option value="Transunion">Transunion</option> <option value="N/A">N/A</option> </select> </td> </tr>
        <tr> <td width="29%" align="right" valign="middle"> <strong>Requested By :</strong></td> <td width="71%" align="left" valign="middle"> <select name="requestedby" id="requested_by"> <option value="">UTI Requested By</option> <option value="Billing">Billing</option> <option value="Customer Service">Customer Service</option> <option value="Insurance">Insurance</option> <option value="Management">Management</option> <option value="Repairs">Repairs</option> <option value="Retail Support">Retail Support</option> <option value="Retentions">Retentions</option> <option value="SIM Swap">SIM Swap</option> <option value="WOW">WOW</option> <option value="N/A">N/A</option> </select> </td> </tr>
        <tr> <td width="29%" align="right" valign="middle"> <strong>Province :</strong></td> <td width="71%" align="left" valign="middle"> <select name="province" id="province"> <option value="">Select the Province</option> <option value="Eastern Cape">Eastern Cape</option> <option value="Gauteng">Gauteng</option> <option value="Kwa-Zulu Natal">Kwa-Zulu Natal</option> <option value="Limpopo">Limpopo</option> <option value="Mpumalanga">Mpumalanga</option> <option value="North West">North West</option> <option value="Northern Cape">Northern Cape</option> <option value="Polokwane">Polokwane</option> <option value="Western Cape">Western Cape</option> <option value="Other">Other</option> </select> </td> </tr>
        <tr> <td width="29%" align="right" valign="middle"><strong>Comments :</strong></td> <td> <textarea rows ="5" cols="30" name="comments"> </textarea> </td> </tr>
        <tr> <td> <p> <input type="reset" value="Reset Form"><input type="Submit" value="Submit">
    Here is my PHP code to write to the database:
        <?php
        $con = mysql_connect("hostname", "mysqusername", "mysqlpassword");
        if (!$con) {
            die('Could not connect: ' . mysql_error());
        }
        mysql_select_db("databasename", $con);
        $sql = "INSERT INTO customer_services_tracker
                (customer_name, customer_email_address, case_number, msisdn, route_cause,
                 calltype_indexedto, type_tat, escalatedto, requestedby, province, comments)
                VALUES
                ('$_POST[customer_name]', '$_POST[customer_email_address]', '$_POST[case_number]',
                 '$_POST[msisdn]', '$_POST[route_cause]', '$_POST[calltype_indexedto]',
                 '$_POST[type_tat]', '$_POST[escalatedto]', '$_POST[requestedby]',
                 '$_POST[province]', '$_POST[comments]')";
        $CatName = $rowCat["Name"];
        if (!mysql_query($sql, $con)) {
            die('Error: ' . mysql_error());
        }
        echo "1 record added";
        mysql_close($con)
        ?>

    Read the article

  • FOUR questions to ask if you are implementing DATABASE-AS-A-SERVICE

    - by Sudip Datta
    During my ongoing tenure at Oracle, I have met all types of DBAs. Happy DBAs, unhappy DBAs, proud DBAs, risk-loving DBAs, cautious DBAs. These days, as Database-as-a-Service (DBaaS) becomes more mainstream, I find some complacent DBAs who are basking in their achievement of having implemented DBaaS. Some others, however, are not that happy. They grudgingly complain that they did not have much of a say in the implementation; they simply had to follow what their cloud architects (mostly infrastructure admins) offered them. In most cases it would be a database wrapped inside a VM that would be labeled as “Database as a Service”. In other cases, it would be existing brute-force automation simply exposed in a portal. As much as I think that there is more to DBaaS than those approaches and often get tempted to propose Enterprise Manager 12c, I try to be objective. Neither do I want to dampen the spirit of the happy ones, nor do I want to stoke the pain of the unhappy ones. As I mentioned in my previous post, I don’t deny vanilla automation could be useful. I like virtualization too for what it has helped us accomplish in terms of resource management, but we need to scrutinize its merit on a case-by-case basis and apply it meaningfully. For DBAs who either claim to have implemented DBaaS or are planning to do so, I simply want to provide four key questions to ponder:
    1. Does it make life easier for your end users? Database-as-a-Service can have several types of end users. Junior DBAs, QA Engineers, Developers - each having their own skill set. The objective of DBaaS is to make their life simple, so that they can focus on their core responsibilities without having to worry about additional stuff. For example, if you are a Developer using Oracle Application Express (APEX), you want to deal with schema, objects and PL/SQL code and not with datafiles or listener configuration. If you are a QA Engineer needing database copies for functional testing, you do not want to deal with underlying operating system patching and compliance issues. The question to ask, therefore, is whether DBaaS makes life easier for those users. It is often convenient to give them VM shells to deal with a la Amazon EC2 IaaS, but is that what they really want? Is it a productive use of a developer's time if he needs to apply RPM errata to his Linux operating system? Asking him to keep the underlying operating system current is like making a guest responsible for a restaurant's decor.
    2. Does it make life easier for your administrators? Cloud, in general, is supposed to free administrators from attending to mundane tasks like provisioning services for every single end user request. It is supposed to enable a readily consumable platform and enforce standardization in the process. For example, if a Service Catalog exposes DBaaS of specific database versions and configurations, it, by its very nature, enforces certain discipline and standardization within the IT environment. What if, instead of specific database configurations, cloud allowed each end user to create databases of their liking, resulting in hundreds of versions and patch levels and thousands of individual databases? Therefore the right question to ask is whether the unwanted consequence of DBaaS is OS and database sprawl. And if so, who is responsible for tracking them, backing them up, administering them? Studies have shown that these administrative overheads increase exponentially with new targets, and that could result in a management nightmare. That leads us to our next question.
    3. Does it satisfy your Security Officers and Compliance Auditors? Compliance Auditors need to know who did what and when. They also want the cloud platform to be secure, so that end users have little freedom in tampering with it. Dealing with VM sprawl is not the easiest of challenges, let alone dealing with VMs as they keep getting reconfigured and moved around. This leads to the proverbial needle-in-the-haystack problem, and all it takes is one needle to cause a serious compliance issue in the enterprise. The bottom line is that flexibility and agility should not come at the expense of compliance, and it is very important to get the balance right. Can we have security and isolation without creating compliance challenges? Instead of a ‘one size fits all’ approach, i.e. OS-level isolation, can we think smartly about database isolation or schema-based isolation? This is where the appropriate resource modeling needs to be applied. The usual systems management vendors out there, with their heterogeneous common-denominator approach, have compromised on these semantics. If you follow Enterprise Manager’s DBaaS solution, you will see that we have considered different models, not precluding virtualization, for different customer use cases. The judgment to use virtual assemblies, versus databases on physical RAC, versus Schema-as-a-Service in a single database, should be governed by the needs of the applications and not by putting compliance considerations on the back burner.
    4. Does it satisfy your CIO? Finally, does it satisfy your higher-ups? As the sponsor of the cloud initiative, the CIO is expected to lead an IT transformation project, not merely run-of-the-mill IT operations. Simply virtualizing server resources and delivering them through self-service is a good start, but hardly transformational. CIOs may appreciate the instant benefit from server consolidation, but studies have revealed that the ROI from consolidation would flatten out at 20-25%. The question would be: what next? As we go higher up in the stack, the need to virtualize, segregate and optimize shifts to those layers that are more palpable to the business users. As Sushil Kumar noted in his blog post, "the most important thing to note here is the enterprise private cloud is not just an IT project, rather it is a business initiative to create an IT setup that is more aligned with the needs of today's dynamic and highly competitive business environment." Business users could not care less about infrastructure consolidation or virtualization - they care about business agility and service level assurance. Last but not least, a lot of CIOs get miffed if we ask them to throw away their existing hardware investments for implementing DBaaS. At Oracle, we always emphasize freedom of platform choice; hence Enterprise Manager’s DBaaS solution is platform-neutral. It can work on any operating system (that the agent is certified on), on Oracle hardware as well as third-party hardware. As a parting note, I urge you to remember these 4 questions. Remember that your satisfaction as an implementer lies in the satisfaction of others.

    Read the article

  • Regex Question ...

    - by kate
    Hi, could someone help me write a RegEx based on the following rules:
    1) 1 letter followed by 4 letters or numbers, then
    2) 5 letters or numbers, then
    3) 3 letters or numbers followed by a number and one of the following signs: ! & @ ?
    Customers must be allowed to input the fidelity card code either as a 15-character string or as 3 groups of 5 characters separated by one space.
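
    One possible pattern, sketched in Java (the question does not name a language, so the class name, the assumption that "letter" means A-Z in either case, and the use of a back-reference to force the two separators to be either both present or both absent are all assumptions of this sketch):

        import java.util.regex.Pattern;

        public class FidelityCardCheck {
            // Group 1: a letter then 4 letters/digits; group 2: 5 letters/digits;
            // group 3: 3 letters/digits, a digit, then one of ! & @ ?
            // "( ?)" captures an optional space and "\\1" makes the second separator
            // match the first, so the code is either one 15-character block or
            // three 5-character groups separated by single spaces, never a mix.
            private static final Pattern CARD = Pattern.compile(
                "^[A-Za-z][A-Za-z0-9]{4}( ?)[A-Za-z0-9]{5}\\1[A-Za-z0-9]{3}[0-9][!&@?]$");

            public static boolean isValid(String code) {
                return CARD.matcher(code).matches();
            }

            public static void main(String[] args) {
                System.out.println(isValid("A1B2C D3E4F G5H6!")); // true  - 3 groups of 5
                System.out.println(isValid("A1B2CD3E4FG5H6@"));   // true  - 15-character string
                System.out.println(isValid("11B2C D3E4F G5H6!")); // false - starts with a digit
            }
        }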

    Read the article

  • SQLAuthority News – Interview with SQL Server MVP Madhivanan – A Real Problem Solver

    - by pinaldave
    Madhivanan (SQL Server MVP) is a real community hero. He is known for his two skills – 1) Help Community and 2) Help Community. I have met him many times, and every time I feel that if anybody in the online world needs help, Madhivanan does his best to reach out to them and solve the problem. His name is not new if you are reading this blog or have ever asked a question in any online SQL forum. He is always there to help. When Madhivanan has time he even helps people on this blog as well. He spends his valuable time helping the community. He recently crossed over 1000 helpful comments on this blog. On that occasion, I interviewed him to find out if he has any life outside SQL.
    Q 1. Tell us something about yourself.
    I am Madhivanan, an MSc Computer Science graduate from Chennai, India, working as a Lead Analyst-Project at Ellaar Infotek Solutions Private Limited. I am basically a developer who started with Visual Basic 6.0, SQL Server 2000 and Crystal Reports 8. As the years went on, I started working more on writing queries in SQL Server in most of the projects developed in my company. I have a good level of knowledge of Oracle, MySQL and PostgreSQL as well. Now I am leading a project developed on Windows Azure.
    Q 2. What motivates you to help people in the community and forums?
    When I got some errors during application development in the early days of my career, I got good solutions from online forums and weblogs, so I decided to help others if possible. When I visit forums, I help people if I know the answer to their questions. I am one of the leading posters at www.sqlteam.com and also a moderator at www.sql-server-performance.com. I also take part in Visual Basic and Crystal Reports forums. I have been a SQL Server MVP since 2007.
    Q 3. Your personal life is not much known. Tell us something about your personal life.
    I am a happily married person. My wife is a B.Pharm graduate. I have a son who is now 18 months old.
    Q 4. Where can we read more about your community activity?
    I have a blog at http://beyondrelational.com/blogs/madhivanan where you can find most of my T-SQL material.
    Q 5. When not working with SQL, what do you do?
    When not working with SQL, I spend time playing with my son, reading magazines and watching TV.
    Madhivanan, for your work and help to the community, a true salute to you. Hats off, my friend.
    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: MVP, Readers Contribution, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Java 6 Certified with Forms and Reports 10g for EBS 12

    - by John Abraham
    Java 6 is now certified with Oracle Application Server 10g Forms and Reports with Oracle E-Business Suite Release 12 (12.0.6, 12.1.1 and higher).
    What? Wasn't this already certified? No, but a little background might be useful in understanding why this is a new announcement. We previously certified the use of Java 6 with E-Business Suite Release 12 -- with the sole exception of Oracle Application Server 10g components in the E-Business Suite technology stack. Oracle Application Server 10g originally included Java 1.4.2 as part of its distribution. E-Business Suite 12 uses, amongst other things, the Oracle Forms and Reports 10g components running on Java 1.4. Java 1.4 in the Oracle Application Server 10g ORACLE_HOME is used exclusively by AS 10g Forms and Reports for Java functionality. This version of Java is separate from the Java distribution used by other parts of EBS such as Oracle Containers for Java (OC4J).
    What's new about this certification? You can now upgrade the older Java 1.4 libraries used by Oracle Forms & Reports 10g to Java 6. This allows you to upgrade the Java releases within the Oracle Application Server 10g ORACLE_HOME to the same level as the rest of your E-Business Suite technology stack components.
    Why upgrade? This becomes particularly important for customers as individual vendors' support lifecycles for Java 1.4 reach End of Life:
    Oracle's Sun JDK Release 1.4.2's End of Extended Support: February 2013 (Sustaining Support indefinitely after)
    IBM SDK and JRE 1.4.2's End of Service: September 2013
    HP-UX Java 1.4.2's End-of-Life: May 2012
    Along with Oracle Forms, Java lies at the heart of the Oracle E-Business Suite. Small improvements in Java can have significant effects on the performance and stability of the E-Business Suite. As a notable side-benefit, later versions of Java have improved built-in and third-party tools for JVM performance monitoring and tuning. Our standing recommendation is that you always stay current with the latest available Java update provided by your operating system vendor.
    Don't forget to upgrade Forms & Reports to 10.1.2.3. E-Business Suite 12 originally shipped with Oracle Application Server 10g Forms & Reports 10.1.2.0.2. That version is no longer eligible for Error Correction Support. New Forms and Reports 10g patches are now being released with Forms and Reports 10.1.2.3 as the prerequisite. Forms and Reports 10.1.2.3 was certified for EBS 12 environments in November 2008. If you haven't upgraded your EBS 12 environment to Forms & Reports 10.1.2.3, this is a good opportunity to do so.
    References:
    Using Latest Update of Java 6.0 with Oracle E-Business Suite Release 12 (My Oracle Support Document 455492.1)
    Overview of Using Java with Oracle E-Business Suite Release 12 (My Oracle Support Document 418664.1)
    Oracle Lifetime Support Policy (Oracle Fusion Middleware)
    IBM Developer Kit Lifecycle Dates
    HP-UX Java - End of Life Policy & Release Naming Terminology
    Related Articles: OracleAS 10g Forms and Reports 10.1.2.3 Certified With EBS R12; Java 6 Certified with E-Business Suite Release 12

    Read the article

  • If statement question iphone?

    - by NextRev
    I am creating a game where you complete shapes and the area gets filled in. However, if there is an enemy bird within your shape, it will not fill in. I want to make it so that if you do trap a bird within your shape, you lose a life. How can I write an if statement that pretty much says: if the code below doesn't take place, then you lose a life? If it helps, losing a life is called doDie in my code.
        -(void)fillMutablePath{
            CGPoint movePoint = CGPointFromString([pointsToFillArray objectAtIndex:0]);
            CGPathMoveToPoint(fillPath, NULL, movePoint.x, movePoint.y);
            for (int i=0; i<[pointsToFillArray count]; i++) {
                CGPoint tempPoint = CGPointFromString([pointsToFillArray objectAtIndex:i]);
                CGPathAddLineToPoint(fillPath, NULL, tempPoint.x, tempPoint.y);
            }
            CGContextAddPath(gameViewObj._myContext, fillPath);
            CGContextFillPath(gameViewObj._myContext);
            CGPathRelease(fillPath);
            [pointsToFillArray removeAllObjects];
        }

        if(fillMutablePath doesn't take place when making a shape){
            [self doDie];
        }
    Like I said above, the reason fillMutablePath wouldn't take place is because a bird would be trapped within the shape. Any help would be much appreciated!!

    Read the article

  • creating a thread in jsp and using join has problem

    - by Suresh S
    Guys, the following code does not wait for the thread t to complete and join with the main thread; also, the "created object" trace is called one more time after join is called. Please let me know the solution.
        <%
        AProc empList = new AProc();
        System.out.println("** Created Object ****");
        System.out.println("Condition"+(!(p.length==0) && !(status) && !(threadLive)));
        if (!(p.length==0) && !(status) && !(threadLive)){
            java.util.Date currDate = new java.util.Date();
            try {
                Thread t = new Thread(empList);
                System.out.println("Check thread life 1 "+t.isAlive());
                empList.setServletRequest(request , schemaName);
                System.out.println("Thread Started "+t);
                t.start();
                System.out.println("Check thread life 2 "+t.isAlive());
                if(t.isAlive()) {
                    threadLive = true;
                }
                t.join();
                while (t.isAlive()) {
                    t.join();
                }
                System.out.println("Check thread life 3 "+t.isAlive());
                System.out.println("Thread End "+t);
                //}
            } catch (Exception e) {
                System.out.println("Exception in thread while running Procedure "+e);
                status=false;
                threadLive=false;
            }
            status = e.getStatusProc();
        }%>
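
    For comparison, here is a minimal standalone Java sketch (outside JSP) of the start/join pattern the scriptlet above is attempting. Thread.join() does block until the thread's run() method returns, so if the "Created Object" trace prints a second time, one likely explanation is that the page itself is being requested twice (for example by a second browser request) rather than join() returning early. The Worker class and the sleep duration below are invented purely for illustration.

        public class JoinDemo {
            static class Worker implements Runnable {
                @Override
                public void run() {
                    System.out.println("worker: started");
                    try {
                        Thread.sleep(2000); // simulate the long-running procedure
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                    System.out.println("worker: finished");
                }
            }

            public static void main(String[] args) throws InterruptedException {
                Thread t = new Thread(new Worker());
                t.start();   // begins running Worker.run() on another thread
                t.join();    // blocks here until run() has returned
                System.out.println("main: worker is alive? " + t.isAlive()); // always false at this point
            }
        }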

    Read the article

  • How to corrupt my hard disk ?

    - by amit
    Hi superuser, Please tell me how to corrupt my hard disk. I have Dell Inspiron 6400 & it's under complete cover insurance. Operating system is WindowS Vista Premium. So please tell me the method which actually work & damage/corrupt my hd.

    Read the article

  • Selection between two laptops for casual gaming [closed]

    - by Prabhpreet
    I have selected two laptops that meet my budget. Here are the differences: Laptop #1: 4GB Ram, Intel Core i5 2450M 2nd Gen processor w/ 2.5 Ghz clockspeed NVIDIA GeForce GT 520MX DDR3 1 GB Dedicated Graphics 750 GB SATA II Hard disk USB 2.0 Ports 6 hrs battery life Laptop #2: 4 GB Ram, Intel Core i5 3210M 3rd Gen processor w/ 2.5 Ghz clockspeed Integrated Intel HD Graphics 4000 500 GB SATA Hard disk USB 3.0 Ports 3 hrs battery life (This concerns me) First and foremost, does dedicated graphics matter for a casual gamer like me? Secondly, does the generation of the processors make a difference despite the same clockspeed. Thirdly, do usb 3.0 ports make a difference? And lastly, which laptop is more future proof? Please help me out. Thanks!

    Read the article

  • Adeos's role w.r.t Linux

    - by Anisha Kaul
    The event pipeline
    The fundamental Adeos structure one must keep in mind is the chain of client domains asking for event control. A domain is a kernel-based software component which can ask the Adeos layer to be notified of:
    · Every incoming external interrupt, or autogenerated virtual interrupt;
    · Every system call issued by Linux applications;
    · Other system events triggered by the kernel code (e.g. Linux task switching, signal notification, Linux task exits, etc.).
    From: Life with Adeos: http://www.xenomai.org/documentation/xenomai-2.4/pdf/Life-with-Adeos-rev-B.pdf
    Question: Adeos is supposed to sit between the hardware and the Linux kernel. I can understand Adeos telling Linux about hardware interrupts, but why should Adeos need to know about the system calls issued by Linux applications?

    Read the article

  • Dual boot OSX and Windows 7 natively

    - by Phill
    I'm considering getting one of those new fancy MacBook Pros with the fancy screens, but after reading some stuff on the internets about running Windows 7 with Boot Camp: https://discussions.apple.com/thread/2770866?start=0&tstart=0 it seems you can't use the integrated graphics with Windows, which causes Windows to chew through the battery: I am afraid it is not possible. Since Apple introduced dual graphics chip laptops, they kept the low power/embedded GPU hidden under Windows and they expose only the power hungry discrete GPU. It feels that this is being done on purpose so that it appears to users that OS X offers a better experience and battery life over Windows. So running Boot Camp and Windows kills the battery, and running in Parallels means you don't get accelerated 3D support (or something along those lines), so you don't get the performance out of it. I'm wondering: is it possible to natively dual boot Windows 7 on a MBP, and if so, would/does that give Windows access to the integrated graphics so that it doesn't wreck the battery life?

    Read the article

  • Drawbacks of installing linux on usb stick?

    - by Znarkus
    I am setting up a router/NAS/HTTP/whatever server based on an ION mini-ITX board. I've installed Ubuntu Server on an old 160 GB drive, but it generates a lot more heat and vibrates more than my other, newer drive (storage). It just doesn't fit the concept, and worse: it takes up a SATA port. As SSDs are crazy expensive, I'm thinking of buying an extra 4 GB USB stick and RAID 0-ing it. From my point of view, these are the pros/cons:
    Pros: Low power consumption; No vibrations; No heat; Smaller; Get to buy a new, larger USB stick (:D)
    Cons: Shorter lifetime; Slower; RAID 0; More work maintaining/installing?
    I think the pros outweigh the cons. Shorter lifetime and RAID 0 are countered by regular backups of the configs/settings. Slower is partially countered by RAID 0, and I don't know about the last one. What do you think? Experience? Another solution?

    Read the article

  • Folder Size Column on Explorer on Windows Vista/Seven

    - by Click Ok
    I'm a big fan of FolderSize, but unfortunately it works only on Windows XP. Even after reading this and this, I'm not convinced that I cannot have a column showing the folder size in Windows Explorer. Even with all its "problems", FolderSize worked like a charm on Windows XP. In a sysadmin's life, FolderSize is splendid. Before selecting a lot of folders to send to backup on DVDs, I can check the folder sizes directly in Windows Explorer and put together a set of folders adding up to 4.3 GB to burn to a DVD. In another situation, I can view, right from the root folder, the sizes of the biggest folders on the hard drive and work out a good strategy for backup/partitioning/transfer to another drive/etc. I could list many other situations where, in my sysadmin life, I need a tool like FolderSize... Is anyone actively developing a solution that shows folder sizes in Windows Explorer on Vista/Seven? And what problems might I face if I develop that Explorer "add-in" myself?
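
    This is not an Explorer column, but as a stopgap the same numbers are easy to get from the command line. A minimal sketch in Java (assuming Java 8 or later; the directory to scan is passed as the first argument, defaulting to the current directory), which prints the total size of each immediate subfolder:

        import java.io.IOException;
        import java.io.UncheckedIOException;
        import java.nio.file.*;
        import java.util.stream.Stream;

        public class FolderSizes {
            // Sum the sizes of all regular files under a directory (recursively).
            static long sizeOf(Path dir) throws IOException {
                try (Stream<Path> files = Files.walk(dir)) {
                    return files.filter(Files::isRegularFile)
                                .mapToLong(p -> p.toFile().length())
                                .sum();
                }
            }

            public static void main(String[] args) throws IOException {
                Path root = Paths.get(args.length > 0 ? args[0] : ".");
                try (Stream<Path> children = Files.list(root)) {
                    children.filter(Files::isDirectory).forEach(dir -> {
                        try {
                            System.out.printf("%,15d bytes  %s%n", sizeOf(dir), dir.getFileName());
                        } catch (IOException | UncheckedIOException e) {
                            System.err.println("skipped " + dir + ": " + e); // e.g. access denied
                        }
                    });
                }
            }
        }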

    Read the article

  • Finding a laptop to fit my spec's

    - by Mick
    I have a small list of characteristics that are required for a new laptop and have been looking for websites where I can specify my requirements and see which laptops fit the bill. I want to specify OS/RAM/HDD size/CD drive/screen resolution/battery life. I have found several sites so far where I can specify everything except battery life. Does such a site exist? I know that this is not a shopping site - but before any moderator rushes to close this question - please note that I am not asking "where do I buy this laptop", merely which laptops fit these specs. I don't even care what country the website is based in.

    Read the article

  • help setting up an IPSEC vpn from my linux box

    - by robthewolf
    I have an office with a router and a remote server (Linux - Ubuntu 10.10). Both locations need to connect to a data supplier through a VPN. The VPN is an IPSEC gateway. I was able to configure my Linksys rv42 router to create a VPN connection successfully, and now I need to do the same for the Linux server. I have been messing around with this for too long. First I tried OpenVPN, but that is SSL and not IPSEC. Then I tried Shrew. I think I have the settings correct but I haven't been able to create the connection. It may be that I have to use something else, like a direct IPSEC config. If someone knows of a way to turn the following settings into a working IPSEC VPN connection I would be very grateful. Here are the settings I was given that must be used to connect to my supplier:
        Local destination network: 192.168.4.0/24
        Local destination hosts: 192.168.4.100
        Remote destination network: 192.167.40.0/24
        Remote destination hosts: 192.168.40.27
        VPN peering point: xxx.xxx.xxx.xxx
    They then gave me the following details:
        IPSEC/ISAKMP Phase 1 parameters:
        Authentication method: pre shared secret
        Diffie Hellman group: group 2
        Encryption algorithm: 3DES
        Lifetime in seconds: 28800
        Phase 2 parameters:
        IPSEC security: ESP
        Encryption algorithms: 3DES
        Authentication algorithms: MD5
        Lifetime in seconds: 28800
        PFS: disabled
    Here are the settings from my attempt to use Shrew:
        n:version:2
        n:network-ike-port:500
        n:network-mtu-size:1380
        n:client-addr-auto:0
        n:network-frag-size:540
        n:network-dpd-enable:1
        n:network-notify-enable:1
        n:client-banner-enable:1
        n:client-dns-used:1
        b:auth-mutual-psk:YjJzN2QzdDhyN2EyZDNpNG42ZzQ=
        n:phase1-dhgroup:2
        n:phase1-keylen:0
        n:phase1-life-secs:28800
        n:phase1-life-kbytes:0
        n:vendor-chkpt-enable:0
        n:phase2-keylen:0
        n:phase2-pfsgroup:-1
        n:phase2-life-secs:28800
        n:phase2-life-kbytes:0
        n:policy-nailed:0
        n:policy-list-auto:1
        n:client-dns-auto:1
        n:network-natt-port:4500
        n:network-natt-rate:15
        s:client-dns-addr:0.0.0.0
        s:client-dns-suffix:
        s:network-host:xxx.xxx.xxx.xxx
        s:client-auto-mode:pull
        s:client-iface:virtual
        s:client-ip-addr:192.168.4.0
        s:client-ip-mask:255.255.255.0
        s:network-natt-mode:enable
        s:network-frag-mode:disable
        s:auth-method:mutual-psk
        s:ident-client-type:address
        s:ident-client-data:192.168.4.0
        s:ident-server-type:address
        s:ident-server-data:192.168.40.0
        s:phase1-exchange:aggressive
        s:phase1-cipher:3des
        s:phase1-hash:md5
        s:phase2-transform:3des
        s:phase2-hmac:md5
        s:ipcomp-transform:disabled
    Finally, here is the debug output from the Shrew log:
        10/12/22 17:22:18 ii : ipc client process thread begin ...
        10/12/22 17:22:18 < A : peer config add message
        10/12/22 17:22:18 DB : peer added ( obj count = 1 )
        10/12/22 17:22:18 ii : local address 217.xxx.xxx.xxx selected for peer
        10/12/22 17:22:18 DB : tunnel added ( obj count = 1 )
        10/12/22 17:22:18 < A : proposal config message
        10/12/22 17:22:18 < A : proposal config message
        10/12/22 17:22:18 < A : client config message
        10/12/22 17:22:18 < A : local id '192.168.4.0' message
        10/12/22 17:22:18 < A : remote id '192.168.40.0' message
        10/12/22 17:22:18 < A : preshared key message
        10/12/22 17:22:18 < A : peer tunnel enable message
        10/12/22 17:22:18 DB : new phase1 ( ISAKMP initiator )
        10/12/22 17:22:18 DB : exchange type is aggressive
        10/12/22 17:22:18 DB : 217.xxx.xxx.xxx:500 <- 206.xxx.xxx.xxx:500
        10/12/22 17:22:18 DB : c1a8b31ac860995d:0000000000000000
        10/12/22 17:22:18 DB : phase1 added ( obj count = 1 )
        10/12/22 17:22:18 : security association payload
        10/12/22 17:22:18 : - proposal #1 payload
        10/12/22 17:22:18 : -- transform #1 payload
        10/12/22 17:22:18 : key exchange payload
        10/12/22 17:22:18 : nonce payload
        10/12/22 17:22:18 : identification payload
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local supports nat-t ( draft v00 )
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local supports nat-t ( draft v01 )
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local supports nat-t ( draft v02 )
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local supports nat-t ( draft v03 )
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local supports nat-t ( rfc )
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local supports DPDv1
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local is SHREW SOFT compatible
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local is NETSCREEN compatible
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local is SIDEWINDER compatible
        10/12/22 17:22:18 : vendor id payload
        10/12/22 17:22:18 ii : local is CISCO UNITY compatible
        10/12/22 17:22:18 = : cookies c1a8b31ac860995d:0000000000000000
        10/12/22 17:22:18 = : message 00000000
        10/12/22 17:22:18 - : send IKE packet 217.xxx.xxx.xxx:500 - 206.xxx.xxx.xxx:500 ( 484 bytes )
        10/12/22 17:22:18 DB : phase1 resend event scheduled ( ref count = 2 )
        10/12/22 17:22:18 ii : opened tap device tap0
        10/12/22 17:22:28 - : resend 1 phase1 packet(s) 217.xxx.xxx.xxx:500 - 206.xxx.xxx.xxx:500
        10/12/22 17:22:38 - : resend 1 phase1 packet(s) 217.xxx.xxx.xxx:500 - 206.xxx.xxx.xxx:500
        10/12/22 17:22:48 - : resend 1 phase1 packet(s) 217.xxx.xxx.xxx:500 - 206.xxx.xxx.xxx:500
        10/12/22 17:22:58 ii : resend limit exceeded for phase1 exchange
        10/12/22 17:22:58 ii : phase1 removal before expire time
        10/12/22 17:22:58 DB : phase1 deleted ( obj count = 0 )
        10/12/22 17:22:58 ii : closed tap device tap0
        10/12/22 17:22:58 DB : tunnel stats event canceled ( ref count = 1 )
        10/12/22 17:22:58 DB : removing tunnel config references
        10/12/22 17:22:58 DB : removing tunnel phase2 references
        10/12/22 17:22:58 DB : removing tunnel phase1 references
        10/12/22 17:22:58 DB : tunnel deleted ( obj count = 0 )
        10/12/22 17:22:58 DB : removing all peer tunnel refrences
        10/12/22 17:22:58 DB : peer deleted ( obj count = 0 )
        10/12/22 17:22:58 ii : ipc client process thread exit ...
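
    Not from the original thread, but for reference: one alternative to Shrew on an Ubuntu server is a plain IPsec stack such as strongSwan, where the supplier's parameters would be expressed roughly as the ipsec.conf connection below. This is only a sketch under assumptions - the connection name, left=%defaultroute, the use of aggressive mode (taken from the Shrew profile rather than the supplier's sheet) and the exact proposal strings should all be checked against your distribution's documentation, and the pre-shared key belongs in /etc/ipsec.secrets.

        # /etc/ipsec.conf (strongSwan) - sketch only
        conn supplier
            keyexchange=ikev1
            aggressive=yes              # the Shrew profile uses aggressive mode; confirm what the gateway expects
            authby=secret               # pre-shared key, e.g. in /etc/ipsec.secrets:  %any xxx.xxx.xxx.xxx : PSK "your-key"
            ike=3des-md5-modp1024       # phase 1: 3DES, MD5, DH group 2 (supplier sheet omits the phase 1 hash)
            esp=3des-md5                # phase 2: ESP with 3DES/MD5, no PFS group (PFS disabled)
            ikelifetime=28800s
            keylife=28800s
            left=%defaultroute
            leftsubnet=192.168.4.0/24
            right=xxx.xxx.xxx.xxx       # the supplier's VPN peering point
            rightsubnet=192.168.40.0/24
            auto=start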

    Read the article

  • How does one remove an encryption type from a kerberos principal?

    - by 84104
    I would like to remove all of the DES keys from the principal below, but have no idea how to do so without someone inputting the password.
        kadmin: getprinc user
        Principal: [email protected]
        Expiration date: [never]
        Last password change: Thu May 26 08:52:51 PDT 2013
        Password expiration date: [none]
        Maximum ticket life: 0 days 12:00:00
        Maximum renewable life: 7 days 00:00:00
        Last modified: Tue Jul 16 15:17:18 PDT 2013 (administrator/[email protected])
        Last successful authentication: Wed Jul 24 14:40:53 PDT 2013
        Last failed authentication: [never]
        Failed password attempts: 0
        Number of keys: 8
        Key: vno 3, aes256-cts-hmac-sha1-96, no salt
        Key: vno 3, arcfour-hmac, no salt
        Key: vno 3, des3-cbc-sha1, no salt
        Key: vno 3, des-cbc-crc, no salt
        Key: vno 3, des-cbc-md5, no salt
        Key: vno 3, des-cbc-md5, Version 5 - No Realm
        Key: vno 3, des-cbc-md5, Version 5 - Realm Only
        Key: vno 3, des-cbc-md5, AFS version 3
        MKey: vno 2
        Attributes: REQUIRES_PRE_AUTH
        Policy: [none]
    Also, the KDC is using an OpenLDAP backend.

    Read the article

  • Is it safe to operate a laptop without battery?

    - by leladax
    I know it's 'unsafe' in terms of data loss, but I noticed motherboards still keep some of their circuits powered when they are plugged in (e.g. a circuit that must wait for power-on signals is certainly one of them). Hence, I wondered if it would extend the life of the laptop if the battery was simply removed. It might also extend the battery's own life, but that's the least of my concerns. Note that the main point is to unplug it when hibernating and have no power source whatsoever for the duration of being off (apart from the clock battery), i.e. saving me from having to remove the battery every time.

    Read the article
