Search Results

Search found 20864 results on 835 pages for 'account management'.


  • Mark Wilcox Discusses Privileged Account Management

    - by Naresh Persaud
    The new release of Oracle Identity Management 11g R2 includes the capability to manage privileged accounts. Privileged accounts, if compromised, create a risk of fraud in the enterprise, so controlling access to privileged accounts is critical. The Oracle Privileged Account Manager solution can be deployed standalone or in conjunction with the Oracle Governance Suite for a comprehensive solution. As part of the comprehensive platform, Privileged Account Manager is interoperable with the Identity suite at the data level. In addition, Privileged Account Manager can re-use Oracle Identity Manager connectors for propagating changes to target systems. I caught up with Mark Wilcox, Principal Product Manager of Oracle Privileged Account Manager, and discussed the capabilities of the offering in this podcast. Click here to listen.

    Read the article

  • Reference Data Management and Master Data: How Are They Related?

    - by Mala Narasimharajan
    Submitted by: Rahul Kamath

    Oracle Data Relationship Management (DRM) has always been extremely powerful as an Enterprise Master Data Management (MDM) solution that can help manage changes to master data in ways that influence enterprise structure: mastering the chart of accounts to enable financial transformation, revamping organization structures to drive business transformation and operational efficiencies, restructuring sales territories to enable equitable distribution of leads to sales teams following the acquisition of new products, or adding cost centers to enable fine-grained control over expenses. Increasingly, DRM is also being utilized by Oracle customers for reference data management, an emerging solution space that deserves some explanation.

    What is reference data, and how does it relate to master data? Reference data is a close cousin of master data. While master data is challenged with problems of unique identification, may change more rapidly, requires consensus building across stakeholders, and lends structure to business transactions, reference data is simpler and more slowly changing, but has semantic content that is used to categorize or group other information assets, including master data, and gives them contextual value. In fact, the creation of a new master data element may require new reference data to be created. For example, when a European company acquires a US business, chances are that it will need to adapt its product line taxonomy to include a new category describing the newly acquired US product line. Further, the cross-border transaction will also result in a revised geo hierarchy. The addition of new products represents a change to master data, while the changes to product categories and the geo hierarchy are examples of reference data changes.[1]

    The following illustrative list groups examples of reference data by type:

    - Types & Codes: transaction codes; lookup tables (e.g., gender, marital status); status codes; role codes; domain values
    - Taxonomies: industry classification categories and codes, e.g., the North American Industry Classification System (NAICS); product categories; sales territories (e.g., geo, industry verticals, named accounts, federal/state/local/defense); market segments; Universal Standard Products and Services Classification (UNSPSC), eCl@ss
    - Relationships / Mappings: product / segment and product / geo; city → state → postal codes; customer / market segment and business unit / channel; country codes / currency codes / financial accounts; International Classification of Diseases (ICD) mappings, e.g., ICD-9 → ICD-10
    - Standards: calendars (e.g., Gregorian, fiscal, manufacturing, retail, ISO 8601); currency codes (e.g., ISO); country codes (e.g., ISO 3166, UN); date/time and time zones (e.g., ISO 8601); tax rates

    Why manage reference data? Reference data carries contextual value and meaning, and therefore its use can drive business logic that helps execute a business process, create a desired application behavior, or provide meaningful segmentation to analyze transaction data. Further, mapping reference data often requires human judgment.

    Sample use cases of reference data management follow.

    Healthcare: diagnostic codes. The reference data challenges in the healthcare industry offer a case in point. Part of being HIPAA compliant requires medical practitioners to transition diagnosis codes from ICD-9 to ICD-10, a medical coding scheme used to classify diseases, signs and symptoms, causes, and so on. The transition to ICD-10 has a significant impact on business processes, procedures, contracts, and IT systems. Since the ICD-9 and ICD-10 code sets offer diagnosis codes at very different levels of granularity, human judgment is required to map ICD-9 codes to ICD-10. The process requires collaboration and consensus building among stakeholders, much in the same way as master data management does. Moreover, to build reports that show the utilization, frequency, and quality of diagnoses, medical practitioners may need to "cross-walk" mappings, either forward to ICD-10 or backward to ICD-9, depending upon the reporting time horizon. (A small illustration of such a cross-walk follows this excerpt.)

    Spend management: product, service and supplier codes. Similarly, as an enterprise looks to rationalize suppliers and leverage its spend, conforming supplier codes, as well as product and service codes, requires supporting multiple classification schemes that may include industry standards (e.g., UNSPSC, eCl@ss) or enterprise taxonomies. Aberdeen Group estimates that 90% of companies rely on spreadsheets and manual reviews to aggregate, classify, and analyze spend data, and that data management activities account for 12-15% of the sourcing cycle and consume 30-50% of a commodity manager's time. Creating a common map across the extended enterprise to rationalize codes across procurement, accounts payable, general ledger, credit card, procurement card (P-card), ACH, and bank systems can cut sourcing costs, improve compliance, lower inventory stock, and free up talent to focus on value-added tasks.

    Change management: point-of-sale transaction codes and product codes. In the specialty finance industry, enterprises are confronted with usury laws, governed at the state and local level, that regulate financial product innovation as it relates to consumer loans, check cashing, and pawn lending. To comply, it is important to demonstrate that transactions booked at the point of sale are posted against valid product codes that were on offer at the time the sale was booked. Since new products are released in a steady stream, it is important to ensure timely and accurate mapping of point-of-sale transaction codes to the appropriate product and GL codes to comply with the changing regulations.

    Multi-national companies: industry classification schemes. As companies grow and expand across geographies, a typical challenge they encounter with reference data is reconciling the various versions of industry classification schemes in use across nations. While the United States, Mexico, and Canada conform to the North American Industry Classification System (NAICS) standard, European Union countries choose different variants of the NACE industry classification scheme. Multi-national companies must manage the individual national NACE schemes and reconcile the differences across countries. Enterprises must invest in a reference data change management application to address the challenge of distributing reference data changes to downstream applications and assessing which applications were impacted by a given change.

    References: [1] Master Data versus Reference Data, Malcolm Chisholm, April 1, 2006.
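    The "cross-walk" mapping described above is easy to illustrate. A minimal Python sketch (the code values are illustrative only, not clinically authoritative): real ICD-9 to ICD-10 maps are often one-to-many, which is exactly why human review is needed.

        # Illustrative ICD-9 -> ICD-10 cross-walk. Real mappings are often
        # one-to-many, so a lookup yields candidates and a person (or a
        # documented rule) must choose among them.
        ICD9_TO_ICD10 = {
            "250.00": ["E11.9"],               # one candidate: automatic
            "250.50": ["E11.311", "E11.319"],  # several: needs review
        }

        def crosswalk(icd9_code):
            candidates = ICD9_TO_ICD10.get(icd9_code, [])
            if len(candidates) != 1:
                print(icd9_code, "needs manual review:", candidates)
            return candidates

        crosswalk("250.00")  # ['E11.9']
        crosswalk("250.50")  # flagged for review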

    Read the article

  • WolframAlpha Can Now Do In-depth Analysis of Your Facebook Account

    - by Jason Fitzpatrick
    If you’re a big fan of WolframAlpha’s ability to crunch the numbers on just about anything (and we certainly are), you’ll likely be just as delighted as we were to watch it massage the data from your Facebook account. Find out your most liked, discussed, and shared posts, see your Facebook habits, and spot other neat trends. I unleashed it on my account this morning, not sure what to expect from the results. Within the results tabulation, WolframAlpha provided me with all sorts of neat data breakdowns. I now know exactly how many days it is to my next birthday, the composition of my aggregate posting habits (how many posts are status updates, links, or photos), the time of day when I do the most posting (and what the composition of those posts is), and my average post length. I also know my most liked post and my most commented-on post. It will even crunch the numbers on your network of friends (60.6% of my friends are married, for example). By far one of the more interesting analyses it performs on the friendship data, however, is organizing all your friends into relationship clusters, so you can see who in your Facebook network is friends with other people in your network. The service from WolframAlpha is free: simply visit the WolframAlpha search portal and type in “Facebook report” to start the process. You’ll be prompted to create a WolframAlpha account if you don’t have one and to authorize the WolframAlpha Facebook app to access your data. Your Facebook data is cached to your WolframAlpha account for one hour in order to crunch the numbers and display the results.

    Read the article

  • How to decide on going into management?

    - by Rob Wells
    I read the transcript of a speech by Richard Hamming included as part of this SO question, and the speech had a quote that got me thinking about when someone should move into management: "When your vision of what you want to do is what you can do single-handedly, then you should pursue it. The day your vision, what you think needs to be done, is bigger than what you can do single-handedly, then you have to move toward management. And the bigger the vision is, the farther in management you have to go." Any other suggestions as to how you can decide if you want to move away from the coal face and into management?

    Read the article

  • cannot log in to any account except guest after mounting a new partition

    - by student
    I really need help with this, since it is blocking my schoolwork. Two days ago I was installing Oracle 11g on my Ubuntu machine. There were lots of problems along the way, but with a lot of searching I somehow got through the installation phases (it was my first time installing it on Ubuntu by myself). But when I set the password for Oracle and clicked Next, it said there was not enough space for the /home/oracle folder. To free up space, I checked GParted and then mounted another partition using the disk management tool. Since rebooting, I cannot log in to my own administrator account, or even to the oracle account I made for Oracle 11g. When I type the password it seems to be accepted, but it just redisplays the login page without actually logging me in. So I am logged in here as the guest account, and because it is a guest account I cannot even use the console to restore or fix anything. I have been struggling with this for three days. What do I need to do? And back when Oracle said it needed more space for its home folder, what should I have done instead?

    Read the article

  • How to Easily Put a Windows PC into Kiosk Mode With Assigned Access

    - by Chris Hoffman
    Windows 8.1's Assigned Access feature allows you to easily lock a Windows PC to a single application, such as a web browser. This feature makes it easy for anyone to configure Windows 8.1 devices as point-of-sale or other kiosk systems. In the past, setting up a Windows PC in kiosk mode involved much more work, requiring third-party software, group policy, or Linux distributions designed around kiosk mode. Assigned Access is available on Windows 8.1 RT, Windows 8.1 Professional, and Windows 8.1 Enterprise. The standard edition of Windows 8.1 doesn't support Assigned Access.

    Create a User Account for Assigned Access. Rather than turn your entire computer into a locked-down kiosk system, Assigned Access allows you to create a separate user account that can only launch a single app, such as a web browser. To set this up, you must be logged into Windows as a user with administrator permissions. First, open the PC settings app: swipe in from the right or press Windows Key + C to open the charms bar, tap Settings, and tap Change PC settings. In the PC settings app, select Accounts and then Other accounts. Use the Add an account button to create a new Windows account. Select the "Sign in without a Microsoft account" option and select Local account to create a local user account. You could also create a Microsoft account, but you may not want to if you just want a locked-down account with only browser access. Note that if you need to install apps from the Windows Store for use in Assigned Access mode, you'll have to set up a Microsoft account instead; a local account still gives you access to the preinstalled apps, such as Internet Explorer. You may want to create the account with a blank password, which makes it simple for anyone to reach kiosk mode even if the system becomes locked or needs to be rebooted. The account will be created as a standard user account with limited permissions. Leave it as a standard user account; don't make it an administrator account.

    Set Up Assigned Access. Once you've created the account, you'll first need to sign into it. If you don't, you'll see a "This account has no apps" message when trying to enable Assigned Access. Go back to the welcome screen, log in to the new account, and allow Windows to run through the first-time account setup. If you want to use a non-default app in kiosk mode, install it while logged in as that user account. Once you're done, log out, log back in as your administrator account, and go back to the Other accounts screen. Click the Set up an account for assigned access option to continue. Select the user account you created and select the app you want to limit the account to. For a web-based kiosk, this can be a web browser such as the Modern version of Internet Explorer. Businesses can also create their own Modern apps and set them to run in kiosk mode in this way. Note that Microsoft's documentation says "web browsers are not good choices for assigned access" because they require more permissions than average Modern (or "Windows Store") apps. However, if you want to provide a kiosk for web browsing, Assigned Access is a much better option than Guest Mode with a full Windows desktop. When you're done, restart your PC and log in as the Assigned Access account. Windows will automatically open the app you chose and won't allow the user to leave it. Standard Windows 8 features like the charms bar, app switcher, and Start screen won't appear, and pressing the Windows key once will do nothing. To sign out of Assigned Access mode, quickly press the Windows key five times while signed in. You'll be sent back to the standard login screen. The account will actually still be logged in and the app will remain running; this method just "locks" the screen and allows another user to log in.

    Automatically Log Into Assigned Access. Whenever your Windows device boots, you can log into the Assigned Access account to turn it into a kiosk system. While this isn't ideal for all kiosk systems, you may want the device to launch the specific app at boot without requiring any login process. To do so, you'll just need to have Windows automatically log into the Assigned Access account when it boots. This option is hidden and not available in the standard Control Panel; you'll need the hidden netplwiz Control Panel tool to set up automatic login on boot. If you didn't create a password for the user account, leave the Password field empty while configuring this.

    Security Considerations. If you're using this feature to turn a Windows 8.1 system into a kiosk that's left open to the public, remember to consider security. Anyone could walk up to the system, press the Windows key five times, and try to log into your standard administrator user account. Ensure the administrator account has a strong password so people won't be able to get past the kiosk's limitations and tamper with the system. Even Windows 8's detractors have to admit that it's an ideal system for a touch-screen kiosk device, running either a browser or another specific application. Assigned Access finally makes this easy to set up on Windows systems in the real world: no IT experience, third-party software, or Linux distributions necessary.
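    The netplwiz step above can also be scripted. A minimal sketch, assuming Python is available on the kiosk machine and the script runs as administrator; it writes the well-known Winlogon autologon registry values that netplwiz manages, and the account name "kiosk" is a placeholder. Note that DefaultPassword is stored in the registry in plain text, which is one more reason to use a blank-password, limited account.

        # Sketch: enable automatic logon for the kiosk account by setting
        # the Winlogon registry values (the same ones netplwiz configures).
        import winreg

        WINLOGON = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, WINLOGON, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "AutoAdminLogon", 0, winreg.REG_SZ, "1")
            winreg.SetValueEx(key, "DefaultUserName", 0, winreg.REG_SZ, "kiosk")
            winreg.SetValueEx(key, "DefaultPassword", 0, winreg.REG_SZ, "")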

    Read the article

  • Five Key Strategies in Master Data Management

    - by david.butler(at)oracle.com
    Here is a very interesting Profit Magazine article on MDM: "A recent customer survey reveals the deleterious effects of data fragmentation," by Trevor Naidoo, December 2010.

    Across industries and geographies, IT organizations have grown in complexity, whether due to mergers and acquisitions or to decentralized systems supporting functional or departmental requirements. With systems architected over time to support unique, one-off process needs, they are becoming costly to maintain, and the Internet has only further added to the complexity. Data fragmentation has become a key inhibitor in delivering flexible, user-friendly systems. The Oracle Insight team conducted a survey over the past two years assessing customers' master data management (MDM) capabilities to get a sense of where they are. The responses, from 27 respondents in six different industries, reveal five key areas in which customers need to improve their data management in order to get better financial results.

    1. Less than 15 percent of organizations surveyed understand the sources and quality of their master data and have a roadmap to address missing data domains. Examples of master data domains are customer, supplier, product, financial, and site. Many organizations have multiple sources of master data with varying degrees of data quality in each source; customer data stored in the customer relationship management system is inconsistent with customer data stored in the order management system. Imagine not knowing how many places you store your customer information, or whether a customer's address is the most up to date in each source. In fact, more than 55 percent of the respondents in the survey manage their data quality on an ad hoc basis. It is important for organizations to document their inventory of data sources and then profile these sources to ensure a consistent definition of key data entities throughout the organization. Some questions to ask are: How do we define a customer? What is a product? How do we define a site? The goal is to strive for one common repository for master data that acts as a cross-reference for all other sources and ensures consistent, high-quality master data throughout the organization.

    2. Only 18 percent of respondents have an enterprise data management strategy to ensure that data is treated as an asset to the organization. Most respondents handle data at the department or functional level and do not have an enterprise view of their master data. The sales department may track all its interactions with customers as they move through the sales cycle, the service department tracks its interactions with the same customers independently, and the finance department has yet another perspective on the same customer. The salesperson may not be aware that the customer she is trying to sell to is experiencing issues with existing products purchased, or that the customer is behind on previous invoices. The lack of a data strategy makes it difficult for business users to turn data into information via reports. Without the key building blocks in place, it is difficult to create the linkages between customer, product, site, supplier, and financial data that make it possible to understand patterns. A well-defined data management strategy is aligned to the business strategy and helps create the governance needed to ensure that data stewardship is in place and data integrity is intact.

    3. Almost 60 percent of respondents have no strategy to integrate data across operational applications. Many respondents have several disparate sources of data with no strategy to keep them in sync with each other. Even though there is no clear strategy to integrate the data (see #2 above), the data needs to be synced and cross-referenced to keep the business processes running. About 55 percent of respondents said they perform this integration on an ad hoc basis, and in many cases it is done manually with the help of Microsoft Excel spreadsheets. For example, a salesperson needs a report on global sales for a specific product, but the product has different product numbers in different countries. Typically, an analyst will pull all the data into Excel, manually create a cross-reference for that product, and then aggregate the sales. The exact same procedure has to be followed if the same report is needed the following month. A well-defined consolidation strategy will ensure that a central cross-reference is maintained, with updates in any one application being propagated to all the other systems, so that data is synchronized and up to date. This can be done in real time or in batch mode using integration technology.

    4. Approximately 50 percent of respondents expend manual effort cleansing and normalizing data. Information stored in various systems usually follows different standards and formats, making it difficult to match the data. A customer's address can be stored in different ways using a variety of abbreviations, for example, "av" or "ave" for avenue. Similarly, a product's attributes can be stored in a number of different ways; for example, a size attribute can be spelled out in inches or entered using the double-prime inch symbol (″). These types of variations make it difficult to match up data from different sources. Today, most customers rely on manual, heroic efforts to match, cleanse, and de-duplicate data, clearly not a scalable, sustainable model. To solve this challenge, organizations need the ability to standardize data for customers, products, sites, suppliers, and financial accounts; however, less than 10 percent of respondents have technology in place to automatically resolve duplicates. It is no wonder, therefore, that we get communications about products we don't own, at addresses where we don't reside, and through channels (like direct mail) we don't like. An all-too-common example: customers end up receiving duplicate communications, which not only impacts customer satisfaction but also incurs additional mailing costs. Cleansing, normalizing, and standardizing data will help address most of these issues. (A small sketch of this cleansing-and-matching step follows this excerpt.)

    5. Only 10 percent of respondents have the ability to share data that was mastered in a master data hub. Close to 60 percent of respondents have efforts in place to profile, standardize, and cleanse data manually, and the output of these efforts is stored in spreadsheets in various parts of the organization. This valuable information is not easily shared with the rest of the organization and, more importantly, the enriched information cannot be sent back to the source systems so that the data is fixed at the source. A key benefit of a master data management strategy is not only to clean the data, but also to share the data back to the source systems as well as to other systems that need the information. Aside from the source systems, another key beneficiary of this data is the business intelligence system. Having clean master data as input to business intelligence systems provides more accurate and enhanced reporting.

    Characteristics of stellar MDM. When deciding on the right master data management technology, organizations should look for solutions that have four main characteristics: enterprise-grade MDM performance; complete technology that can be rapidly deployed and addresses multiple business issues; end-to-end MDM process management with data quality monitoring and assurance; and pre-built, business-relevant MDM applications with data stores and workflows. These master data management capabilities will aid in moving closer to a best-practice maturity level, delivering tremendous efficiencies and savings as well as revenue growth opportunities as a result of better understanding your customers.

    Trevor Naidoo is a senior director in Industry Strategy and Insight at Oracle.
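    Point 4's cleansing-and-matching step is worth a concrete illustration. A minimal Python sketch, with hypothetical field names and a toy abbreviation list: normalize the variants, then group records that collapse to the same key as duplicate candidates.

        import re

        ABBREVIATIONS = {"av": "avenue", "ave": "avenue",
                         "st": "street", "rd": "road"}

        def normalize_address(raw):
            # Lowercase, tokenize, and expand known abbreviations.
            tokens = re.findall(r"[a-z0-9]+", raw.lower())
            return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

        records = [{"id": 1, "address": "12 Oak Av"},
                   {"id": 2, "address": "12 Oak Avenue"}]

        groups = {}
        for r in records:
            groups.setdefault(normalize_address(r["address"]), []).append(r["id"])

        duplicates = {k: v for k, v in groups.items() if len(v) > 1}
        print(duplicates)  # {'12 oak avenue': [1, 2]}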

    Read the article

  • How to Disable Ubuntu’s Guest Session Account

    - by Chris Hoffman
    Ubuntu and Linux Mint come with a “Guest Session” account, which anyone can log into from the login screen – no password required. If you’d rather restrict access to your computer, you can disable the guest account. This guest account is locked down, and changes to it don’t persist between sessions – everyone who logs in gets a fresh desktop. Still, you may want to disable it to prevent other people from using your computer.
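    The excerpt stops short of the steps themselves. On Ubuntu releases of this era that use the LightDM login manager (an assumption about your setup), the usual method is a one-line configuration change followed by restarting lightdm:

        # /etc/lightdm/lightdm.conf  (edit with sudo, then restart lightdm)
        [SeatDefaults]
        allow-guest=false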

    Read the article

  • Increase Security by Enabling Two-Factor Authentication on Your Google Account

    - by Jason Fitzpatrick
    You can easily increase the security of your Google account by enabling two-factor authentication; flip it on today for a free security boost. It’s not a new feature, but it’s one worth giving a second look. Watch the above video for a quick overview of Google’s two-factor authentication system. Essentially, your mobile phone becomes the second authentication factor: you use your password plus a code sent to your phone to log into your account. It’s a great way to easily increase the security of your Google account, it’s free, and you can set it so that you only have to validate your home computer once every 30 days. Google Two-Step Verification [via Google+]
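    The code-from-your-phone mechanism is the standard TOTP scheme (RFC 6238) used by authenticator apps; Google's texted codes behave similarly. Purely to illustrate the mechanism (this is not Google's code), a minimal sketch using the third-party pyotp library:

        import pyotp

        # A shared secret is provisioned once (typically shown as a QR code).
        secret = pyotp.random_base32()
        totp = pyotp.TOTP(secret)

        code = totp.now()         # 6-digit code, rotates every 30 seconds
        print(totp.verify(code))  # True while the code is still valid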

    Read the article

  • login to account reverts to login screen, guest login ok

    - by Emil Hansen
    When I get to the login screen and enter my password, I get a black screen for about 2 seconds and then I'm back at the login screen. When I log in to the guest account, everything works perfectly. What I've tried: I deleted .Xauthority as per instructions from elsewhere (no luck); I removed 2 lines I had added to .profile (no change); I installed GNOME and tried Unity, Unity 2D, and various versions of GNOME (same thing). According to other questions and answers, this can be caused by a lack of disk space, but I have 80 GB free, so I assume it's not that. If I press Ctrl+Alt+F2 at the login screen, I can log in to my account at the terminal with no problem. I would really appreciate any help restoring my account, and thank you very much in advance.

    Read the article

  • ISACA Webcast follow up: Managing High Risk Access and Compliance with a Platform Approach to Privileged Account Management

    - by Darin Pendergraft
    Last week we presented how Oracle Privileged Account Manager (OPAM) can be used to manage high-risk, privileged accounts. If you missed the webcast, here is a link to the replay: ISACA replay archive (NOTE: you will need to use Internet Explorer to view the archive). For those of you who did join us on the call, you will know that I only had a little bit of time for Q&A and was only able to answer a few of the questions that came in. So I wanted to devote this blog to answering the outstanding questions. Here they are.

    1. Can OPAM track admin or DBA activity details during a password check-out session?
    Oracle Audit Vault monitors these activities, which can be correlated to check-out events.

    2. How would OPAM handle simultaneous requests?
    OPAM can be configured to allow shared passwords. By default, sharing is turned off.

    3. How long are the passwords valid? Are the admins required to manually check them in?
    Password expiration can be configured and set in the password policy according to your corporate standards. You can specify whether you want forced check-in or not.

    4. Can 2-factor authentication be used with OPAM?
    Yes. 2-factor integration with OPAM is provided by integration with Oracle Access Manager and Oracle Adaptive Access Manager.

    5. How do you control access to OPAM to ensure that OPAM admins don't override the functionality to access privileged accounts?
    OPAM provides separation of duties by using admin roles to manage access to targets and privileged accounts and to control which operations admins can perform.

    6. How and where are the passwords stored in OPAM?
    OPAM uses the Oracle Platform Security Services (OPSS) Credential Store Framework (CSF) to securely store passwords. This is the same system used by Oracle Applications.

    7. Does OPAM support hierarchical/level-based privileges? Is the log maintained for independent review/audit?
    Yes. OPAM uses the Fusion Middleware (FMW) Audit Framework to store all OPAM-related events in a dedicated audit database.

    8. Does OPAM support emergency access in the case where approvers are not available until later?
    Yes. OPAM can be configured to release a password under a "break-glass" emergency scenario.

    9. Does OPAM work with AIX?
    Yes. Supported UNIX versions are listed in the "certified components" section of the UNIX connector guide at: http://docs.oracle.com/cd/E22999_01/doc.111/e17694/intro.htm#autoId0

    10. Does OPAM integrate with Sun Identity Manager?
    Yes. OPAM can be integrated with SIM using the REST APIs. OPAM has direct integration with Oracle Identity Manager 11gR2.

    11. Is OPAM available today and what does it cost?
    Yes, OPAM is available now. Ask your Oracle Account Manager for pricing.

    12. Can OPAM be used in SAP environments?
    Yes. Supported SAP versions are listed in the "certified components" section of the SAP connector guide here: http://docs.oracle.com/cd/E22999_01/doc.111/e25327/intro.htm#autoId0

    13. How would this product integrate, if at all, with access to a particular field in the DB that needs additional security, such as SSNs?
    OPAM can work with DB Vault and DB Firewall to provide fine-grained access control for databases.

    14. Is VM supported?
    As a deployment platform, Oracle VM is supported. For further details about supported virtualization technologies, see the Oracle Fusion Middleware supported system configurations here: http://www.oracle.com/technetwork/middleware/ias/downloads/fusion-certification-100350.html

    15. Where did this (OPAM) technology come from?
    OPAM was built by Oracle Engineering.

    16. Are all Linux flavors supported? How about BSD?
    BSD is not supported. For supported UNIX versions, see the "certified components" section of the UNIX connector guide: http://docs.oracle.com/cd/E22999_01/doc.111/e17694/intro.htm#autoId0

    17. What happens if users don't check passwords in at the end of a work task?
    In OPAM, a time frame can be defined for how long a password can be checked out. The security admin can force a check-in at any given time.

    18. Is MySQL supported?
    Yes. Supported DB versions are listed in the "certified components" section of the DB connector guide here: http://docs.oracle.com/cd/E22999_01/doc.111/e28315/intro.htm#BABGJJHA

    19. What happens when OPAM crashes and you need a password?
    OPAM can be configured for high availability, and if required, OPAM data can be backed up and recovered. See the OPAM admin guide.

    20. Is OPAM a standalone product or does it leverage other components from IDM?
    OPAM can run stand-alone, but it will also leverage other IDM components.

    Read the article

  • force users to activate account before using service?

    - by fxuser
    I am not sure if this is the correct SE site to post this on, but I'll go on... At the moment I force my users to activate their account upon registration before they can sign in. I have seen some decent sites let their users sign in and use their features even while the account is not activated, simply showing a message at the top of the page letting them know that the account is not yet activated and needs to be. So which practice is best? Should I stick with what I have or change it?
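    The second pattern described above (let users in, but nag and restrict until activation) is simple to model. A minimal Python sketch with hypothetical names:

        from dataclasses import dataclass

        @dataclass
        class User:
            email: str
            activated: bool = False

        def page_banner(user):
            # Shown at the top of every page until the e-mail link is clicked.
            if not user.activated:
                return "Your account is not yet activated. Check your e-mail."
            return None

        def can_post(user):
            # Allow browsing for everyone; hold back sensitive features.
            return user.activated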

    Read the article

  • 6 Ways To Secure Your Dropbox Account

    - by Chris Hoffman
    Dropbox is a hugely popular cloud storage service beloved by many. Unfortunately, it’s had a history of security problems, ranging from compromised accounts to once allowing access to every Dropbox account without requiring a password for several hours. If you’re using Dropbox, there are a variety of ways you can secure your account against unauthorized access and protect your files even if someone does gain access to your account.

    Read the article

  • Google analytics/adwords account and leaking of private data

    - by Satellite
    I am frequently asked to log into clients' Google Analytics and AdWords accounts. If I forget to log out before visiting other Google properties (Google search, YouTube, etc.), this leaves tracks of my views and searches, exposing my activities to the client. Summary:
    1. Client gives me access to their Google Analytics / AdWords account.
    2. I log into the client's Analytics account and do some work.
    3. In another tab, I perform some related Google searches to solve related issues.
    4. Issues solved, I close the Analytics tab.
    5. I then visit google.com and perform some unrelated searches.
    6. I then visit YouTube and view some unrelated videos.
    All web and YouTube searches are recorded in the client's Google account, thus leaking potentially sensitive data. Even assuming I remember to log out correctly at step 4 (as I do 95% of the time), anything I do at step 3 is exposed to the client. I would be surprised if this is not a very common issue. I'm looking for a technical solution to ensure that this can never happen. Any ideas?

    Read the article

  • Drop in service for account management, authentication, identity?

    - by Mike Repass
    I'm building an Android app and an associated set of web services for uploading/downloading data. I need a basic (no-frills) solution for account management (register, login, logout, verify credentials/token). What open source / third-party solutions exist for this scenario? I need:
    - an account database storing salted credentials
    - a simple web service to create a new account
    - a simple web service to authenticate supplied credentials and return some sort of token
    That's it; I can get by without 'fancy' email activation or password reset for the time being. Are there off-the-shelf components for this? Should I just use a 'blank' Django or Rails app to get this done? It seems crazy for everyone to be doing CREATE TABLE user_accounts ... Thoughts? Thank you.
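    Whatever stack is chosen, the core of those three bullets is small. A minimal sketch using only the Python standard library, with salted PBKDF2 hashing and an opaque session token; the web-service wiring is left to the chosen framework:

        import hashlib, os, secrets

        def hash_password(password, salt=None):
            # New accounts get a fresh random salt; verification reuses it.
            salt = salt or os.urandom(16)
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                         salt, 100_000)
            return salt, digest

        def verify_password(password, salt, digest):
            candidate = hash_password(password, salt)[1]
            return secrets.compare_digest(candidate, digest)

        def issue_token():
            # Opaque token; store it server-side mapped to the account.
            return secrets.token_urlsafe(32)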

    Read the article

  • Configuration management in support of scientific computing

    - by Sharpie
    For the past few years I have been involved with developing and maintaining a system for forecasting near-shore waves. Our team has just received a significant grant for further development, and as a result we are taking the opportunity to refactor many components of the old system. We will also be receiving a new server to run the model, so I am taking this opportunity to reconsider how we set up the system. Basically, the steps that need to happen are:
    - Some standard packages and libraries, such as compilers and databases, need to be downloaded and installed.
    - Some custom scientific models need to be downloaded and compiled from source, as they are not commonly provided as packages.
    - New users need to be created to manage the databases and run the models.
    - A suite of scripts that manage model-database interaction needs to be checked out from source code control and installed.
    - Crontabs need to be set up to run the scripts at regular intervals in order to generate forecasts.
    I have been pondering tools such as Puppet, Capistrano, or Fabric to automate the above steps (see the sketch after this question). It seems perfectly possible to implement most of the above functionality, except there are a couple of use cases I am wondering about:
    - During my preliminary research, I have found few examples and little discussion on how to use these systems to abstract and automate the process of building custom components from source.
    - We may have to deploy on machines that are isolated from the Internet, i.e., all configuration and setup files will have to come in on a USB key that can be inserted into a terminal that can connect to the server that will run the models.
    I see this as an opportunity to learn a new tool that will help me automate my workflow, but I am unsure which tool I should start with. If any member of the community could suggest a tool that would support the above workflow and the issues specific to scientific computing, I would be very grateful. Our production server will be running Linux, but support for OS X would be a bonus, as it would allow the development team to set up test installations outside of VirtualBox.
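    As a feel for what the automation could look like, here is a sketch of those five steps using Fabric, one of the tools named above. The host name, package list, repository paths (on-USB clones for the air-gapped case), and user name are hypothetical placeholders:

        from fabric import Connection

        def provision(host):
            c = Connection(host)
            # 1. Standard packages and libraries.
            c.sudo("apt-get install -y gfortran postgresql")
            # 2. Custom scientific models, compiled from source.
            c.run("git clone /media/usb/wave-model.git ~/wave-model")
            c.run("cd ~/wave-model && make")
            # 3. Users to manage the databases and run the models.
            c.sudo("useradd -m modelrunner")
            # 4. The model-database scripts from source control.
            c.run("git clone /media/usb/forecast-scripts.git ~/scripts")
            # 5. Crontab for regular forecast runs.
            c.put("crontab.txt", "/tmp/crontab.txt")
            c.run("crontab /tmp/crontab.txt")

        provision("forecast-server")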

    Read the article

  • Tokyo Tyrant ulog / update log management.

    - by Nathan Milford
    I'm testing Tokyo Tyrant in a master-master setup and have found that the ulog grows out of control and locks up the disk. At first I found the -ulim option useful for limiting the logfile size; however, it simply rolls over to a new log, leaving the old ones to clutter up the partition. I suppose I'll write a script that deletes ulogs older than X, once I find out how far back Tokyo Tyrant needs the update log in order to fail over. Does anyone have experience with this in Tokyo Tyrant? Do you have a feel (acknowledging that every install is different based on what is being stored) for the optimal ulog size versus how far back a Tokyo Tyrant instance needs to look in the ulog to assume master status? Thanks, nathan
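    The cleanup script mentioned above is only a few lines. A sketch, assuming the usual numbered *.ulog file naming and a placeholder directory; the safe retention window still has to be chosen based on how far back a recovering peer may need to replay:

        import os, time

        ULOG_DIR = "/data/tokyotyrant/ulog"   # placeholder path
        MAX_AGE_DAYS = 7                      # tune to your failover needs

        cutoff = time.time() - MAX_AGE_DAYS * 86400
        for name in os.listdir(ULOG_DIR):
            path = os.path.join(ULOG_DIR, name)
            if name.endswith(".ulog") and os.path.getmtime(path) < cutoff:
                os.remove(path)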

    Read the article

  • Windows user account just for accessing network shares on a Windows 7 machine

    - by Paulo
    I would like my Xbox (XBMC) to access my Windows 7 shares without having Guest accounts enabled and without using my Administrator account login details. I have tried creating an account called Xbox, and this works fine, but the Xbox account appears on the Windows login page. Is there a way to create an account that is purely for accessing shares, without it appearing as a user account on the login screen?
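    One approach that is known to work (offered as a suggestion, not the only answer) is to keep the account but hide it from the welcome screen via the SpecialAccounts\UserList registry key; the hidden account still authenticates for network shares. A Python sketch, run as administrator, using the poster's "Xbox" account name:

        import winreg

        KEY = (r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"
               r"\Winlogon\SpecialAccounts\UserList")

        # A DWORD value of 0, named after the account, hides it from logon UI.
        key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY)
        winreg.SetValueEx(key, "Xbox", 0, winreg.REG_DWORD, 0)
        winreg.CloseKey(key)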

    Read the article

  • SNMP based network discovery (switches), device (ports on switches) power management

    - by SaM
    In an enterprise network, what would be the right way to generate a list of switches (SNMP managed)? Is it reasonable to ask the organization to supply a list such as this:
    - Switch name
    - IP address of switch
    - Location
    - SNMP community strings
    Or are there standard ways to run discovery scans (UDP broadcasts)? After having generated a repository such as the above, given a single switch, how do I query it for the list of all devices attached to it? Finally, how do I selectively power ports down and up, remotely, using SNMP? The platform is going to be .NET based (C#) and the library being used is SharpSNMP. (A sketch of the SNMP operations involved follows.)
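    Since the poster's platform is C# with SharpSNMP, the following is purely an illustration of the SNMP operations involved (walk IF-MIB::ifDescr to list ports, then set ifAdminStatus to shut one down), sketched in Python with pysnmp; host, community string, and ifIndex are placeholders:

        from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                                  ContextData, ObjectType, ObjectIdentity,
                                  nextCmd, setCmd)
        from pysnmp.proto.rfc1902 import Integer

        HOST, COMMUNITY = "10.0.0.1", "private"

        # List interfaces: walk IF-MIB::ifDescr (1.3.6.1.2.1.2.2.1.2).
        for err, status, idx, binds in nextCmd(
                SnmpEngine(), CommunityData(COMMUNITY),
                UdpTransportTarget((HOST, 161)), ContextData(),
                ObjectType(ObjectIdentity("1.3.6.1.2.1.2.2.1.2")),
                lexicographicMode=False):
            if not err:
                for b in binds:
                    print(b.prettyPrint())

        # Administratively disable the port with ifIndex 3:
        # IF-MIB::ifAdminStatus (1.3.6.1.2.1.2.2.1.7), 1 = up, 2 = down.
        next(setCmd(SnmpEngine(), CommunityData(COMMUNITY),
                    UdpTransportTarget((HOST, 161)), ContextData(),
                    ObjectType(ObjectIdentity("1.3.6.1.2.1.2.2.1.7.3"),
                               Integer(2))))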

    Read the article

  • Network Management Cable Labeling Techniques and their alternatives [closed]

    - by Alex
    Possible Duplicate: What is the most effective solution you used to label cables? Yes, I know there are a lot of how-tos and already-answered questions about this topic, like this one: How do you organise the cables in your racks? Currently I am searching the web for different techniques (alternatives) for labeling the cables at server racks and/or data centers. Unfortunately, I do not have any experience with labeling or documenting network cables at a large scale. As far as I have been able to look up so far, the current labeling techniques are coloring and self-defined printed labels (numbering, text), possibly following a standard. I want to know whether QR codes, RFID (OK, RFID in a data center would be unwise due to the radio frequency, wouldn't it?), barcodes, or similar have already been used by some administrators, or why they did not consider such techniques at all. Too complicated (needing a QR scanner, etc.) if you are in front of the cables and want quick feedback on what a cable is? What alternatives are out there? Advantages/disadvantages? Best practice? I would appreciate any help on this topic, thank you! Regards, Alex

    Read the article

  • Digital Asset Management, iPhoto / Aperture server... alternative

    - by Sisyphus
    Afternoon. Clients: 10, all Apple machines running either Leopard or Snow Leopard. Server: Snow Leopard Server (and I have an old Dell PowerEdge 650 at home running Gentoo 2.6, if anybody has a Linux solution). The situation: I work in a small design company with 8 people; at present we are looking to consolidate all our image files into one location. Currently we each use our preferred single-user DAM solution, be it Adobe Bridge or iPhoto/Aperture (some don't bother at all). The file types commonly used are .psd, .pdf, .eps, .tiff, .jpg, and RAW image files. Ideally what is needed:
    - Centralised on one server, but allows us to search via Spotlight (not essential, but would be nice)
    - Includes searchable metadata such as date, location, title
    - Open source or as low-cost as possible
    - Allows simultaneous users to import files
    So far I have looked at a few open-source DAM systems, such as Razuna, Gallery (not strictly DAM), ResourceSpace, and Notre-DAM. While these are brilliant and open source, they don't integrate as smoothly with the desktop as iPhoto and Aperture do. For iPhoto and Aperture, I have tried creating a shared library on the server (a tad laggy), and also putting a library on a drive with no permissions and letting each client read from it; however, if they want to put images into the library, it only supports one user at a time writing to it... Any ideas what could fulfill our needs? Or is it time to bite the bullet for Final Cut Server? Thanks in advance.

    Read the article

  • Change Management Software

    - by Andrew
    I manage an 80,000-user CIS application written in Uniface. Every form in the application, and many of its processes, are represented by .frm files. We have hundreds of these files and 5 instances of the application. The instances include multiple production installations which must be kept in sync. We do not get MD5 sums from our vendor for files that are released to us as patches. We have been using a spreadsheet to track changes, but this is far from ideal. Is there a commercial application that can be purchased that will allow us to track changes to the instances? Thank you all! EDIT: Patches are released as zip files containing FRM files, SQL files, or a mix of both. The SQL files contain statements that need to be run in Oracle. Patches are also assigned unique patch numbers.
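    Even without a commercial tool, the missing MD5 sums can be generated locally so instances can be diffed against each other. A sketch with hypothetical instance paths:

        import hashlib
        from pathlib import Path

        def checksum_frm_files(instance_root):
            # Map each .frm file (relative path) to its MD5 digest.
            root = Path(instance_root)
            return {str(f.relative_to(root)): hashlib.md5(f.read_bytes()).hexdigest()
                    for f in root.rglob("*.frm")}

        prod1 = checksum_frm_files("/apps/cis/prod1")
        prod2 = checksum_frm_files("/apps/cis/prod2")

        # Files in prod1 whose contents differ from (or are missing in) prod2.
        drift = sorted(f for f in prod1 if prod2.get(f) != prod1[f])
        print(drift)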

    Read the article

  • Unix Password Management Keyring

    - by Phil
    I am looking for a password manager for a command-line Unix environment. So far, all I can find are keyring applications with graphical interfaces for Windows, Linux, and Mac, but no command-line Unix ones. My main goal is to be able to access a password keyring through an SSH connection to a machine that has no graphical user interface. If there are no good Unix password keyrings out there, what would be a better way to store personal passwords in a central location?
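    One simple central-store pattern (the mechanism behind tools like pass, offered here as a suggestion) is a GPG-encrypted text file on the central host, decrypted on demand over SSH. A sketch that shells out to the gpg CLI; the file path is a placeholder:

        import os, subprocess

        STORE = os.path.expanduser("~/.passwords.gpg")

        def read_store():
            # gpg -d decrypts to stdout, prompting for the key passphrase.
            result = subprocess.run(["gpg", "-d", STORE],
                                    capture_output=True, text=True, check=True)
            return result.stdout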

    Read the article

  • Server Room Protocols/Server Room Management

    - by Matthew E
    Hi, I'm new to this site but have found the articles and feedback very useful. We have a server room which our organisation owns and controls, yet there are several third-party companies that have open access to this room. As such, we have been asked to put together a protocol paper that stipulates the standards we expect to be adhered to when working in this room. Other than the monitoring of UPS loads, air cooling functionality, alarm systems, etc., does anyone have any guidance on the kind of issues that need to be documented to make this protocol all-encompassing? I'm thinking along the lines of not leaving cardboard or other combustibles in the room, not having food and drink in the room, not altering the fabric of the building by drilling through walls, etc. Many thanks in advance for any guidance provided.

    Read the article
