Search Results

Search found 48844 results on 1954 pages for 'first steps'.


  • Digikam: What's the problem?

    - by Unapiedra
    I installed Digikam from the Philip5 PPA. When I run it through gdb I get the error below:

        Starting program: /usr/bin/digikam
        /usr/bin/digikam: error while loading shared libraries: libcxcore.so.2.1: cannot open shared object file: No such file or directory
        [Inferior 1 (process 29894) exited with code 0177]

    What should I do to find and fix the error? I can see that libcxcore.so.2.1 is wanted but not found. Is this an error in the PPA, or can I simply point Digikam at the right location? Can I raise an issue with the PPA creator through Launchpad? Some next steps would be quite helpful.
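
    A quick way to confirm exactly which libraries are unresolved, and which package should provide them (a minimal sketch; libcxcore comes from OpenCV 2.1, and the second step assumes installing the apt-file tool is acceptable):

        # List every shared library digikam links against that the loader cannot find
        ldd /usr/bin/digikam | grep "not found"

        # Search the package index for whatever ships the missing library
        sudo apt-get install apt-file && sudo apt-file update
        apt-file search libcxcore.so.2.1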

    Read the article

  • Text reverses on remote gnome session

    - by Andrew Stern
    I have two computers running 10.04. The first machine is a wired desktop running sshd; the second is a wifi-connected laptop with the ssh client. When I use my laptop to bring up a remote GNOME session on my desktop, all the text gets reversed. Steps:

        1. Log in as a user on the laptop to activate the wifi with a stored key.
        2. Go to a console with Ctrl-Alt-F1.
        3. Run xterm -- :1 to bring up a blank graphic session.
        4. Run ssh -Y user@desktopmachine gnome-session.

    This shows reversed text and messes up the keyboard so I can't type.

    Read the article

  • Why does starting a program log me out?

    - by Bruce McKean
    I'm not a computer nerd, but for the past two years I've been able to fix Ubuntu/Linux problems with a Google search. This one has me stumped. I upgraded to 12.04 about thirty days ago and all was well, except that every time I tried to load KeePassX it would drop back to the login screen. I installed KeePass2 and all seemed to work. Last week I tried to load Bibble5 (raw photo editor) and it would try to load and then go back to the login screen. After a few days I gave up and downloaded Corel AfterShot Pro (Corel now owns Bibble5), and it has the same problem. Could someone please point me toward the steps I need to follow to find the cause? I'm interested in learning more about Linux and how to correct any future problems like this.

    Computer specification:
        Processor: 8x Intel(R) Core(TM) i7-2600 CPU @ 3.40GHz
        Memory: 8155MB (1092MB used)
        Operating System: Ubuntu 12.04 LTS
        Graphics Card: GeForce GT 520/PCIe/SSE2
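
    When launching a program throws the whole session back to the login screen, the X session usually leaves a trace of why. A first-pass diagnostic sketch (the keepassx binary name is an assumption):

        # Run the program from a terminal to capture its own output
        keepassx

        # After logging back in, inspect the tail of the X session log
        tail -n 50 ~/.xsession-errors

        # GPU/X server crashes often show up here as well
        grep -i -e error -e segfault /var/log/Xorg.0.log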

    Read the article

  • On Adopting the Mindset of an Enterprise DBA

    Although many of the important tasks a DBA has to perform should be done 'by hand', keying in commands or using SSMS, the canny DBA with a heavy workload will always have an eye to automating routine tasks wherever possible, or using a tool.

    Read the article

  • DBA Best Practices - A Blog Series: Episode 1 - Backups

    - by Argenis
    This blog post is part of the DBA Best Practices series, in which various topics of concern for daily database operations are discussed. Your feedback and comments are very much welcome, so please drop by the comments section and leave your thoughts on the subject.

    Morning Coffee

    When I was a DBA, the first thing I did when I sat down at my desk at work was check that all backups had completed successfully. It really was more of a ritual, since I had a dual system in place to check for backup completion: 1) the scheduled agent jobs that backed up the databases were set to alert the NOC on failure, and 2) I had a script run from a central server every so often to check for any backup failures. Why the redundancy, you might ask. Well, for one, I was once bitten by the fact that database mail doesn't work 100% of the time. Potential causes for failure include issues on the SMTP box that relays your server email, firewall problems, DNS issues, etc. So to be sure that my backups completed fine, I needed a mechanism other than having the servers do the talking - I needed to interrogate the servers and ask each one if an issue had occurred. That is why I had a script run every so often. Some of you might have monitoring tools in place, like Microsoft System Center Operations Manager (SCOM) or similar 3rd-party products, that track all these things for you. But at that moment, we had no recourse but to write our own PowerShell scripts to do it.

    Now, it goes without saying that if you don't have backups in place, you might as well find another career. Your most sacred job as a DBA is to protect the data from disaster, and only properly safeguarded backups can offer you peace of mind here.

    "But, we have a cluster... we don't need backups"

    Sadly, I've heard this line more often than I would have liked. You need to understand that a cluster is built on shared storage, and that is precisely your single point of failure. A cluster will protect you from an issue at the operating-system level, and also from an outage of any SQL-related service or dependent device. But it will most definitely NOT protect you against corruption, nor will it protect you against somebody deleting data from a table - accidentally or otherwise.

    Backups, fine. How often do I take a backup?

    The answer to this is something you will hear frequently when working with databases: it depends. What does it depend on? For one, you need to understand how much data your business is willing to lose. This is what's called the Recovery Point Objective, or RPO. If you don't know how much data your business is willing to lose, you need to have an honest and realistic conversation about data-loss expectations with your customers, internal or external. From my experience, their first answer to the question "how much data loss can you withstand?" will be "zero". In that case, you will need to explain how zero data loss is very difficult and very costly to achieve, even in today's computing environments. Do you want to take full backups of all your databases every hour, or even every day? Probably not, because of the impact that taking a full backup can have on a system. That's what differential and transaction log backups are for. Have I answered the question of how often to take a backup? No, and I did that on purpose. You also need to think about how much time you have to recover from any event that requires you to restore your databases. This is what's called the Recovery Time Objective, or RTO. Again, if you ask your customer how long an outage they can withstand, at first you will get a completely unrealistic number - and that will be your starting point for discussing a solution that is cost effective. The point I'm trying to get across is that you need to have a plan. This plan needs to be practiced, and tested. Like a football playbook, you need to rehearse the moves you'll perform when the time comes. How often is up to you, and the objective is that you feel confident about yourself and the steps you need to follow when an emergency strikes.

    A backup is nothing more than an untested restore

    Backups are files. Files are prone to corruption. Put those two together and consider how you feel about those backups sitting on that network drive. When was the last time you restored any of them? Restoring your backups on another box - which, by the way, doesn't have to match the specs of your production server - will give you two things: 1) peace of mind, because now you know that your backups are good, and 2) a place to offload your consistency checks with DBCC CHECKDB or any of the other DBCC commands like CHECKTABLE or CHECKCATALOG. This is a great strategy for VLDBs that cannot withstand the additional load created by the consistency checks. If you choose to offload your consistency checks to another server, though, be sure to run DBCC CHECKDB WITH PHYSICAL_ONLY on the production server, and if you're using SQL Server 2008 R2 SP1 CU4 and above, be sure to enable trace flags 2562 and/or 2549, which will speed up the PHYSICAL_ONLY checks further - you can read more about this enhancement here.

    Back to the "how often" question for a second. If you have the disk, the network latency, and the system resources to do so, why not back up the transaction log often? As in, every 5 minutes, or even less than that? There's not much downside to doing it, as you will have to clear the log with a backup sooner rather than later, lest you risk running out of space on your tlog, or even your drive. The one drawback to this approach is that you will have more files to deal with at restore time, and processing each file will add a bit of extra time to the entire process. But it might be worth that time, knowing that you minimized the amount of data lost. Again, test your plan to make sure it matches your particular needs.

    Where to back up to? Network share? Locally? SAN volume?

    This is another topic where everybody has a favorite choice. So I'll stick to mentioning what I like to do and what I consider to be the best practice in this regard. I like to back up to a SAN volume, i.e., a drive that actually lives in the SAN and can be easily attached to another server in a pinch, saving you valuable time - you wouldn't need to restore files over the network (slow) or pull drives out of a dead server (been there, done that, it's also slow!). The key is to have a copy of those backup files made quickly and, if at all possible, sent to a remote target in a different datacenter - or even the cloud. There are plenty of solutions out there that can help you put such a solution together. That right there is the first step towards a practical Disaster Recovery plan. But there's much more to DR, and that's material for a different blog post in this series.
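
    As a concrete illustration of the frequent log backups and offloaded checks described above, here is a minimal sketch using the sqlcmd utility (the server names, paths, and SalesDB database are hypothetical placeholders):

        # Back up the transaction log; schedule this every few minutes via SQL Agent or cron
        sqlcmd -S PRODSERVER -Q "BACKUP LOG SalesDB TO DISK = N'T:\Backups\SalesDB_log.trn'"

        # On production, run only the lightweight physical checks
        sqlcmd -S PRODSERVER -Q "DBCC CHECKDB (SalesDB) WITH PHYSICAL_ONLY"

        # On a test box, prove the backups restore, then run the full consistency check there
        sqlcmd -S TESTSERVER -Q "RESTORE DATABASE SalesDB FROM DISK = N'T:\Backups\SalesDB_full.bak' WITH REPLACE"
        sqlcmd -S TESTSERVER -Q "DBCC CHECKDB (SalesDB)"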

    Read the article

  • Why is the Touchpad tab missing?

    - by U47
    So my Dell Vostro 3360 came out of the box with an Alps touchpad which was detected as PS/2. Through these steps, I was able to get as far as it appearing to be recognized as a touchpad. However, I do not get the Touchpad tab in the Mouse/Touchpad settings. Two-finger scrolling appears to work, and Fn-F3 does disable the touchpad, but the settings are so brutal that it is almost impossible to work with - very difficult to scroll at all accurately. I'm hoping there are some settings which can be tweaked to help with this. Any ideas? My xinput listing is as follows:

        ⎡ Virtual core pointer                    id=2    [master pointer (3)]
        ⎜   ↳ Virtual core XTEST pointer          id=4    [slave pointer (2)]
        ⎜   ↳ GlidePoint Virtual Touchpad         id=12   [slave pointer (2)]

    Thanks,
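
    If the synaptics X driver is the one bound to the pad, its runtime knobs can be listed and tuned directly even when the GUI tab is missing. A sketch (the threshold values are purely illustrative, not recommendations):

        # List every tunable parameter the synaptics driver exposes
        synclient -l

        # Example: raise the pressure thresholds to calm an over-sensitive pad
        synclient FingerLow=30 FingerHigh=40

        # Or inspect the device's properties through xinput
        xinput list-props "GlidePoint Virtual Touchpad"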

    Read the article

  • How to register your Office365 30-day trial

    - by ybbest
    In this post, I'd like to show you how to register for your Office365 30-day trial. It is extremely easy, and the steps are as follows:

        1. Go to the Office365 home page.
        2. Click on Free Trial and choose the options depending on your business requirements; I will choose Plan E3.
        3. Fill in the details and create your trial.
        4. Choose your access account, then click Save and continue.
        5. Here is the landing page for your Office 365 account.

    Read the article

  • Upgrade Ubuntu Server from 11.04 to 11.10 without internet connection

    - by Tony Marciano
    We have application software that requires Ubuntu Server 11.10, and I need to upgrade several 11.04 servers to that version. Two questions:

        1. The servers that need to be upgraded have no Internet access in our datacenter, for security reasons. I need to download the updates/upgrades to a secure system and then transfer them to the datacenter servers for installation. Is anyone aware of the steps involved?
        2. How and where do I get the 11.10 updates? I don't see an option on the Ubuntu site for downloading specific versions of the OS and/or upgrades.
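
    One route that avoids giving the datacenter servers any Internet access is upgrading from the 11.10 alternate install image, which (assuming it matches other Ubuntu releases of that era) ships an offline upgrade script at the root of the image. A minimal sketch, with an illustrative ISO filename:

        # Copy the alternate ISO to the server, then mount it
        sudo mkdir -p /media/cdrom
        sudo mount -o loop ubuntu-11.10-alternate-amd64.iso /media/cdrom

        # Launch the release upgrade from the mounted image
        sudo /media/cdrom/cdromupgrade

    This covers the release upgrade itself; post-release security updates would still need to be mirrored on a connected machine (for example with apt-mirror) and transferred separately.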

    Read the article

  • Ethernet on an Acer Aspire One D255E not working

    - by Dustin
    OK, very frustrated here, but hopefully I can find an answer. I have an Acer Aspire One D255E netbook. I installed 10.04 Desktop Edition via live USB, only to find that Ethernet does not connect. The thing is, I had previously installed 10.04 through Wubi on a 30GB partition, but wanted more space, so I did this full install. With the Wubi install the Internet worked great; I added LibreOffice, the Mint menu, etc. So I know it can work, and it works in Windows 7 (with what little space is left). I have Ethernet enabled and gave myself all permissions, but it is still not working. Thank you for any help, and the more detail in the steps I need to take, the better. :)
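
    A few commands usually narrow down whether this is a driver problem or a configuration problem (a diagnostic sketch; nothing model-specific is assumed):

        # Is the Ethernet controller visible on the PCI bus, and which driver claims it?
        lspci -nnk | grep -A3 -i ethernet

        # Does the kernel see a wired interface at all?
        sudo lshw -C network

        # Any link/driver messages when the cable is plugged in?
        dmesg | grep -i -e eth -e link | tail -n 20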

    Read the article

  • Touchpad in Sony Vaio E14 - cannot click with a finger and drag with another - Ubuntu 12.04

    - by Nabeel
    I had the following issues with my Sony Vaio E14115's touchpad on Ubuntu 12.04:

        1. Right click not working
        2. Vertical scrolling not working

    Then I followed these steps to fix it. I downloaded a patch for psmouse and ran:

        sudo apt-get install dkms build-essential
        cd ~/Desktop
        tar jxvf psmouse-3.2.0-24-generic-pae.tar.bz2
        sudo mv psmouse-3.2.0-24-generic-pae /usr/src
        cd /usr/src
        sudo chmod -R a+rx psmouse-3.2.0-24-generic-pae
        sudo dkms add -m psmouse -v 3.2.0-24-generic-pae
        sudo dkms build -m psmouse -v 3.2.0-24-generic-pae
        sudo dkms install -m psmouse -v 3.2.0-24-generic-pae
        sudo modprobe -r psmouse
        sudo modprobe psmouse

    This fixed the right-click and vertical-scrolling problems. But I still cannot click and drag, i.e., hold the left mouse button and drag with another finger. The pointer doesn't move! Maybe the second finger isn't detected. When I check the capabilities of my Synaptics touchpad with:

        xinput list-props "SynPS/2 Synaptics TouchPad" | grep Capabilities

    I get the following output:

        Synaptics Capabilities (304): 1, 1, 1, 1, 1, 1, 1

    Please help me out.
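
    One way to tell whether the hardware ever reports the second finger is to watch the raw kernel events (a diagnostic sketch; evtest may need installing first):

        # Install the input event viewer, then pick the touchpad from the menu it prints
        sudo apt-get install evtest
        sudo evtest

    If the capability list evtest prints shows no ABS_MT_* (multi-touch) axes, the driver is not reporting a true second finger, and click-with-one-finger, drag-with-another cannot work without a further driver patch.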

    Read the article

  • Quantifying the cost of a software project

    - by The Elite Gentleman
    Disclaimer: I didn't know exactly where to put this question. If you feel it is not suitable for Programmers @ StackExchange, feel free to migrate it. Background: broadening my last question, there is an open request for tender for a software system, and I have decided to take it on. I am a software developer and engineer by profession, and in this tender process I have to put a price on my bid. I have been provided documentation consisting of functional and non-functional requirements only. I have to put a project manager's cap on and think of all aspects, e.g. the cost of implementing the project, the resources needed, etc. My question is: is there a project framework I can follow that breaks the project cycle into steps with corresponding costs, or how would I best go about calculating/approximating the cost of the project?

    Read the article

  • How exactly does XNA's SpriteBatch work?

    - by David Gouveia
    To be more precise, if I needed to recreate this functionality from scratch in another API (e.g. OpenGL), what would it need to be capable of doing? I do have a general idea of some of the steps, such as how it prepares an orthographic projection matrix and creates a quad for each draw call. I'm not too familiar, however, with the batching process itself. Are all quads stored in the same vertex buffer? Does it need an index buffer? How are different textures handled? If possible, I'd be grateful if you could guide me through the process from when SpriteBatch.Begin() is called until SpriteBatch.End(), at least when using the default Deferred mode.

    Read the article

  • Unity 3D coding language, C# or JavaScript [on hold]

    - by hemantchhabra
    Hello to the gaming community. I am a budding game designer, learning to code for the first time in my life. I did learn C++ in school, 8 years back, so I sort of understand the logic when people are coding and can even suggest the right route, but I can't really code myself. I am beginning to learn coding for Unity 3D. Which language do you suggest as more versatile and easier to work with for the future? Because I am a game designer, not a coder, I will only do the coding until I have someone else to code for me. It should be easy and fast to learn, functional and universal to apply, and innovative at the same time. C# or JavaScript? Thank you for your time. PS: if you could suggest steps to learn and tutorials to look for, that would be just awesome.

    Read the article

  • SOA Suite Integration: Part 3: Loading files

    - by Anthony Shorten
    One of the most common scenarios in SOA integration is loading a file into the product from an external source. Oracle SOA Suite provides a File Adapter that can feed many file types into your BPEL process. For this example I will use the File Adapter to load a file of users and emails to update the user object within the Oracle Utilities Application Framework. Remember, you can repeat this process with other objects and other file types; again, I am illustrating the ease of integration.

    The first thing is to create an empty BPEL process to hold our flow. In Oracle JDeveloper this is achieved by choosing the Define Service Later template (the other templates have predefined inputs and outputs, and in this case we want to specify those ourselves). So I will create a simpleFileLoad process to house our flow.

    You start with an empty canvas, so you first specify the load part of the process using the File Adapter. Select the File Adapter from the Component Palette under BPEL Services and drag and drop it onto the left-hand Partner Links area (left is input). Name the service - in this case I chose LoadFile - and press Next. We will define the interface as part of the wizard, so select Define from operation and schema (specified later) and press Next. Choose Read File to denote that we will read the file, accept the default operation name of Read, and press Next.

    The next step is to tell the adapter the location of the files, how to process them, and what to do with them after they have been processed. I am using hardcoded locations in this example, but you can use logical locations as well. Press Next. Now tell the adapter how to recognize the files to load. In my case I am using CSV files and, more importantly, I am telling the adapter to run the process for each record it encounters in the file. Press Next. Then tell the adapter how often to poll for files; I have taken the defaults. Press Next.

    At this stage there is no description of the format of the input, so I invoke the Native Format Wizard, which guides me through creating the file input format. Clicking the purple cog icon starts the wizard. After an introduction screen (not shown), you specify the format of the input file. The File Adapter supports multiple format types; for this example I use Delimited, since I am loading a CSV file. Press Next.

    The best way for the wizard to work is with a sample. I have a sample file, and the wizard asks how much of the file to use as a template; I use the defaults. Note: if your data uses a character set other than US-ASCII, this is the point at which you specify the character set. Press Next. The sample contains multiple instances of a single record type (the wizard supports complex types as well), so choose the appropriate setting for the file. Press Next.

    You then specify the file element and the record element. These are used by the wizard to translate the CSV data into an XML structure (this will make sense later). I am using LoadUsers as my file delimiter (root element) and User Record as my record root element. Press Next. As the file is CSV, the delimiter is ",", and I also specify that the end-of-line (EOL) indicator marks the end of a record. Press Next.

    Up to this point you have not given the columns their names. In my case the sample includes the column names in its first record. This is not always the case, and you can specify the names and formats of columns in this dialog (not shown). Press Next. The wizard now generates the schema for the input file, and you can give the schema a name; I used userupdate.xsd. To verify the schema, press Test: specify an input sample and press the green play button, and you will see the delimiters you specified earlier for the file and the records. Press OK to continue. A confirmation screen shows the location of the schema in your project. Press Finish to return to the File Adapter configuration, where the schema and elements are now prepopulated from the wizard. Press Next. The File Adapter configuration is complete; press Finish.

    Now you need to receive the input from the LoadFile component, so place a Receive node in the BPEL process by dragging the Receive component from the Component Palette under BPEL Constructs onto the process. Link the Receive node with the LoadFile component by dragging the leftmost connection point of the Receive node to the LoadFile component. Once the link is established, name the Receive node appropriately and, as in the previous post in this series, generate input variables for the BPEL process to hold the input records.

    Next, add the product web service. The process is the same as described in the previous post in this series: drop the Web Service BPEL service onto the right side of the process and fill in the WSDL URL. You also add an Invoke node to call the service and generate the input and output variables for the call in the Invoke node.

    Now, to get the input from the file to the service, you use a Transform (you could use an Assign action, but a Transform action is more flexible). Drag the Transform component from the Component Palette under Oracle Extensions and place it between the Receive and Invoke nodes. Name the Transform node Mapper File, make the source of the mapping the schema from the Receive node, and make the output the input variable from the Invoke node.

    Now build the transform. First map the user and email attributes by dragging the elements from the left to the right. The reason we need the transform is that we must tell the AS-User service to issue an update action. Remember, when we registered the service we used Read as the default action; if we do not tell the service otherwise, it will use Read instead of the desired Update. To specify the update action, click the transactionType node on the right and select Set Text, setting a transactionType of UPD (for update). The mapping is now complete.

    The final BPEL process is ready for deployment. Deploy it to the server and test the service by simply dropping a file, matching the pattern/name you specified, into the directory you configured in the File Adapter. You will see each record as a separate instance entry in the Fusion Middleware Control console. You can now load files into the product, and you can repeat this process for each type of file to process. While this was a simple example, it illustrates how loading data can be achieved using SOA Suite in conjunction with our products.
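
    For reference, a hypothetical input file matching the wizard choices above (comma-delimited, column names in the first record, one user per record) would look like this:

        user,email
        FRED01,fred.smith@example.com
        MARY02,mary.jones@example.com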

    Read the article

  • Cloud Integration White Paper - Now Available

    - by Bruce Tierney
    Interested in expanding your existing application infrastructure to integrate with cloud applications? Download the new Oracle white paper "Cloud Integration - A Comprehensive Solution" to learn not just about connectivity but about the other key aspects of successful cloud integration. The paper includes three technical examples of cloud integration, with Oracle Fusion Applications, Salesforce, and Workday, and follows with the importance of taking a comprehensive approach that also covers service aggregation, service virtualization, cloud security considerations, and the benefit of maintaining a unified approach to monitoring and management despite an increasingly distributed hybrid infrastructure. To keep the integration architecture from being defined "accidentally" as new business units subscribe to additional cloud vendors outside the participation of IT, a discussion of the "Accidental SOA Cloud Architecture" is included. The table of contents spans both high-level awareness of key considerations and a technical deep dive into the steps needed for cloud integration connectivity. Hope you find the white paper valuable. Please download from the following link

    Read the article

  • Webcast: Flow Manufacturing Work Order-less Completion

    - by ChristineS-Oracle
    Webcast: Flow Manufacturing Work Order-less Completion
    Date: August 27, 2014 at 11:00 am ET, 10:00 am CT, 9:00 am MT, 8:00 am PT, 8:30 pm India Time (Mumbai, GMT+05:30)

    This advisor webcast is intended for technical and functional users who want to understand Flow Manufacturing Work Order-less Completion and its transaction types. The presentation will include the required setups and provide details of the business process. Topics will include:

        - Overview of Flow Manufacturing and integration with Work In Process
        - Overview of Work Order-less Completion
        - Basic setup steps
        - Transactions performed in the Work Order-less Completion form

    Details & Registration: Doc ID 1906749.1

    Read the article

  • Understanding Data Binding for Windows Phone 7

    - by nikhil
    I want to develop a simple app for the Windows Phone 7 platform. It's basically a vocabulary-based game that involves the user moving word tiles from one area to another to score points. I want to know the best way of tying the UI to the game's backend. I saw the Windows Phone 7 jumpstart videos; they touch on data binding but don't really go into any depth. I'm a newbie and don't have any experience designing the architecture for a phone app. It'd be great if someone could explain what steps I should be taking, or point me to a resource where I can learn more.

    Read the article

  • New White Papers Available

    - by mattande
    New Master Data Services white papers are now available on MSDN. For an application-agnostic overview of Master Data Management, see "Organizational Approaches to Master Data Management". For the steps needed to configure Master Data Services to work with a SharePoint workflow, see "SharePoint Workflow Integration with Master Data Services".

    Read the article

  • CRM Goes to School, Supports Enrollment Growth

    - by Tony Berk
    At Post University in Waterbury, CT, the focus is on the student. Generally, the first interaction with a potential student is a lead, which can come from a variety of sources. Any delay in following up with the interested student (the lead) affects the conversion success rate, i.e., the likelihood of enrollment. By implementing Oracle CRM On Demand, Post University automated the admissions process so that the admissions counselors are in direct contact with the students, and eliminated many manual steps. The admissions and marketing teams, as well as the students, benefit from the new streamlined process. Up next, Post University plans to increase the efficiency of its student retention processes with the expansion of Oracle CRM On Demand. Take a look at the video to learn more about Post University's Oracle CRM On Demand implementation. Congrats to Post University, and Apex IT, their implementation partner, on the successful implementation!

    Read the article

  • How to SET TIMING ON for parallel upgrades to 12c?

    - by Mike Dietrich
    Have you asked yourself how to get timings for all statements during an Oracle Database 12c upgrade? When you run the parallel upgrade via catctl.pl, the Perl script that drives the parallel upgrade in Oracle Database 12c, you may also want timings written to your logfile during execution. As catctl.pl does not offer an option for this yet, the best way to achieve it is to edit the catupses.sql script in $ORACLE_HOME/rdbms/admin, as this script gets called over and over again throughout all steps of the upgrade run. Just add the SET TIMING ON block shown below to catupses.sql and start your upgrade:

        Rem =============================================
        Rem Call Common session settings
        Rem =============================================
        @@catpses.sql

        Rem =============================================
        Rem Set Timing On during the Upgrade
        Rem =============================================
        SET TIMING ON;

        Rem =============================================
        Rem Turn off PL/SQL event used by APPS
        Rem =============================================
        ALTER SESSION SET EVENTS='10933 trace name context off';

    -Mike
    PS: This may become the default in a future patch set.
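
    For context, a typical parallel upgrade invocation looks like the following sketch (the degree of parallelism is illustrative, and the paths assume you run it from the new 12c home):

        # Run the parallel upgrade with 4 worker processes; with catupses.sql edited
        # as above, the statements in each phase now report elapsed time in the logs
        cd $ORACLE_HOME/rdbms/admin
        $ORACLE_HOME/perl/bin/perl catctl.pl -n 4 catupgrd.sql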

    Read the article

  • Crashing trying to install Ubuntu 12.04 LTS

    - by Daniel Evans
    Hardware: Dell Inspiron 1545

    Steps are as follows:

        1. Insert 64-bit Ubuntu 12.04 disc
        2. Boot computer

    Output is as follows:

        error: unexpectedly disconnected from boot status daemon
        Generating locales...
          en_US.UTF-8... done
        Generation complete.
        MEMORY-ERROR: glib-compile-schemas[569]: GSlice: assertion failed: aligned_memory == (gpointer) addr
        Aborted
        pwconv: failed to change the mode of /etc/passwd- to 0600
        MEMORY-ERROR: [996]: GSlice: assertion failed: aligned_memory == (gpointer) addr
        MEMORY-ERROR: glib-compile-scehmas[1034]: GSlice: assertion failed: aligned_memory == (gpointer) addr
        Aborted
        /usr/lib/python2.7/dist-packages/LanguageSelector/LocaleInfo.py:256: UserWarning: Failed to connect to socket /var/run/dbus/system_bus_socket: No such file or directory
          warnings.warn(msg.args[0].encode('UTF-8'))
        Using CD-ROM mount point /cdrom
        ... etc etc ...

    I end up at a prompt line: ubuntu@ubuntu:~$
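
    GSlice assertion failures during an install very often point to bad install media or bad RAM rather than an Ubuntu bug, so two quick checks are worth running first (a sketch; the ISO filename is illustrative):

        # Verify the downloaded image against the published checksums
        md5sum ubuntu-12.04-desktop-amd64.iso
        # Compare the result with the MD5SUMS file on releases.ubuntu.com

    The install disc's boot menu also offers a "Check disc for defects" entry and a memtest86+ memory test; both are worth a pass before retrying the install.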

    Read the article

  • Java Developers: Open-source Modules, Great Tools, Opportunity.

    - by Paul Sorensen
    The role of Java developer may just be better than ever. An excellent article in Java Magazine discusses the availability of web-based tools that help development teams more effectively manage their projects and modules. If you are a Java developer you should definitely read this article. I especially like the Expert Opinions scattered throughout the article. These highlight real-world usage of the latest and greatest development tools.  As you consider steps to move your career forward, consider Java certification. Oracle has over 15 unique Java certification credentials available. The process of becoming certified in Java and preparing for your exams will require you to study, learn and practice (code). All of this activity will help you sharpen your skills and increase your working knowledge of Java - making you a better developer and more valuable member of your team. You can use the Certification Finder on the Oracle certification homepage to find a Java certification that is right for you. Thanks! 

    Read the article

  • Top 10 Oracle Solaris How To Articles

    - by Glynn Foster
    While generating new technical content for Oracle Solaris 11 is one of our higher priorities here at Oracle, it's always fun to have a look at some web stats to see what existing published content is popular among our audience. So here's the top ten as voted by your browsers. Interestingly, it's a great mix of technologies. What's your favourite? Let us know!

        1. Taking your first steps with Oracle Solaris 11
        2. How to get started creating Zones on Oracle Solaris 11
        3. How to script Oracle Solaris 11 Zone creation for a network in a box configuration
        4. How to configure Oracle Solaris 11 using the sysconfig command
        5. How to update Oracle Solaris 11 systems using Support Repository Updates
        6. How to perform system archival and recovery with Oracle Solaris 11
        7. Introducing the basics of IPS on Oracle Solaris 11
        8. How to update to Oracle Solaris 11.1 using IPS
        9. How to set up Automated Installer services on Oracle Solaris 11
        10. How to live install from Oracle Solaris 10 to Oracle Solaris 11 11/11

    Read the article

  • Total Cloud Control for Systems - Webcast on April 12, 2012 (18:00 CET/5pm UK)

    - by Javier Puerta
    Total Cloud Control Keeps Getting Better

    Join Oracle Vice President of Systems Management Steve Wilson and a panel of Oracle executives to find out how your enterprise cloud can achieve 10x improved performance and 12x operational agility. Only Oracle Enterprise Manager Ops Center 12c allows you to:

        - Accelerate mission-critical cloud deployment
        - Unleash the power of Solaris 11, the first cloud OS
        - Simplify Oracle engineered systems management

    You'll also get a chance to have your questions answered by Oracle product experts and to dive deeper into the technology by viewing our demos, which trace the steps companies like yours take as they transition to a private cloud environment. Register today for this interactive keynote and panel discussion.

    Agenda:
        18:00 CET (5:00 pm UK) - Keynote: Total Cloud Control for Systems
        18:45 CET (5:45 pm UK) - Panel discussion with Oracle hardware, software, and support executives
        19:15 CET (6:15 pm UK) - Demo series: A Step-by-Step Journey to Enterprise Clouds

    Read the article

  • My Generation Drivers DropdownList Empty

    - by hmloo
    I just installed MyGeneration 1.3.1 on my Windows Vista machine, and when I try to set up the default settings there are no drivers to select from, although it works fine on my Windows XP machine. I eventually found the answer through a search engine: the problem is caused by multiple .NET Frameworks coexisting, with MyMeta.dll registered under .NET v2.0 when it should be registered under .NET v4.0. So we have to register the DLL manually. Please follow these steps:

        1. Run Visual Studio Command Prompt (2010) with "Run as Administrator".
        2. Enter the command: regasm.exe "C:\Program Files\MyGeneration13\MyMeta.dll" /tlb:MyMeta.tlb
        3. Exit the Command Prompt when the DLL is successfully registered.
        4. Run MyGeneration and the problem should disappear.

    Hope this helps. Thanks for reading.

    Read the article
