Search Results

Search found 4001 results on 161 pages for 'operating'.

Page 105/161

  • How to Transfer All Your Information to a New PS3

    - by Justin Garrison
    The PlayStation 3 now costs half the price, has double the storage, and uses half the power. If you need another reason to upgrade, Sony also makes it easy to transfer all of your information to a new console. Transferring all of your games, data, and settings is easier than ever, and all you need is an ethernet cable. Read on as we walk you through the whole process of setting up your new PS3 and wiping all your information off the old one.

    Read the article

  • Language Niches and Niche Libraries

    - by Roman A. Taycher
    "Everyone Knows" ... ... that c is widely used for low level programs in large part because operating system/device apis are usually in c. ... that Java is widely used for enterprise applications in large part because of enterprise libraries and ide support. ... that ruby is widely used for webapps thanks in large part because of rails and its library ecosytem But lets go into to details what are the specific niches and subniches. Especially with respect to libraries. Where might you embed lua for application scripting versus python. Where would you use Java vs C#. Which languages do different scientists use? Also which languages have libraries for these subniches? Things like bioperl/scipy/Incanter. Please no flamewars about how nice each language or environment is. This is where they used. Also no complaints about marketing/PHBs. (Manually migrated) I asked this question again after it was closed on stackoverflow.com

    Read the article

  • What is the right way to Windows 7/Ubuntu 10.10 Dual-Triple Boot Partitioning for Laptop OEM?

    - by Denja
    Hi Linux community, I find myself struggling with the ever slow and buggy Windows OS once again. It's time to change to Ubuntu 10.10 64-bit as a much faster operating system. My laptop's hard disk has RECOVERY and HP_TOOLS partitions, both primary, and I have the System Recovery DVD for 64-bit Windows should anything happen. Here's the layout I used with Windows before: * (C:) Windows 7 system partition, NTFS, 284.89GB (Primary, Boot, Pagefile, Dump) * HP_TOOLS system partition, FAT32, 99MB (Primary) * (D:) RECOVERY partition, NTFS, 12.90GB (Primary) * SYSTEM partition, NTFS, 199MB (Primary). Here's the layout I want to make, based on your answers: * (C:) Windows 7 system partition, NTFS, 60GB (Primary) (sda1) * (D:) Windows DATA partition (user files), NTFS, 120GB (Primary) (sda2); I want to share this with Linux * Linux root, ext4, 100GB (Primary) (sda3) (Ubuntu 10.10 64-bit) * Linux swap, RAM size, 3GB (sda4) * Linux root, ext3, 15.9GB (Extended) (sda5) (openSUSE or Puppy). Here is my new Ubuntu 10.10 64-bit layout in use now: * SYSTEM partition, NTFS, 199MB (Primary) (sda1) - "Partition 1 does not end on cylinder boundary." (?) * (C:) Windows 7 system partition, NTFS, 90GB (Primary) (sda2) * (D:) Windows 7 RECOVERY partition, NTFS, 12.90GB (Primary) (sda3) * Linux system partition, EXTENDED, 195GB (Logical) * Linux root, ext4, 10GB (Extended) (sda5) * Linux home, ext3, 185GB (Extended) (sda6). I didn't know if I could wipe all previous partitions when I installed Ubuntu because of the RECOVERY partition, so I just made space for my extended partition by deleting HP_TOOLS (FAT32). By doing this I managed to make and successfully install Ubuntu 64-bit, but I couldn't actually make the partition for the swap or a third Linux OS. Question 1: What is the right way to do Windows 7/Ubuntu 10.10 dual/triple-boot partitioning for an OEM laptop? Thank you in advance for your advice and suggestions, and Happy New Year to all!
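
    For reference, a layout like the one wanted above can be scripted with parted from a live CD. The sketch below is only an illustration, assuming the disk is /dev/sda with an MBR (msdos) partition table and the sizes quoted in the question - adjust the numbers and back up first. Note that MBR allows at most four primary partitions, so the swap and the third Linux root are placed inside an extended partition here.

      sudo parted /dev/sda unit GB print                          # confirm the current layout first
      sudo parted /dev/sda mkpart primary ntfs 0% 60GB            # sda1: Windows 7 system
      sudo parted /dev/sda mkpart primary ntfs 60GB 180GB         # sda2: shared DATA partition (NTFS is readable from Linux)
      sudo parted /dev/sda mkpart primary ext4 180GB 280GB        # sda3: Ubuntu root
      sudo parted /dev/sda mkpart extended 280GB 100%             # container for the logical partitions
      sudo parted /dev/sda mkpart logical linux-swap 280GB 283GB  # sda5: swap (about RAM size)
      sudo parted /dev/sda mkpart logical ext3 283GB 100%         # sda6: second Linux root
      # mkpart only creates the partitions; format them afterwards with
      # mkfs.ntfs, mkfs.ext4, mkfs.ext3 and mkswap before installing.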

    Read the article

  • How to set up an inter-OS partition?

    - by Confuzzled Persun
    I need a working partition configuration for use and accessibility on both Ubuntu and Windows. I have an 8GB USB flash drive onto which I am installing Ubuntu 11.10 so that I can have a personal bootable OS wherever I go. I've installed Ubuntu several times, but I just can't seem to get this one partition right. This is my own configuration: Partition 1: Primary - 200MB - Beginning - Ext4 - /boot Partition 2: Primary - 1300MB - End - swap area Partition 3: Logical - 5200MB - Beginning - Ext4 - / Partition 4: Logical - 1258MB - Beginning - Ext4 - /home Partition 5: Logical - 42MB - End - FAT32? - /windows? What I want to do is to get partition 5 configured so I can access it on both the installed Ubuntu system and a Windows system (when the USB drive is connected while Windows is booted). Basically, what I want is Ubuntu installed on the USB drive along with a partition that I can access with other operating systems. I'm thinking I just need the technical configuration of "Use as:" and "Mount point:" for my final partition. But I don't know. Any help with this is appreciated. And any other tips are appreciated as well.
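
    For the last partition, a small FAT-formatted slice with a fixed mount point is one workable approach, since Windows, macOS and Linux can all read FAT; in the installer this corresponds roughly to "Use as: FAT32 file system" with a mount point such as /windows. A minimal sketch from the installed Ubuntu system follows - the device name /dev/sdb5 is an assumption, so check lsblk or sudo fdisk -l first.

      sudo mkfs.vfat -n SHARED /dev/sdb5     # mkfs.vfat picks FAT16 or FAT32 based on size; Windows reads both
      sudo mkdir -p /windows
      # Add a line like this to /etc/fstab so Ubuntu mounts it at /windows on boot:
      #   /dev/sdb5  /windows  vfat  defaults,uid=1000,gid=1000,umask=022  0  0
      sudo mount /windows
      # Note: some Windows versions only expose the first partition of a removable
      # USB stick, so placing the shared FAT partition first on the drive may be
      # necessary for it to appear there.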

    Read the article

  • Introducing Oracle Multitenant

    - by OracleMultitenant
    The First Database Designed for the Cloud: Today Oracle announced the general availability (GA) of Oracle Database 12c, the first database designed for the Cloud. Oracle Multitenant, new with Oracle Database 12c, is a key component of this – a new architecture for consolidating databases and simplifying operations in the Cloud. With this, the inaugural post in the Multitenant blog, my goal is to start the conversation about Oracle Multitenant. We are very proud of this new architecture, which we view as a major advance for Oracle. Customers, partners and analysts who have had previews are very excited about its capabilities and its flexibility. This high-level review of Oracle Multitenant will touch on our design considerations and how we re-architected our database for the cloud. I'll briefly describe our new multitenant architecture and explain its key benefits. Finally, I'll mention some of the major use cases we see for Oracle Multitenant. Industry Trends: We always start by talking to our customers about the pressures and challenges they're facing and what trends they're seeing in the industry. Some things don't change. They face the same pressures and the same requirements as ever: pressure to do more with less; be faster, leaner, cheaper, and deliver services 24/7. Big companies have achieved scale; now they want to realize economies of scale. As ever, DBAs are faced with the challenges of patching and upgrading large numbers of databases, and provisioning new ones. Requirements are familiar: performance, scalability, reliability and high availability are non-negotiable. They need ever more security in this threatening climate. There's no time to stop and retool with new applications. What's new are the trends. These are the techniques to use to respond to these pressures within the constraints of the requirements. With the advent of cloud computing and the availability of massively powerful servers – even engineered systems such as Exadata – our customers want to consolidate many applications into fewer, larger servers. There's a move to standardized services – even self-service. Consolidation: Consolidation is not new; companies have tried various different approaches to consolidation of databases in the cloud. One approach is to partition a powerful server between several virtual machines, one per application. A downside of this is that you have the resource and management overheads of OS and RDBMS per VM – that is, per application. Another is that you have replaced physical sprawl with virtual sprawl, and virtual sprawl is still expensive to manage. In the dedicated database model, we have a single physical server supporting multiple databases, one per application. So there's a shared OS overhead, but RDBMS process and memory overhead are replicated per application. Let's think about our traditional Oracle Database architecture. Every time we create a database, be it a production database, a development or a test database, what do we do?
We create a set of files, we allocate a bunch of memory for managing the data, and we kick off a series of background processes. This is replicated for every one of the databases that we create. As more and more databases are fired up, these replicated overheads quickly consume the available server resources and this limits the number of applications we can run on any given server. In Oracle Database 11g and earlier the highest degree of consolidation could be achieved by what we call schema consolidation. In this model we have one big server with one big database. Individual applications are installed in separate schemas or table-owners. Database overheads are shared between all applications, which affords maximum consolidation. The shortcomings are that application changes are often required. There is no tenant isolation. One bad apple can spoil the whole batch. New Architecture & Benefits: In Oracle Database 12c, we have a new multitenant architecture, featuring pluggable databases. This delivers all the resource utilization advantages of schema consolidation with none of the downsides. There are two parts to the term "pluggable database": "pluggable", which is new, and "database", which is familiar. Before we get to the exciting new stuff let's discuss what hasn't changed. A pluggable database is a fully functional Oracle database. It's not watered down in any way. From the perspective of an application or an end user it hasn't changed at all. This is very important because it means that no application changes are required to adopt this new architecture. There are many thousands of applications built on Oracle databases and they are all ready to run on Oracle Multitenant. So we have these self-contained pluggable databases (PDBs), and as their name suggests, they are plugged into a multitenant container database (CDB). The CDB behaves as a single database from the operations point of view. Very much as we had with the schema consolidation model, we only have a single set of Oracle background processes and a single, shared database memory requirement. This gives us very high consolidation density, which affords maximum reduction in capital expenses (CapEx). By performing management operations at the CDB level – "managing many as one" – we can achieve great reductions in operating expenses (OpEx) as well, but we retain granular control where appropriate. Furthermore, the "pluggability" capability gives us portability and this adds a tremendous amount of agility. We can simply unplug a PDB from one CDB and plug it into another CDB, for example to move it from one SLA tier to another. I'll explore all these new capabilities in much more detail in a future posting. Use Cases: We can identify a number of use cases for Oracle Multitenant. Here are a few of the major ones.
Development / Testing, where individual engineers need rapid provisioning and recycling of private copies of a few "master test databases"; Consolidation of disparate applications using fewer, more powerful servers; Software as a Service, deploying separate copies of identical applications to individual tenants; Database as a Service, typically self-service provisioning of databases on the private cloud; Application Distribution from ISV / Installation by Customer, eliminating many typical installation steps (create schema, import seed data, import application code PL/SQL…) - just plug in a PDB!; High volume data distribution, literally via disk drives in envelopes distributed by truck! - distribution of things like GIS or MDM master databases; …and various others! Benefits: Previous approaches to consolidation have involved a trade-off between reductions in Capital Expenses (CapEx) and Operating Expenses (OpEx), and they've usually come at the expense of agility. With Oracle Multitenant you can have your cake and eat it: minimize CapEx (more applications per server); minimize OpEx (manage many as one, standardized procedures and services, rapid provisioning); maximize agility (cloning for development and testing, portability through pluggability, scalability with RAC); ease of adoption (applications run unchanged). It's a pure deployment choice: neither the database backend nor the application needs to be changed. In future postings I'll explore various aspects in more detail. However, if you feel compelled to devour everything you can about Oracle Multitenant this very minute, have no fear. Visit the Multitenant page on OTN and explore the various resources we have available there. Among these, Oracle Distinguished Product Manager Bryn Llewellyn has written an excellent, thorough, and exhaustively detailed White Paper about Oracle Multitenant, which is available here. Follow me: I tweet @OraclePDB #OracleMultitenant

    Read the article

  • Setting up Cluster Configuration using an existing web server as a Primary Node?

    - by RapidWebs
    Thanks in advance for any help! I am having a slight issue and need help with the decision-making process when it comes to setting up my cluster configuration, consisting of a line of Ubuntu Servers (12.04). We currently have a primary node, which resides in the US within a datacenter, but we are going to be using this for all serious bandwidth- and resource-intensive websites, and through a configuration of Virtualmin + Webmin it will be set up as a sort of pseudo-cluster, using Virtualmin's Cluster Modules. Anyways, on to the issue: we also have a business line set up locally, with three servers. Here are their specs: Intel P4 2.4 GHz, 1GB RAM, 110 GB SATA, Ubuntu 12.04*; AMD 1.3 GHz, 512MB RAM, 20 GB IDE; P3 Xeon 800MHz (dual physical processors), 1GB RAM, 3 x 25 GB RAID configuration (one in use for the host operating system). The first machine is currently IN USE and is serving virtual hosts off a sub-domain. My question is this: how can I integrate the secondary node (which will be the primary node, per se, in this smaller configuration), which is currently in use, into the cluster configuration with the other two servers for sharing resources, redundancy (HA?), and NFS with the two RAID disks - without having to FORMAT the secondary node, start fresh moving all my services into a DRBD network drive or something similar, and then restore all active Virtualmin virtual hosts? The idea is that I want minimal downtime for people currently being served from server2.mywebsite.com, and from what I understand, all services need to be on an NFS so that they can be mounted on demand and accessed from the other machine taking over (i.e. Heartbeat + DRBD config), but my issue is that I already have all these services installed in their default directory structure. How can I most easily set up this NFS and HA system, move all my desired services to this new drive, and do it with minimal downtime and without breaking Virtualmin and everything else on my server? Even just some pointers, a thread I could read, or a step-by-step checklist or rundown of commands I could issue to get started would be great! Thanks!
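
    For the NFS piece on its own, the basic shape is sketched below; the hostnames, paths and subnet are placeholders rather than details taken from the question, and the DRBD/Heartbeat layer on top is a separate exercise.

      # On the node that will export the storage (placeholder path /srv/vhosts):
      sudo apt-get install nfs-kernel-server
      echo '/srv/vhosts 192.168.1.0/24(rw,sync,no_subtree_check)' | sudo tee -a /etc/exports
      sudo exportfs -ra

      # On the node that should be able to take over the virtual hosts:
      sudo apt-get install nfs-common
      sudo mkdir -p /srv/vhosts
      sudo mount -t nfs storage-node:/srv/vhosts /srv/vhosts

      # Services can then be migrated into the shared path one at a time
      # (rsync while live, then a short cutover) instead of reformatting
      # the node that is currently serving server2.mywebsite.com.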

    Read the article

  • Ubuntu 12.04 Server ping gateway responds with destination host unreachable

    - by blckblttkd
    I consider myself fairly adept with Ubuntu and Linux, but this one has me stumped. I built up a Xen server using Ubuntu 12.04 as the base operating system. It has multiple domUs running on it. My home network has a statically defined network where I had all the network connectivity working just peachy. The server was moved to a permanent home this morning, so the network configuration on the main system had to change. Again, another static network, but now I can't ping the upstream gateway from the host. As the VMs use this NIC over a bridge, they too are broken. Ping responds with "destination host unreachable." I simplified the networking down to a simple static network as seen below (no bridge or anything) just to get it to work. Here's the contents of my /etc/network/interfaces file: auto lo iface lo inet loopback auto eth0 iface eth0 inet static address 216.7.188.228 gateway 216.7.188.225 netmask 255.255.255.240 broadcast 216.7.188.255 network 216.7.188.0 dns-nameservers 8.8.8.8 8.8.4.4 Here's the contents of route -n: 0.0.0.0 216.7.188.225 0.0.0.0 UG 100 0 0 eth0 216.7.188.224 0.0.0.0 255.255.255.240 U 0 0 0 eth0 And the results of pinging the gateway: PING 216.7.188.225 (216.7.188.225) 56(84) bytes of data. From 216.7.188.228 icmp_seq=1 Destination Host Unreachable From 216.7.188.228 icmp_seq=1 Destination Host Unreachable From 216.7.188.228 icmp_seq=1 Destination Host Unreachable Again, this worked in one network flawlessly (obviously with different parameters in the interfaces file). I did try using eth1, as there are two NICs on the server, in case the MAC address got flipped on boot-up. No success there. Yes, the cable is in the right port now :) Any thoughts? I appreciate the help!
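
    As a worked example of the /28 arithmetic for comparison with the stanza above: with a 255.255.255.240 netmask, the block containing 216.7.188.228 runs from 216.7.188.224 through 216.7.188.239, so an interfaces stanza consistent with that mask would look like the sketch below (address and gateway copied from the question; the network and broadcast values follow from the mask).

      auto eth0
      iface eth0 inet static
          address   216.7.188.228
          netmask   255.255.255.240
          network   216.7.188.224      # first address of the /28 block
          broadcast 216.7.188.239      # last address of the /28 block
          gateway   216.7.188.225
          dns-nameservers 8.8.8.8 8.8.4.4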

    Read the article

  • How to Get a Smartphone-Style Word Suggestion on Windows

    - by Zainul Franciscus
    Have you ever wished that you could type faster and better in Windows? Then you're in luck, because today we'll show you how to get smartphone-style word suggestions in Windows. To accomplish that, you need to install AI Type, a piece of software that offers word suggestions as you write in Windows. AI Type not only gives us the smartphone-style word suggestions we wanted for Windows, it also improves our writing by suggesting words according to their context. It will also try to match words according to the probability with which other users have used them. Installing AI Type is a breeze; just download the installer from the AI Type website, run the executable, fill in a registration form, and you're all set to use AI Type for your daily writing. Once you're done with the installation, AI Type appears in your system tray.

    Read the article

  • Internet is far slower in Ubuntu than Windows 7 on dual-booted machine

    - by Tim
    Edit: I'll leave the original post as-is, but after further investigation, it appears that the problem is something to do with my wi-fi card. Speeds are normal when I connect via cable. Edit 2: The problem was solved. It was something to do with the wireless card drivers. I normally use Windows 7 on my laptop and get internet speeds of about 15-20 Mb/s. I have recently dual-booted with Ubuntu 12.10, and have noticed that internet speeds are drastically slower in Ubuntu. When tested, speeds range from 0.2-2 Mb/s, though they are occasionally significantly faster than that, or even stop completely for short periods of time. I've also noticed that when first booting into Ubuntu, speeds start fairly fast and drop to a crawl within a few seconds to a few minutes. There's still some possibility that the issue may be with my ISP, as things seem slower than usual even in Windows, but I suspect that it is related to Ubuntu, as things are far slower in Ubuntu than in Windows. I'm wondering, what could be the cause of this? Potentially relevant information: I've dual-booted before on this machine with earlier versions of Ubuntu (different ISP at the time) with no problem. ISP: Rogers (major Canadian ISP). System info (Gateway NV53a laptop): Operating System MS Windows 7 Home Premium 64-bit; CPU AMD Phenom II N970 Caspian 45nm Technology; RAM 6.00 GB Dual-Channel DDR3 @ 664MHz (9-9-9-24); Motherboard Gateway SJV51_DN (Socket S1G4); Graphics Generic PnP Monitor (1366x768@60Hz), ATI Mobility Radeon HD 4250 (Acer Incorporated [ALI]); Hard Drives 733GB TOSHIBA TOSHIBA MK7559GSXP ATA Device (SATA). Networking info: Connected through Wi-Fi, Atheros AR5B97 Wireless Network A

    Read the article

  • Resolution stuck in 640x480 in grub, 11.04 and 12.04

    - by user89797
    I have three operating systems on my machine: Windows 7 x64, and Ubuntu 11.10 and 12.04, both x64 as well. All three were running at full resolution for my monitor, as was the GRUB 1.99 boot screen. After booting into Windows, I rebooted my machine and found my GRUB resolution was suddenly 640x480. Booting into both versions of Ubuntu, I find myself stuck at that resolution as well. I made no driver changes recently, and hadn't even booted into the 11.10 build in a month or more. I've gone through both proprietary Nvidia driver options for my card (GeForce 9800GT) as well as the open source drivers in 12.04, to no avail. I can't figure out what could have caused this change in both versions of Ubuntu and GRUB simultaneously. Windows 7 is unaffected, so I think that safely rules out hardware failure. EDIT: OK, so I couldn't boot any graphical live disks. I tried Ubuntu 12.04 i386 and x64 as well as the 12.10 beta x64, and all of them would flash the initial logo, go to a blank screen with a flashing cursor in the upper left, and then my display would die. I managed to boot 12.04 Server and get into recovery. I reinstalled GRUB and went into recovery mode for my 12.04 build. If I boot in safe graphics mode I can get 1280x768, but as soon as I reboot it's broken again. I've tried reinstalling the Nvidia drivers and that leaves me with a system stuck at a maximum of 640x480. None of these changes have had any impact on the 11.10 build, which is still stuck at 640x480. Given that I can push a somewhat higher resolution in 12.04, and full resolution in Windows 7, I'm pretty convinced it's not an issue of my monitor failing. It must be something to do with the graphics drivers. I can't figure out what could be the issue, though. I'm especially perplexed that I can't boot any live images.
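
    On the GRUB side specifically, the menu resolution can be pinned rather than auto-detected; a minimal sketch of the relevant settings follows, where the 1920x1080 value is an assumption to be replaced with the monitor's native mode.

      # /etc/default/grub
      GRUB_GFXMODE=1920x1080          # force this mode for the GRUB menu
      GRUB_GFXPAYLOAD_LINUX=keep      # carry the same mode into the kernel handoff

      # then regenerate the configuration:
      sudo update-grub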

    Read the article

  • How to view/mount other partitions on your hard drive

    - by Preston Zacharias
    Recently I installed Ubuntu 12.04 Beta 2 on a USB flash drive and decided to install it on an old external HDD, which I have taken out of its casing and successfully mounted in my desktop computer. There is no other operating system besides the newly installed Ubuntu. However, there is about 500GB of data on the drive. This is why I used partitioning software on my Windows 7 netbook to partition the hard drive, setting aside 1TB for files, 350GB of space for Linux, and the remaining 650GB for Vista, which I plan on installing soon. But this is where the problem sets in: when installing Ubuntu, it does not recognize that the drive is partitioned at all; it's just one big open block of space. So I used the installer's built-in partitioning feature to set aside 300GB for the main Ubuntu install and 50GB for swap space. I set both of these partitions to be created at the "end" so that it wouldn't delete or write over my data. And this is where I am really lost: when booting into Ubuntu I am able to use it perfectly fine, got on the internet, etc., but I have NO CLUE as to how I can view the files that were previously on the drive (all of the data that I had prior to the install). How can I mount/be able to view the other partition so that I can have access to my data? Thank you ahead of time! I REALLY appreciate any help or advice! ~Preston
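
    A minimal sketch of finding and mounting the pre-existing data partition from the installed Ubuntu system; the device name /dev/sdb1 and the NTFS type are assumptions, so check the output of the first two commands before mounting.

      sudo fdisk -l                                  # list disks and their partitions
      sudo blkid                                     # show filesystem types and labels
      sudo mkdir -p /media/data
      sudo mount -t ntfs-3g /dev/sdb1 /media/data    # use -t vfat or ext4 as appropriate
      ls /media/data                                 # the old files should be visible here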

    Read the article

  • Non use of persisted data – Part deux

    - by Dave Ballantyne
    In my last blog I showed how persisted data may not be used if you have used the base data in an include on an index. That wasn't the only problem I've had that showed the same symptom. Using the same code as before, I was executing something similar to the below: select BillToAddressID,SOD.SalesOrderDetailID,SOH.CleanedGuid from sales.salesorderheader SOH join Sales.SalesOrderDetail SOD on SOH.SalesOrderID = SOD.SalesOrderID But, due to a distribution error in statistics, I found it necessary to use a table hint. In this case, I wanted to force a loop join: select BillToAddressID,SOD.SalesOrderDetailID,SOH.CleanedGuid from sales.salesorderheader SOH inner loop join Sales.SalesOrderDetail SOD on SOH.SalesOrderID = SOD.SalesOrderID But, being the diligent T-SQL developer that I am, looking at the execution plan I noticed that the 'compute scalar' operator was again calling the function. Again, Profiler is a more graphic way to view this... All very odd: just because I've forced a join, which has NOTHING to do with my persisted data, something is causing the data to be re-evaluated. Not sure if there is any easy fix you can do to the T-SQL here, but again it's a lesson learned (or rather reinforced): examine the execution plan of every query you write to ensure that it is operating as you thought it would.

    Read the article

  • SPARC Solaris Momentum

    - by Mike Mulkey-Oracle
    Following up on the Oracle Solaris 11.2 launch on April 29th, if you were able to watch the launch event, you saw Mark Hurd state that Oracle will be No. 1 in high-end computing systems "in a reasonable time frame". "This is not a 3-year vision," he continued. Well, according to IDC's latest 1QCY14 Tracker, Oracle has regained the #1 UNIX Shipments Marketshare! You can see the report and read about it here: Oracle regains the #1 UNIX Shipments Marketshare, but suffice to say that SPARC Solaris is making strong gains on the competition. If you have seen the public roadmap through 2019 of Oracle's commitment to continue to deliver on this technology, you can see that Mark Hurd's comment was not to be taken lightly. We feel the systems tide turning in Oracle's direction and are working hard to show our partner community the value of being a part of the SPARC Solaris momentum. We are now planning for the Solaris 11.2 GA in late summer (11.2 beta is available now), as well as doing early preparations for Oracle OpenWorld 2014 on September 28th. Stay tuned there! Here is a sampling of the coverage highlights around the Oracle Solaris 11.2 launch: "Solaris is still one of the most advanced platforms in the enterprise." – ITBusinessEdge; "Oracle is serious about clouds now, just as its customers are, whether they are building them in their own datacenters or planning to use public clouds." – EnterpriseTech; "Solaris is more about a layer of an integrated system than an operating system." – ZDNet

    Read the article

  • browser without gpu support

    - by manuzhang
    Google has an Easter egg that draws a 3D graph, but when I tried it out on Chrome it complained about no WebGL support. I've also tested it on Firefox, whose WebGL support was enabled, but ended up with the same problem. Thus, I suspect it's an issue with my GPU. Some googling led me to chrome://gpu and here's what I got: Graphics Feature Status Canvas: Software only, hardware acceleration unavailable HTML Rendering: Software only, hardware acceleration unavailable 3D CSS: Unavailable. Hardware acceleration unavailable WebGL: Unavailable. Hardware acceleration unavailable WebGL multisampling: Unavailable. Hardware acceleration unavailable Problems Detected GPU process was unable to boot. Access to GPU disallowed. GL driver is software rendered. Accelerated compositing is disabled.: 59302 Mesa drivers in linux older than 7.11 are assumed to be buggy. Accelerated 2d canvas is unstable in Linux at the moment. Version Information Data exported Tue Apr 10 2012 18:35:57 GMT+0800 (CST) Chrome version 18.0.1025.151 (Official Build 130497) Operating system Linux 3.0.0-0300-generic Software rendering list version 1.27 ANGLE revision 988 2D graphics backend Skia I wonder what each of these problems implies and how I might properly deal with them. I'm using Ubuntu 11.04
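
    Since one of the reported problems is the Mesa-version blacklist, a quick way to see which driver and Mesa version are actually in use is sketched below (glxinfo comes from the mesa-utils package; package names are as in the Ubuntu archive of that era).

      sudo apt-get install mesa-utils
      glxinfo | grep -E "OpenGL (vendor|renderer|version)"   # renderer and Mesa version in use
      glxinfo | grep "direct rendering"                      # "Yes" means 3D acceleration is available to X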

    Read the article

  • Installing Ubuntu 12.04 on a single GPT SSD which contains Windows 7

    - by Gary
    I recently bought a brand new 64-bit PC with an (ASUS) motherboard that supports UEFI and a GPT-formatted 240GB SSD, which contains Windows 7 in the first of 3 (80GB) partitions. When the system arrived, it booted into Windows 7 like a dream, with no problems. I did not originally want Windows, but the manufacturer does not work with Linux (of any flavour), so I thought I would install Ubuntu into the second partition and dual boot. I downloaded the 12.04 64-bit version and proceeded to install. Having selected 'install', the screen became corrupted, with multicoloured garbage across the middle third of the screen!! So, I rebooted and --- MISSING OPERATING SYSTEM!!! The only way I can now get into Windows is via Super Grub2. First question: what went wrong? Second: will Ubuntu install on a GPT disk partition? Third: will it install alongside Windows 7 without screwing up the boot mechanism? Fourth: how do I do it? I have scoured the internet looking for appropriate answers and found NONE! Please help.......
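
    On the GPT question: Ubuntu does install onto GPT disks, provided the installer is booted the same way the firmware boots Windows (UEFI mode here), or a small BIOS-boot partition exists for legacy mode. A quick way to check the current state from an Ubuntu live session is sketched below; /dev/sda is an assumption.

      [ -d /sys/firmware/efi ] && echo "booted in UEFI mode" || echo "booted in legacy BIOS mode"
      sudo parted /dev/sda print      # "Partition Table: gpt" confirms the GPT label
      sudo efibootmgr -v              # lists the firmware boot entries (works only when booted in UEFI mode)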

    Read the article

  • Overwhelmed by complex C#/ASP.NET project in Visual Studio 2008

    - by Darren Cook
    I have been hired as a junior programmer to work on projects that extend existing functionality in a very large, complex solution. The code base consists of C#, ASP.NET, jQuery, javascript, html and xml. I have some knowledge of all these in addition to fair knowledge of object-oriented programming and its fundamental concepts of inheritance, abstraction, polymorphism and encapsulation. I can follow code up through its base classes, interfaces, abstract classes and understand a large part of the code that I read while doing this. However, this solution is so humongous and so many things get tied together whenever I navigate through the code that I feel absolutely overwhelmed. I often find myself unable to fully follow everything that is going on with objects being serialized, large amounts of C# and javascript operating on the same pages and methods being called from template files that consist mainly of markup. I love learning about code, but trying to deal with this really stresses me out. Additionally, I do know that a significant amount of unit testing has been done but I know nothing about unit testing or how to utilize it. Any advice anyone could offer me regarding dealing with a large code base while using Visual Studio 2008 would be greatly appreciated. Are there tools that I can use to help get a handle on what is going on? Perhaps there are things even in Visual Studio that I am not aware of. How can I follow the code to low level functionality in order to get a better grasp of what is going on at a high level?

    Read the article

  • June 2012 Critical Patch Update for Java SE Released

    - by Eric P. Maurice
    Hi, this is Eric Maurice. Oracle just released the June 2012 Critical Patch Update for Java SE. This Critical Patch Update provides 14 new security fixes across Java SE products. As discussed in previous blog entries, Critical Patch Updates for Java SE will, for the foreseeable future, continue to be released on a separate schedule from that of other Oracle products due to previous commitments made to Java customers. 12 of the 14 Java SE vulnerabilities fixed in this Critical Patch Update may be remotely exploitable without authentication. 6 of these vulnerabilities have a CVSS Base Score of 10.0. In accordance with Oracle's policies, these CVSS 10 scores represent instances where a user running a Java applet or Java Web Start application has administrator privileges (as is typical on Windows XP). When the user does not run with administrator privileges (typical on the Solaris and Linux operating systems), the corresponding CVSS impact scores for Confidentiality, Integrity, and Availability for these vulnerabilities would be "Partial" instead of "Complete", thus lowering these CVSS Base Scores to 7.5. Due to the high severity of these vulnerabilities, Oracle recommends that customers obtain and apply these security fixes as soon as possible: developers should download the latest release at http://www.oracle.com/technetwork/java/javase/downloads/index.html; Java users should download the latest release of the JRE at http://java.com; and of course Windows users can take advantage of the Java Automatic Update to get the latest release. In addition, Oracle recommends removing old and unused versions of Java, as the latest version is always the recommended version since it contains the most recent enhancements and bug and security fixes. For more information: •Instructions on removing older (and less secure) versions of Java can be found at http://java.com/en/download/faq/remove_olderversions.xml •Users can verify that they're running the most recent version of Java by visiting: http://java.com/en/download/installed.jsp •The Advisory for the June 2012 Critical Patch Update for Java SE is located at http://www.oracle.com/technetwork/topics/security/javacpujun2012-1515912.html

    Read the article

  • Oracle Exalogic Customer Momentum @ OOW'12

    - by Sanjeev Sharma
    [Adapted from here] At Oracle OpenWorld 2012, I sat down with some of the Oracle Exalogic early adopters to discuss the business benefits these businesses were realizing by embracing the engineered-systems approach to data-center modernization and application consolidation. Below is an overview of the 4 businesses that won the Oracle Fusion Middleware Innovation Award for Oracle Exalogic this year. Company: Netshoes. About: Leading online retailer of sporting goods in Latin America. Challenges: Rapid business growth resulted in frequent outages and poor response time of the online storefront; a conventional ad-hoc approach to horizontal scaling resulted in high CAPEX and OPEX; poor performance and unavailability of the online storefront resulted in revenue loss from purchase abandonment. Solution: Consolidated ATG Commerce and Oracle WebLogic running on Oracle Exalogic. Business Impact: Reduced abandonment rates resulting in a two-digit increase in online conversion rates, translating directly into revenue up-lift. Company: Claro. About: Leading communications services provider in Latin America. Challenges: Support business growth over the next 3-5 years while maximizing re-use of existing middleware and application investments with minimal effort and risk. Solution: Consolidated Oracle Fusion Middleware components (Oracle WebLogic, Oracle SOA Suite, Oracle Tuxedo) and Java applications onto Oracle Exalogic and Oracle Exadata. Business Impact: Improved partner SLAs 7x while improving throughput 5x and response time 35x for Java applications. Company: UL. About: Leading safety testing and certification organization in the world. Challenges: Transition from being a non-profit to a profit-oriented enterprise and grow from $1B to $5B in annual revenues in the next 5 years; undertake a massive business transformation by aligning change strategy with execution. Solution: Consolidated Oracle Applications (E-Business Suite, Siebel, BI, Hyperion) and Oracle Fusion Middleware (AIA, SOA Suite) on Oracle Exalogic and Oracle Exadata. Business Impact: Reduced financial and operating risk in re-architecting IT services to support new business capabilities supporting 87,000 manufacturers. Company: Ingersoll Rand. About: Leading manufacturer of industrial, climate, residential and security solutions. Challenges: Business continuity risks due to complexity in enforcing consistent operational and financial controls; reactive business decisions reduced the ability to offer differentiation and compete. Solution: Consolidated Oracle E-Business Suite on Oracle Exalogic and Oracle Exadata. Business Impact: Service differentiation with faster order provisioning and a shorter lead-to-cash cycle, translating into higher customer satisfaction and quicker cash conversion. Check out the winners of the Oracle Fusion Middleware Innovation awards in other categories here.

    Read the article

  • Microsoft releases Visual Studio 2010 SP1

    - by brian_ritchie
    Microsoft has been beta testing SP1 since December of last year. Today, it was released to MSDN subscribers and will be available for public download on March 10, 2011. The service pack includes a slew of fixes and a number of new features: Silverlight 4 support; basic unit testing support for the .NET Framework 3.5; a Performance Wizard for Silverlight; IntelliTrace for 64-bit and SharePoint; IIS Express support; SQL CE 4 support; Razor support; HTML5 and CSS3 support (IntelliSense and validation); WCF RIA Services V1 SP1 included; Visual Basic Runtime embedding; and ALM improvements. Of all the improvements, IIS Express probably has the largest impact on web developer productivity. According to Scott Gu, it provides the following: it's lightweight and easy to install (less than 10Mb download and a super quick install); it does not require an administrator account to run/debug applications from Visual Studio; it enables a full web-server feature set – including SSL, URL Rewrite, Media Support, and all other IIS 7.x modules; it supports and enables the same extensibility model and web.config file settings that IIS 7.x supports; it can be installed side-by-side with the full IIS web server as well as the ASP.NET Development Server (they do not conflict at all); and it works on Windows XP and higher operating systems – giving you a full IIS 7.x developer feature-set on all OS platforms. IIS Express (like the ASP.NET Development Server) can be quickly launched to run a site from a directory on disk. It does not require any registration/configuration steps. This makes it really easy to launch and run for development scenarios. Good stuff indeed. This will make our lives much easier. Thanks Microsoft...we're feeling the love!

    Read the article

  • Why is purchasing Microsoft licences such a daunting task? [closed]

    - by John Nevermore
    I've spent two frustrating days jumping through hoops and browsing through different local e-shops for VS (Visual Studio) 2010 Pro and WHS (Windows Home Server) FPP 2011 licenses. I found jack... or, to be more precise, the closest I found in my country was WHS OEM 2011 licenses, after multiple emails sent to individuals found on the Microsoft partners page. Question being: why is it so difficult to get your hands on Microsoft licenses as an individual? Sure, you can get the latest end-user operating systems from most shops, but when it comes to development tools or server software you are left dry. And companies that do sell licenses most of the time don't even put up pricing or a self-service environment for buying the licenses; you need to have a hawk's eye for that shiny little Microsoft partner logo and spam through a bunch of emails, not knowing if you can count on them to get the license or not. Sure, I could whip out my credit card and buy the VS 2010 license on the online Microsoft Shop. Well whippideegoddamndoo, they sell that, but they don't sell WHS 2011 licenses. Why does a company make it so hard to buy their products? Let's not even talk about the licensing itself being a pain.

    Read the article

  • NetBeans has broken interface with OpenJDK

    - by Krzysztof Stanislawek
    Sorry for my English. Usually (say, 4 out of 5 times) when I start NetBeans, the interface is partially broken. For example, menu items appear and then immediately disappear after a click, and some parts of the editor are also blocked for cursor actions. Another issue is a broken Options window - I can open it only once per NetBeans session. There are more similar issues. $ java -version java version "1.7.0_55" OpenJDK Runtime Environment (IcedTea 2.4.7) (7u55-2.4.7-1ubuntu1~0.13.10.1) OpenJDK 64-Bit Server VM (build 24.51-b03, mixed mode) The version of NetBeans is 7.0.1. The Linux version is Mint 16. I use MATE. I had the same issue on another computer, on the same operating system. I know that installing the Sun version of Java could help, but I also know that there shouldn't be such an issue with OpenJDK. Also, I had another problem with NetBeans and Sun Java 8 - NetBeans suddenly crashed before it fully started - so I want to stick with OpenJDK. What causes my problem? Should I use another version of NetBeans/OpenJDK?

    Read the article

  • Booting Ubuntu 13.10 from USB on server with no OS (Dell PowerEdge T110)

    - by user35581
    I have a Dell PowerEdge T110 (Xeon 1220v2) server with no OS. From my Mac, I was able to save the Ubuntu 13.10 server ISO (x86) and used UNetbootin to write the ISO to a USB stick (2GB). I was hoping to boot the server from USB, and all seemed to be going well; the BIOS even detected my USB drive when I plugged it in, but for some reason I'm getting a "Missing operating system" error. I checked the USB drive and it appears that UNetbootin put the correct files on it, from a cursory glance (although I'm not entirely sure what I should be looking for). Should I be able to boot a server with no OS from a UNetbootin-created Ubuntu 13.10 USB? And if so, why might the BIOS not find the right files? I had read that there is a USB Emulation setting in the BIOS, but I haven't been able to find this in the menus. My understanding is that by default, this is set to on. I might try wiping the USB stick and running UNetbootin again. The USB is formatted as MS-DOS FAT16 (should it be formatted some other way?).
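
    If wiping the stick and starting over, one alternative to UNetbootin from a Mac is writing the ISO directly with dd; a sketch follows, where the ISO filename is a placeholder and the disk number 2 is an assumption to be confirmed with diskutil list so the wrong disk isn't overwritten.

      hdiutil convert -format UDRW -o ubuntu.img ubuntu-13.10-server.iso   # produces ubuntu.img.dmg
      diskutil list                        # identify the USB stick, e.g. /dev/disk2
      diskutil unmountDisk /dev/disk2
      sudo dd if=ubuntu.img.dmg of=/dev/rdisk2 bs=1m
      diskutil eject /dev/disk2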

    Read the article

  • Why is Windows registry needed?

    - by Job
    As I have debugged problems in COM and side-by-side assemblies, and dealt with DLL hell, all while hating the Windows registry with a passion, I was wondering why it is needed. I never felt compelled to read an entire book on registry best practices and then just "get it". I have, however, used Linux and Mac OS, and looked at the ways one can install multiple versions of Python and its libraries on the same *nix computer. Because the registry has a somewhat free (albeit ugly) format, and is used for all sorts of purposes, I have never understood what essential problem it is trying to solve. For instance, Microsoft does not want you to have two different versions of MS Office installed side by side. They use the registry to enforce this during installation. This limitation is artificial, in my opinion. If they really cared to allow a different behavior, they could have adjusted their architecture accordingly. In Mac OS you can install and remove apps by just dropping them into a particular folder. So, A) What essential problem is it trying to solve? B) How do other operating systems solve it?

    Read the article

  • Can't run minecraft on ubuntu 12.04 lts [duplicate]

    - by user170011
    This question already has an answer here: How to correctly install and troubleshoot Minecraft (Client) 3 answers I was trying to run Minecraft on my laptop with Ubuntu 12.04 LTS 64-bit. I have a Lenovo IdeaPad P580 with 7.7 GB of RAM and an Intel® Core™ i7-3520M CPU @ 2.90GHz × 4. Under the graphics section of the system overview in Ubuntu it says I have none installed. My computer comes with an NVIDIA GeForce graphics card but it isn't recognized. When I start Minecraft I get this crash report: ---- Minecraft Crash Report ---- // Shall we play a game? Time: 24/06/13 7:23 PM Description: Failed to start game org.lwjgl.LWJGLException: Could not init GLX at org.lwjgl.opengl.LinuxDisplayPeerInfo.initDefaultPeerInfo(Native Method) at org.lwjgl.opengl.LinuxDisplayPeerInfo.<init>(LinuxDisplayPeerInfo.java:52) at org.lwjgl.opengl.LinuxDisplay.createPeerInfo(LinuxDisplay.java:684) at org.lwjgl.opengl.Display.create(Display.java:854) at org.lwjgl.opengl.Display.create(Display.java:784) at org.lwjgl.opengl.Display.create(Display.java:765) at net.minecraft.client.Minecraft.a(SourceFile:235) at avv.a(SourceFile:56) at net.minecraft.client.Minecraft.run(SourceFile:507) at java.lang.Thread.run(Thread.java:679) A detailed walkthrough of the error, its code path and all known details is as follows: -- System Details -- Details: Minecraft Version: 1.5.2 Operating System: Linux (amd64) version 3.5.0-34-generic Java Version: 1.6.0_27, Sun Microsystems Inc. Java VM Version: OpenJDK 64-Bit Server VM (mixed mode), Sun Microsystems Inc. Memory: 406175448 bytes (387 MB) / 514523136 bytes (490 MB) up to 1908932608 bytes (1820 MB) JVM Flags: 2 total; -Xmx2048M -Xms512M AABB Pool Size: 0 (0 bytes; 0 MB) allocated, 0 (0 bytes; 0 MB) used Suspicious classes: No suspicious classes found. IntCache: cache: 0, tcache: 0, allocated: 0, tallocated: 0 LWJGL: 2.4.2 OpenGL: ~~ERROR~~ NullPointerException: null Is Modded: Probably not. Jar signature remains and client brand is untouched. Type: Client (map_client.txt) Texture Pack: Default Profiler Position: N/A (disabled) Vec3 Pool Size: ~~ERROR~~ NullPointerException: null I can run it on different versions of Linux such as Fedora.
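
    The "Could not init GLX" line usually points at the graphics driver rather than at Minecraft itself. A minimal way to check whether a GL-capable driver is loaded on 12.04 is sketched below; package names are as they were in the 12.04 archive, and the note about Optimus is an assumption based on this being an Intel + NVIDIA laptop.

      lspci | grep -i -E "vga|3d"              # shows both the Intel and the NVIDIA GPU on hybrid laptops
      sudo apt-get install mesa-utils
      glxinfo | grep "direct rendering"        # "No" or an error here matches the GLX crash above
      sudo apt-get install nvidia-current      # proprietary NVIDIA driver from the 12.04 archive
      # On Optimus (Intel + NVIDIA) machines of this era, the bumblebee packages were
      # typically needed as well before the NVIDIA GPU could be used for GL.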

    Read the article

  • A few tips on deploying Secure Enterprise Search with PeopleSoft

    - by Matthew Haavisto
    Oracle's Secure Enterprise Search is part of PeopleSoft now. It is provided as part of the PeopleTools platform as an appliance, and is used with applications starting with release 9.2. Secure Enterprise Search is a rich and powerful search product that can enhance search and navigation in PeopleSoft applications. It also provides useful features like facets and filtering that are common in consumer search engines. Several questions have arisen about the deployment of SES and how to administer it and ensure optimum performance. People have also asked about which versions are supported on various platforms. To address the most common of these questions, we are posting this list of tips. Platform Support: SES 11.1.2.2 does not support some of the platforms supported by PeopleTools, such as Windows 2012 and AIX 7.1. However, PeopleSoft and SES can use different operating system platforms when SES is deployed on a separate machine. SES 11.2.2.2 will have the required platform support for PT 8.53 in the future. We are planning to certify PT 8.53 once the testing is complete in 8.54 development and all platform support is released for 11.2.2.2. Architecture: We recommend running SES on a separate machine (from your apps) for two reasons: 1. SES bundles specific WebLogic, Java, and Oracle DB versions and might, at a minimum, need different OS patches than PeopleSoft. By having SES run on a different machine, these prerequisites can be managed better through their lifecycles, independently for PeopleSoft and SES. 2. SES is resource intensive - it runs its own WebLogic and Oracle database. By having SES run on its own machine, sufficient resources can be allocated to SES, freeing the PeopleSoft servers from the impact of SES load patterns. We will be providing a comprehensive red paper covering PeopleSoft/SES administration in the near future, but until that is published, we'll post tips on this blog.

    Read the article
