Search Results

Search found 8902 results on 357 pages for 'hardware virtualization'.


  • Ubuntu Server Configuration -- Harddrive Partitioning

    - by black_bird
    While creating a new partition for Ubuntu Server on the 1TB NTFS hard drive currently installed in this machine, the installer is telling me that the partition must be a minimum of 52% of the drive, or roughly 521GB. I'm almost positive this will run into other data, as I have quite a bit of stuff on the drive already. Can I not make an Ubuntu Server partition of around 100GB on that drive? Why does it require so much?

    Read the article

  • Can my machine run Ubuntu (Kubuntu | Xubuntu | Mint) 12.04 WELL?

    - by Steve
    I have a 9-year-old computer packing the hardware listed below. My question is: can it run 12.04 (Ubuntu, Kubuntu, Mint or Xubuntu) WELL? I was running Ubuntu 10.10 and upgraded to 12.04 by going through each release via the update manager: 11.04 - 11.10 - 12.04. During the installation of 12.04 I saw an error message saying that part of the kernel had failed to install and set up. Later, when I tried installing a package in Synaptic, I got another error message mentioning the kernel. When I rebooted, I was told something about my video and graphics not being configured properly and that I would have to do it manually (like I know how). It gave me the option to enter the system in low-graphics mode, but it just hung. I had an old live CD of Xubuntu 10.10 around, so I used that to get into my computer and copy data over to an external hard drive. I then tried to install Xubuntu 10.10 from the live CD with the "download updates" option checked. The install process moved along a bit, then halted for about 5 hours. I rebooted my machine and tried the Xubuntu 10.10 installer WITHOUT the "download updates" option; the install completed in about 15 minutes. All of that is making me wonder if there is something about 12.04 that does not like my hardware. I'm willing to try again, but only if I know I will not have to spend hours just to get to an error message and a hosed-up system like I did last night. I also think I have a lot more RAM than is being reported in the output below; I had extra RAM installed last year. I'm not good with command-line readouts, but it seems like there should be a lot more. I wasn't thrilled with Unity. I am willing to try Kubuntu 12.04. Will I run into the same problems? What is the highest version of *ubuntu I can upgrade to? Thanks.

    CPU
    Model: Intel(R) Pentium(R) 4 CPU 2.53GHz
    Frequency: 2533.223 MHz
    L2 Cache: 512 KB
    Bogomips: 5066.44
    Numbering: family(15) model(2) stepping(7)
    Flags: fpu vme de pse tsc msr pae mce cx8 apic mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe up pebs bts cid

    RAM ($ free -mt)
                        total    used    free  shared  buffers  cached
    Mem:                 1506     891     615       0       91     521
    -/+ buffers/cache:            278    1227
    Swap:                1609       0    1609
    Total:               3116     891    2225

    Video card (lspci)
    01:00.0 VGA compatible controller: nVidia Corporation NV18 [GeForce4 MX 440 AGP 8x] (rev a2) (prog-if 00 [VGA controller])
    Subsystem: ASUSTeK Computer Inc. V9180 Magic
    Flags: bus master, 66MHz, medium devsel, latency 32, IRQ 16
    Memory at fd000000 (32-bit, non-prefetchable) [size=16M]
    Memory at f0000000 (32-bit, prefetchable) [size=64M]
    Expansion ROM at fe9e0000 [disabled] [size=128K]
    Kernel driver in use: nouveau
    Kernel modules: nouveau, nvidiafb

    Motherboard: Intel 845PE ATX 533FSB DDR333 USB2

    Read the article

  • How to handle brightness trouble on dell inspiron i14R-2265?

    - by den-javamaniac
    Hi. Recently I installed Ubuntu 10.10 32-bit on my Dell Inspiron i14R-2265, but it looks like brightness control is not working. I can change the setting "programmatically" (Fn + brightness key), but the actual screen brightness shows no effect. I've tried this advice but it didn't work for me. I have no idea how the control path works (the way I see it, the hardware is not responding to the software), so can someone suggest a solution please?

    Read the article

  • AMD to Introduce Netbook Chip in 2011

    Hardware Central: "Advanced Micro Devices plans to release a processor in its "Fusion" line that will be positioned for the netbook market, putting it in competition with the Intel Atom, and, to a lesser degree, the ARM processor."

    Read the article

  • Does "Ubuntu for Android" (12.04) work with the Samsung Galaxy S2?

    - by Charles Hadeed
    I'm trying to buy a new Android phone and I own an Ubuntu 12.04 computer. I have the choice of a Google Galaxy Nexus, a Samsung Galaxy S2, and an HTC Sensation XL. I am aware that the HTC already works with it, but I would prefer to buy the Samsung. I already have the phone hardware specifications and have checked them, but I'm not sure about the Samsung or the Nexus. So which of these phones works with Ubuntu 12.04's 'Ubuntu for Android' feature?

    Read the article

  • Should I run Ubuntu 64-bit on a laptop with 2GB of RAM?

    - by nhanb
    I'm using an Asus K43E laptop with:
    - Intel Core i3 Sandy Bridge 2.1GHz
    - 2GB DDR3
    - Onboard graphics
    On the Ubuntu download page, the 32-bit version is marked as "recommended", but the community documentation page suggests otherwise: "Unless you have specific reasons to choose 32-bit, we recommend 64-bit to utilise the full capacity of your hardware." I use my laptop mostly for Eclipse, apart from regular office applications, so does it make any difference whether I choose 32-bit or 64-bit?

    Read the article

  • Downloaded Wubi.exe but it doesn't run on my Asus 1005P Eee PC running Windows 7

    - by Manoj
    I want to install Ubuntu 12.04 LTS alongside Windows 7 on my Asus 1005P Eee PC. I tried to install it with the live USB created with "Universal-USB-Installer-1.9.0.2.exe" but failed. Alternatively, I downloaded the "wubi.exe" installer from www.ubuntu.com, but it does not run on my PC. Is this version of Ubuntu incompatible with this hardware for a side-by-side installation with Windows 7?

    Read the article

  • Why do C++ people love multithreading when it comes to performance?

    - by user1849534
    I have a question about why programmers seem to love concurrency and multi-threaded programs in general. I'm considering two main approaches here:
    - an async approach, basically based on signals, or simply the async approach described by many papers and languages (for example the new C# 5.0), together with a "companion thread" that manages the policy of your pipeline
    - a concurrent, or multi-threading, approach

    I'm thinking about the hardware here and the worst-case scenario, and I have tested these two paradigms myself. The async paradigm wins to the point that I don't understand why, 90% of the time, people talk about concurrency when they want to speed things up or make good use of their resources.

    I have tested multi-threaded and async programs on an old machine with an Intel quad-core that has no memory controller inside the CPU; the memory is managed entirely by the motherboard. In this case performance is horrible with a multi-threaded application: even a relatively low number of threads, like 3-5, can be a problem, and the application is unresponsive, slow and unpleasant. A good async approach is, on the other hand, probably not faster, but it's not worse either: my application just waits for the result and doesn't hang; it's responsive and it scales much better. I have also discovered that a context switch in the threading world is not that cheap in real-world scenarios; it is in fact quite expensive, especially when you have more than 2 threads that need to cycle and swap among each other to get computed.

    On modern CPUs the situation is not really different: the memory controller is integrated, but my point is that an x86 CPU is basically a serial machine, and the memory controller works the same way as on the old machine with an external memory controller on the motherboard. The context switch is still a relevant cost in my application, and the fact that the memory controller is integrated, or that newer CPUs have more than 2 cores, is no bargain for me.

    From what I have experienced, the concurrent approach is good in theory but not that good in practice: with the memory model imposed by the hardware it's hard to make good use of this paradigm, and it introduces a lot of issues, ranging from the use of my data structures to the joining of multiple threads. Also, neither paradigm offers any guarantee about when the task or job will be done at a certain point in time, which makes them really similar from a functional point of view.

    Given the x86 memory model, why do the majority of people suggest using concurrency with C++ rather than just an async approach? And why not consider the worst-case scenario of a computer where the context switch is probably more expensive than the computation itself?
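    A hedged illustration of the two approaches contrasted above (my own minimal sketch, not code from the original post; the workload function and iteration counts are invented for the example): a single std::async call that keeps the caller responsive versus explicitly spawning std::thread workers, whose context switches can dominate on hardware with few cores or a slow memory path.

        #include <future>
        #include <iostream>
        #include <numeric>
        #include <thread>
        #include <vector>

        // A stand-in for some CPU-bound work (hypothetical example workload).
        long long sum_range(long long lo, long long hi) {
            long long s = 0;
            for (long long i = lo; i < hi; ++i) s += i;
            return s;
        }

        int main() {
            // Async style: launch the work and keep the caller responsive;
            // the result is collected later through the future.
            std::future<long long> f =
                std::async(std::launch::async, sum_range, 0LL, 10'000'000LL);
            // ... the caller can keep doing other things here ...
            std::cout << "async result:    " << f.get() << "\n";

            // Explicit multi-threading: split the range across worker threads.
            // On hardware with few cores (or a slow memory path), the extra
            // threads and context switches can cost more than they save.
            const int workers = 4;
            std::vector<std::thread> pool;
            std::vector<long long> partial(workers, 0);
            const long long chunk = 10'000'000LL / workers;
            for (int w = 0; w < workers; ++w) {
                pool.emplace_back([&, w] {
                    partial[w] = sum_range(w * chunk, (w + 1) * chunk);
                });
            }
            for (auto& t : pool) t.join();
            std::cout << "threaded result: "
                      << std::accumulate(partial.begin(), partial.end(), 0LL) << "\n";
            return 0;
        }

    Built with a C++14 compiler (for example g++ -std=c++14 -pthread), both variants compute the same sum; which one is actually faster depends heavily on the hardware, which is exactly the point the question raises.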

    Read the article

  • Mozilla's WebAPIs are progressing; Web applications can exploit more and more of smartphones' native features

    Mozilla's WebAPIs are progressing, allowing Web applications to exploit more and more of smartphones' hardware features. The boundaries between mobile development and the Web are fading. The movement started with hosted applications meant to get around the restrictions of application stores (see elsewhere). The HTML5, CSS3 and JavaScript trio has won over more and more followers, to the point that some even predict a brighter future for it than for native development. The Mozilla Foundation is working towards its...

    Read the article

  • No screens found error with glasen/intel-driver

    - by pgcudahy
    A lot of people seem to have had success getting hardware acceleration for Intel 82852/855GM chipsets with the ppa:glasen/intel-driver. I've tried it on my Motion Computing M1400 but get a "no screens found" error. I've found one person with a similar problem who seemed to fix it, but his solution is in German and seems to involve recompiling the kernel (it's at the bottom of the comments). Can anyone see how to fix this without such drastic measures?

    Read the article

  • Intel Unveils 50-Core Supercomputing Processor

    Hardware Central: "Intel has announced a new multi-core processor, and the fact that it was introduced at the International Supercomputing Conference (ISC) instead of the consumer-oriented Computex show taking place at the same time should be an indication of its target market."

    Read the article

  • Dell Vostro 1520 wifi not working with Ubuntu 12.04

    - by user65696
    I have installed Ubuntu 12.04 on a Dell Vostro 1520 laptop, but I found that the Wi-Fi is reported as hardware locked, even though Wi-Fi works perfectly fine with Windows 7 on the same laptop. I tried upgrading the drivers as suggested in other posts/forums but could not resolve the issue. I had a similar problem with the 11.* releases as well; the last version where Wi-Fi worked was 10.*. Please help me get this resolved.

    Read the article

  • Beyond Cloud Technology, Enabling A More Agile and Responsive Organization

    - by sxkumar
    This is the second part of the blog "Clouds, Clouds Everywhere But not a Drop of Rain". In the first part I shared with you how a broad-based transformation makes cloud more than a technology initiative; in this section I will describe how it also requires people (organizational) and process changes, and these changes are as critical as the choice of the right tools and technology.

    People: Most IT organizations have a fairly complex organizational structure. There are different groups managing different pieces of the puzzle, and yet they don't always work together. Provisioning a new application may therefore require a request to float endlessly through the worlds of system administrators, DBAs and middleware admins, resulting in long delays and constant finger pointing. Cloud users expect end-to-end automation, which requires these silos to be greatly simplified, if not completely eliminated. Most customers I talk to acknowledge this problem but are quick to admit that such a transformation is hard. As hard as it may be, I am afraid that the status quo is no longer an option. Sticking to an organizational structure that was created ages ago will not only impede cloud adoption, it also risks making IT skills increasingly irrelevant in a world that is rapidly moving towards converged applications and infrastructure.

    Process: Most IT organizations today operate with a mindset that they must fully "control" access to any and all types of IT services. This in turn leads people to cling to outdated manual approval processes. While requiring approvals for scarce resources makes sense, insisting that every single request be manually approved defeats the very purpose of cloud. Not only does this cause delays, thereby at least partially negating the agility benefits, it also results in gross inefficiency. In a cloud environment, self-service access should be governed by policies and quotas that the administrators define upfront. For a cloud initiative to be successful, IT organizations MUST be ready to empower users by giving them real control rather than insisting on brokering every single interaction between users and the cloud resources.

    Technology: From a technology perspective, cloud is about consolidation, standardization and automation. A consolidated and standardized infrastructure helps increase utilization and reduces cost. Additionally, it enables a much higher degree of automation, thereby providing users the required agility while minimizing operational costs. Obviously, automation is the key to cloud. Unfortunately it hasn't received as much attention within enterprises as it should have. Many organizations are just now waking up to the criticality of automation, and it still often gets relegated to the back burner in favor of other "high priority" projects. However, it is important to understand that without the right type and level of automation, cloud will remain a distant dream for most enterprises. This in turn makes the choice of cloud management software extremely critical. For cloud management software to be effective in an enterprise environment, it must meet the following qualifications:

    Broad and Deep Solution: It should offer a broad and deep solution to enable the kind of broad-based transformation we are talking about. Its footprint must cover physical and virtual systems, as well as the infrastructure, database and application tiers. Too many enterprises choose to equate cloud with virtualization. While virtualization is a critical component of a cloud solution, it is just a component and not the whole solution. Similarly, too many people tend to equate cloud with Infrastructure-as-a-Service (IaaS). While it is perfectly reasonable to treat IaaS as a starting point, it is important to realize that it is just the first stepping stone; on its own it can only provide limited business benefits. It is actually the higher-level services, such as the (application) platform and business applications, that will bring about a more meaningful transformation to your enterprise.

    Run and Manage Your Mission-Critical Applications Efficiently: It should not only be able to run your mission-critical applications, it should do so better than before. For enterprises, applications and data are the critical business assets. As such, if you are building a cloud platform that cannot run your ERP application, it isn't truly an "enterprise cloud". Also, be wary of vendors who try to sell you the idea that your applications must be written in a certain way to be able to run on the cloud. That is nothing but a bogus, self-serving argument. For the cloud to be meaningful to enterprises, it should adapt to your applications, and not the other way around.

    Automated, Integrated Set of Cloud Management Capabilities: At the root of many of the problems plaguing enterprise IT today is complexity. A complex maze of tools and technology, coupled with archaic processes, results in an environment which is inflexible, inefficient and simply too hard to manage. Management tool consolidation, therefore, is key to the success of your cloud, as tool proliferation adds to complexity, encourages compartmentalization and defeats the very purpose that you are building the cloud for. Decision makers ought to be extra cautious about vendors trying to sell them a "suite" of disparate and loosely integrated products as a cloud solution. An effective enterprise cloud management solution needs to provide a tightly integrated set of capabilities for all aspects of cloud lifecycle management. A simple question to ask: will your environment be more or less complex after you implement your cloud? More often than not, the answer will surprise you.

    At Oracle, we have understood these challenges and have been working hard to create cloud solutions that are relevant and meaningful for enterprises. And we have been doing it for much longer than you may think. Oracle was one of the very first enterprise software companies to make our products available on the Amazon Cloud. As far back as 2007, we created new cloud solutions such as Cloud Database Backup that are helping customers like Amazon save millions every year. Our cloud solution portfolio is also the broadest and deepest in the industry, covering public, private and hybrid clouds across the infrastructure, platform and applications tiers. It is no coincidence, therefore, that the Oracle Cloud today offers the most comprehensive set of public cloud services in the industry. And to a large part, this has been made possible thanks to our years of investment in creating cloud-enabling technologies. I will dedicate the third and final part of the blog "Clouds, Clouds Everywhere But not a Drop of Rain" to Oracle Cloud Technologies Building Blocks and how they map into our vision of the Enterprise Cloud. Stay tuned.

    Read the article

  • AMD Unleashes Six-Core Desktop CPU

    Hardware Central: "AMD today announced the availability of a new six-core desktop processor and platform to accompany it, which includes a new chipset and support for hobbyists who like to tweak their processors to the limits of their heat sink and warranty."

    Read the article

  • Keeping local folders synchronized

    - by Earthling
    After repeatedly losing data on encrypted drives due to some trivial combination of software and hardware failure, I would like to know if there is a simple tool that keeps local folders synchronized. I'm looking for something like a local "cloud" service that runs on one computer and synchronizes any changes in one folder to the other folder as soon as both folders are available. That way I can keep a copy of the most important files on a different hard drive.
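    As a rough sketch of what such a tool does under the hood (my own illustration, not a specific product; the folder paths are hypothetical), a one-way mirror pass in C++17 that copies newer files from a source folder to a backup folder whenever both are present could look like this:

        #include <filesystem>
        #include <iostream>

        namespace fs = std::filesystem;

        // Copy every regular file from src to dst, keeping the relative layout,
        // overwriting only when the source copy is newer (a naive mirror pass).
        void mirror_once(const fs::path& src, const fs::path& dst) {
            if (!fs::exists(src) || !fs::exists(dst)) return;  // both folders must be available
            for (const auto& entry : fs::recursive_directory_iterator(src)) {
                if (!entry.is_regular_file()) continue;
                const fs::path rel    = fs::relative(entry.path(), src);
                const fs::path target = dst / rel;
                fs::create_directories(target.parent_path());
                if (!fs::exists(target) ||
                    fs::last_write_time(entry.path()) > fs::last_write_time(target)) {
                    fs::copy_file(entry.path(), target,
                                  fs::copy_options::overwrite_existing);
                }
            }
        }

        int main() {
            // Hypothetical folder locations; a real tool would also watch for
            // changes continuously instead of copying only when invoked.
            mirror_once("/home/user/important", "/media/backup-disk/important");
            std::cout << "mirror pass finished\n";
            return 0;
        }

    A dedicated synchronization tool adds conflict handling, deletion propagation and change watching on top of this basic copy loop, which is why a ready-made solution is usually preferable to rolling your own.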

    Read the article

  • Dual Monitor Lock Screen Problem

    - by Justin Carver
    Ubuntu 12.04, Nvidia GTX 550 Ti, x-swat Nvidia driver.
    Problem: I am using 2 monitors. When the screen locks, there is a blue box on top of the wallpaper and password dialog that hides the field for entering your password or switching users. The problem appears on 2 systems with similar hardware (Nvidia card, x-swat driver, dual monitors). You can still type your password blindly and hit Enter to log in, but it's irritating not to be able to see the dialog box.

    Read the article

  • Linux: Configure Xorg X11 Window System

    nixCraft: "My xorg.conf file is missing as I deleted it accidentally. Now Xorg tries to probe my hardware on every startup. How do I configure Xorg under Debian or any other Linux distro / operating system?"

    Read the article

  • Oracle Partner Trainings: December 2012 & January 2013

    - by A&C Redaktion
    In December and January there will once again be interesting Oracle trainings for partners. Here is an overview of the topics: 2 sales trainings on Oracle on Oracle, 3 online seminars on the Oracle Database, 2 technical trainings on the Oracle Database, 3 technical trainings on Oracle Fusion Middleware, and 1 online seminar on Oracle hardware. The respective dates, registration links and further information can be found here.

    Read the article

  • My sound is not working

    - by gkhan
    I am not able to hear any sound. Does anyone have a clue how to proceed in fixing this? Issuing aplay -l gives this:
      $ aplay -l
      **** List of PLAYBACK Hardware Devices ****
      card 0: NVidia [HDA NVidia], device 0: VT1708S Analog [VT1708S Analog]
        Subdevices: 2/2
        Subdevice #0: subdevice #0
        Subdevice #1: subdevice #1
      card 0: NVidia [HDA NVidia], device 1: VT1708S Digital [VT1708S Digital]
        Subdevices: 1/1
        Subdevice #0: subdevice #0

    Read the article

  • Intel to Unleash Atom-ic Power at Computex

    Hardware Central: "Intel plans to introduce a series of new Atom processors at the opening of the giant Computex show in Taipei this week, as well as offer a preview of a number of other offerings. But Atom will be the star of the show."

    Read the article

  • A GUI-based application on Linux works properly on some systems, but a segmentation fault (from a SIGSEGV signal) occurs on others. Why? [closed]

    - by Sreejith
    The application consists of driver code, a shared object file (.so), and application code that interacts with a hardware card. The problem comes from an mmap() call: it maps an address from the card, but on some systems it does not get the correct address. The error is that the process receives a SIGSEGV signal, followed by a segmentation fault. Yet other systems with the same kernel version do not face the problem at all and work properly. Can anyone suggest the reason for this problem and a remedy?
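    For reference, a hedged sketch of defensive mmap() usage in the spirit of the question (the device path, offset and length are hypothetical, since the original code is not shown): checking every return value and page-aligning the offset, because dereferencing an unchecked MAP_FAILED result or mapping a misaligned offset are common ways such code ends in SIGSEGV on some systems but not others.

        #include <cerrno>
        #include <cstdint>
        #include <cstdio>
        #include <cstring>
        #include <fcntl.h>
        #include <sys/mman.h>
        #include <unistd.h>

        int main() {
            // Hypothetical device node and register offset; substitute the card's real values.
            const char*  dev_path = "/dev/mydevice0";
            const off_t  bar_off  = 0x10000;
            const size_t map_len  = 4096;

            int fd = open(dev_path, O_RDWR | O_SYNC);
            if (fd < 0) {
                std::fprintf(stderr, "open failed: %s\n", std::strerror(errno));
                return 1;
            }

            // mmap requires a page-aligned offset; a misaligned value either fails
            // or maps the wrong region, and using the result then faults.
            long page = sysconf(_SC_PAGESIZE);
            off_t aligned_off = bar_off & ~static_cast<off_t>(page - 1);

            void* base = mmap(nullptr, map_len, PROT_READ | PROT_WRITE,
                              MAP_SHARED, fd, aligned_off);
            if (base == MAP_FAILED) {
                // Never dereference the result without this check: on systems where
                // the mapping is rejected, using it is an instant SIGSEGV.
                std::fprintf(stderr, "mmap failed: %s\n", std::strerror(errno));
                close(fd);
                return 1;
            }

            volatile std::uint32_t* reg =
                reinterpret_cast<volatile std::uint32_t*>(
                    static_cast<char*>(base) + (bar_off - aligned_off));
            std::printf("register value: 0x%08x\n", *reg);

            munmap(base, map_len);
            close(fd);
            return 0;
        }

    If the failure only appears on some machines, comparing the errno reported here (and the kernel log) across the working and failing systems is usually the quickest way to narrow down whether the driver, the mapping parameters, or the hardware itself is at fault.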

    Read the article
