Search Results

Search found 4224 results on 169 pages for 'dual gpu'.

Page 19 of 169

  • Can I create a disk partition for dual-boot Ubuntu on a Windows 7 machine without reinstalling Windows?

    - by EndangeringSpecies
    I want to set up dual-boot Ubuntu on my machine in a separate partition. Plus, ideally, I want a third partition for further OS experimentation. The hard drive is huge, hundreds of gigs, and essentially unfilled. The machine runs Windows 7 Home. Online I have seen mentions of creating partitions from inside Windows 7. But I have also heard claims that to create the partition to house Ubuntu, Windows has to be reinstalled, frying all the data on the machine. So, which of these claims is right? Can you create additional partitions for other OSes on a big Windows 7 hard drive without a reinstall?

    Read the article

  • Why Are We Still Using CPUs Instead of GPUs?

    - by Jason Fitzpatrick
    Increasingly, GPUs are being used for non-graphical tasks like risk computations, fluid dynamics calculations, and seismic analysis. What's to stop us from adopting GPU-driven devices? Today's Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven group of Q&A web sites.

    Read the article

  • How to partition a fully used hard drive

    - by MineCraftMan39
    I installed Ubuntu on my laptop, which had Fedora 13 installed as the OS when I got it. Now I want to install Fedora 18 in a dual boot with Ubuntu. The problem is that when I installed Ubuntu I didn't partition for dual boot; I gave the entire hard drive to Ubuntu and no longer have the space to dual boot. How can I shrink the Ubuntu partition to make space on the hard drive for Fedora? I want to split the hard drive 50/50 between the two. Thanks in advance.

    Read the article

  • GPU optimization question: pre-computed or procedural?

    - by Jay
    Good morning, I'm learning shader programming and need some general direction. I want to add noise to my laser beam (like this). What is the best way to handle it? I could pre-compute an image and pass it to the shader; I could then use the image to change the opacity and easily animate the smoke by changing the offset of the texture lookup. I could also generate noise in the shader and use it the same way the texture would be used. Is it generally better to avoid I/O to the graphics card, or the opposite? Thanks!
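
    As a hedged illustration of the "pre-compute an image" option above (plain C#; the NoiseTexture class, its parameters, and the value-noise scheme are stand-ins, not taken from the original post), the sketch below fills a byte buffer with smooth noise that could be uploaded once as a single-channel texture and scrolled in the shader:

        using System;

        static class NoiseTexture
        {
            // Illustrative only: builds a grayscale value-noise image as raw bytes.
            // A real project would upload this buffer as a texture (for example with
            // Texture2D.SetData in XNA) and animate the lookup offset per frame.
            public static byte[] GenerateValueNoise(int size, int cellSize, int seed)
            {
                Random rng = new Random(seed);
                int cells = size / cellSize + 2;

                // Random brightness at each lattice point.
                float[,] lattice = new float[cells, cells];
                for (int y = 0; y < cells; y++)
                    for (int x = 0; x < cells; x++)
                        lattice[x, y] = (float)rng.NextDouble();

                // Bilinearly interpolate the lattice for every pixel.
                byte[] pixels = new byte[size * size];
                for (int y = 0; y < size; y++)
                {
                    for (int x = 0; x < size; x++)
                    {
                        int cx = x / cellSize, cy = y / cellSize;
                        float fx = (x % cellSize) / (float)cellSize;
                        float fy = (y % cellSize) / (float)cellSize;

                        float top = Lerp(lattice[cx, cy], lattice[cx + 1, cy], fx);
                        float bottom = Lerp(lattice[cx, cy + 1], lattice[cx + 1, cy + 1], fx);
                        pixels[y * size + x] = (byte)(Lerp(top, bottom, fy) * 255f);
                    }
                }
                return pixels;
            }

            static float Lerp(float a, float b, float t)
            {
                return a + (b - a) * t;
            }
        }

    Sampling that texture with an animated offset is the low-ALU route; generating comparable noise per fragment avoids the one-time upload but costs shader math on every pixel, so which wins depends on the hardware and on how often the noise needs to change.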

    Read the article

  • Windows 7 and Ubuntu Boot issue

    - by user115137
    I had the idea to dual boot Win 7 and Ubuntu, and what I did was the following: I made a clean install of Win 7 using all of my hard drive, then I used the Ubuntu live CD and GParted to partition my drive as follows:

        /dev/sda1  ext4      20GB   (Linux root)
        /dev/sda2  ntfs     100GB   (Win7)
        /dev/sda3  ext4     350GB   (Home)
        /dev/sda4  extended   4GB   (swap)

    The thing is, when installing Ubuntu I deleted the partition Win 7 creates for its boot sector and recovery, then resized the drive to look like what I mentioned, and Ubuntu installed GRUB to the MBR. When GRUB boots I can see Ubuntu but not Windows. How can I chainload it? Or should I fix the Windows MBR with the Windows 7 installation disk and try to set up the dual boot from there? I don't really care which of the two bootloaders I end up using, I just want the dual boot to work out. Thanks

    Read the article

  • Dual monitor not working after an update

    - by Nimonika
    I did a package manager update yesterday and it turns out that my dual monitor setup has stopped working. I have poor vision so I really need to connect to a much bigger screen, but since yesterday, when I connect the screen to my laptop, the screen does not automatically reset itself to the laptop display. Even after lots of trial and error with the display settings, I am getting different displays on the laptop and the external screen, and right now only the big screen is active while the laptop has blanked out. Please can someone help me set up my dual screens properly for 11.10? Output of lspci -v | grep -i vga:

        00:02.0 VGA compatible controller: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 07) (prog-if 00 [VGA controller])

    Read the article

  • OS Isolation: Virtualization or Dual-Boot Duplication, a General How To?

    - by Mr_CryptoPrime
    I want to isolate my Windows 7 operating system, and I have looked into virtualization. This should work with Linux; however, I still want a way to run Windows 7 securely but without significant performance loss, which rules out virtualization for it. I know that you can dual boot because I currently do so with my XP/Linux system. Is there a way that I can duplicate my Windows 7 system so I can select one at bootup? This way I can ensure that each OS is isolated and not worry about performance loss. However, I am having a lot of trouble finding a solid method for OS duplication. Is this even possible, or must I buy two versions of Win7 and somehow install them separately? Any information regarding this would be helpful, thanks! Essentially I want: two instances of Win7 (not necessarily running simultaneously); each isolated from the other so that a security breach in one doesn't affect the other; and no performance loss in either from doing so.

    Read the article

  • How to copy VirtualBox VDI contents to a partition and dual boot the OS from it?

    - by Calmarius
    I'm a Linux user, but I keep a compressed Windows XP ISO with me on a pen drive for the case where I absolutely need Windows to do something. This works in VirtualBox most of the time. But now I want to play some games, so I would like to run the Windows image natively. My computer doesn't have a CD drive, so I cannot just burn the ISO and install normally. What I'm trying to do is move the installed Windows image to a physical NTFS partition on my HDD and set up GRUB to let me dual-boot it. I found many tutorials that deal with writing a VDI to a physical drive, but they assume I want to overwrite my entire drive. Moving the raw disk image with dd to the partition resulted in a corrupt partition. I also tried the VMDK trick to use that empty partition and install Windows on it. Although the text-mode phase of the installation finishes without problems, the VM won't work: it either crashes and keeps rebooting immediately, or it just freezes (depending on how I created the VMDK, with -rawdisk /dev/sda3 or -rawdisk /dev/sda -partition 3).

    Read the article

  • AMD Fusion GPU passthrough to KVM or Xen

    - by BigChief
    Has anyone successfully gotten passthrough working with the GPU portion of AMD's Fusion APUs (the E-350 is my target) on top of a Linux hypervisor? I.e., I want to dedicate the GPU to one VM only, excluding all other VMs as well as the host. I know PCI passthrough can work with patches / kernel rebuilds for Xen and KVM. However, since the GPU is on the same chip, I don't know if the host OS will see it as PCI. I know there are a number of tangential issues here, such as: poor Fusion drivers in Linux at the moment; unsuccessful patching efforts seem common; VT-d / IOMMU is required and (from my reading) is supported by the APU, but the motherboard may not offer it; and KVM doesn't appear to support primary graphics cards, only secondary graphics cards (described here). Still, I'd like to hear from anyone who has messed with this, even failed attempts. Fedora + KVM is my preferred virtualization platform, but I'm willing to change that if it makes a difference. EDIT: The goal is to do this for a Windows 7 guest (I know it's asking a lot). Regardless, just assume this is HVM, not PV.

    Read the article

  • Switching from Onboard Intel to Nvidia Dedicated GPU

    - by Anarkie
    How can I switch from Intel onboard graphics to the Nvidia dedicated GPU? When I go to the Windows screen resolution dialog I see Intel, and I can't change it. In Device Manager both adapters are listed and the Nvidia is recognized. I didn't see any option to set one as primary, so I disabled the Intel adapter: black screen! I rebooted and re-enabled Intel. I right-clicked on the desktop, chose "Nvidia Control Panel", and in the 3D options I set the game I want to play to high-performance Nvidia, but it didn't switch when I started the game. Then I set the preferred GPU in the global settings to high-performance Nvidia for everything, and it still didn't change. I understand that to save the battery etc. there is a switching option between these two, but I don't see this switch when it is necessary; can't I also switch manually? Is there a manual Fn switch key? I looked but couldn't find one. Why do I want to do this? 1) Better game performance. 2) I want to play an old game from 2002 (Diablo 2 LOD); when I start the game there are black bars on the sides, so the picture is just smaller, which I dislike! I heard this is Intel's way of centering the display, but instead I would like to scale or expand it to fit widescreen (fullscreen), which should be possible with Nvidia. My notebook specs: Fujitsu Lifebook AH531, Win7 64-bit, i5, Intel HD Graphics onboard, Nvidia GT 525. I didn't install the Nvidia card later; it was installed and ready from the moment I first turned on the computer. How I determined that the cards weren't switched while playing the game: I exited the game with the Windows key, looked at the screen resolution menu, still saw Intel, and the game still had black bars. I know the Intel GPU should be enough for Diablo 2, but I am interested in this answer for further games; I don't always play Diablo. What if I install an up-to-date game, for example? Then Intel will not be sufficient. I would like to learn the switch option.

    Read the article

  • How to make my laptop dual boot (Windows Server 2008 and Windows 7)?

    - by Dinesh
    I have Windows Server 2008 R2 Enterprise (licensed copy) installed on my laptop. I installed it to evaluate the latest 64-bit Microsoft products like SharePoint 2010 etc. Now I want to install Windows 7 Ultimate (original) without removing the server OS; basically, I want to make my laptop dual boot. I tried to install Windows 7, but it is not showing any option for dual booting. Can anybody please suggest a solution? I need to install it immediately.

    Read the article

  • How can I view an R32G32B32 texture?

    - by bobobobo
    I have a texture with R32G32B32 floats. I create this texture in-program in D3D11, using DXGI_FORMAT_R32G32B32_FLOAT. Now I need to see the texture data for debug purposes, but it will not save to anything but DDS, showing this error in the debug output: "Can't find matching WIC format, please save this file to a DDS". So I write it to DDS, but I can't open that either! The DirectX Texture Tool says "An error occurred trying to open that file". I know the texture is working because I can read it on the GPU and the colors seem correct. How can I view an R32G32B32 texture in an image viewer?
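
    One hedged workaround, sketched in plain C# (it assumes you have already copied the texture to a CPU-readable staging resource and pulled the tightly packed floats into an array; the TextureDump class, its name, and the simple clamp-to-[0,1] tone mapping are illustrative, not part of D3D11):

        using System;
        using System.Drawing;
        using System.Drawing.Imaging;

        static class TextureDump
        {
            // Converts tightly packed R32G32B32 float data to an 8-bit PNG so any
            // ordinary image viewer can open it. Values are clamped to [0, 1];
            // rescale first if the data is HDR.
            public static void SaveFloatRgbAsPng(float[] rgb, int width, int height, string path)
            {
                using (Bitmap bmp = new Bitmap(width, height, PixelFormat.Format24bppRgb))
                {
                    for (int y = 0; y < height; y++)
                    {
                        for (int x = 0; x < width; x++)
                        {
                            int i = (y * width + x) * 3;
                            bmp.SetPixel(x, y, Color.FromArgb(
                                ToByte(rgb[i + 0]),
                                ToByte(rgb[i + 1]),
                                ToByte(rgb[i + 2])));
                        }
                    }
                    bmp.Save(path, ImageFormat.Png);
                }
            }

            static int ToByte(float v)
            {
                return (int)(Math.Max(0f, Math.Min(1f, v)) * 255f);
            }
        }

    SetPixel is slow but fine for a one-off debug dump; note that a mapped staging copy usually has per-row padding, so strip the row pitch down to width * 3 floats before handing the array to this helper.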

    Read the article

  • Switching between Discrete and Integrated GPUs

    - by void-pointer
    Hello everyone, I develop CUDA applications on my Alienware M17x portable back-breaker, which has two discrete GTX 285M GPUs and one integrated GeForce 9400M GPU. I can currently switch between them using NVIDIA's software, but I would like the ability to do so within my applications for purposes of benchmarking and general convenience. Apparently this requires the "NDA version" of NVIDIA's Driver API, which I know not how to obtain. Would using this API be the only way to accomplish what I seek, and if so, how would I obtain it? A solution using Windows APIs would also be acceptable, though less preferable to one which would leverage a cross-platform API. I have created a similar thread concerning the matter on NVIDIA's forum, which is down at the time of this writing. Thanks for reading my question; it is much appreciated!

    Read the article

  • Increasing efficiency of N-Body gravity simulation

    - by Postman
    I'm making a space exploration type game; it will have many planets and other objects that will all have realistic gravity. I currently have a system in place that works, but if the number of planets goes above 70, the FPS decreases at a practically exponential rate. I'm making it in C# and XNA. My guess is that I should be able to do gravity calculations between 100 objects without this kind of strain, so clearly my method is not as efficient as it should be. I have two files, Gravity.cs and EntityEngine.cs. Gravity manages JUST the gravity calculations; EntityEngine creates an instance of Gravity and runs it, along with other entity-related methods.

    EntityEngine.cs (only the relevant piece of code, self-explanatory; when the instance of Gravity is made in EntityEngine, it passes itself (this) in, so that Gravity has access to entityEngine.Entities, a dictionary of all planet objects):

        public void Update()
        {
            foreach (KeyValuePair<string, Entity> e in Entities)
            {
                e.Value.Update();
            }
            gravity.Update();
        }

    Gravity.cs:

        namespace ExplorationEngine
        {
            public class Gravity
            {
                private EntityEngine entityEngine;
                private Vector2 Force;
                private Vector2 VecForce;
                private float distance;
                private float mult;

                public Gravity(EntityEngine e)
                {
                    entityEngine = e;
                }

                public void Update()
                {
                    // First loop
                    foreach (KeyValuePair<string, Entity> e in entityEngine.Entities)
                    {
                        // Reset the force vector
                        Force = new Vector2();

                        // Second loop
                        foreach (KeyValuePair<string, Entity> e2 in entityEngine.Entities)
                        {
                            // Make sure the second value is not the current value from the first loop
                            if (e2.Value != e.Value)
                            {
                                // Find the distance between the two objects. Because Fg = G * ((M1 * M2) / r^2),
                                // using Vector2.Distance() and then squaring it is pointless and inefficient,
                                // because Distance uses a sqrt and squaring the result simply cancels that sqrt.
                                distance = Vector2.DistanceSquared(e2.Value.Position, e.Value.Position);

                                // This makes sure that two planets do not attract each other if they are touching.
                                // It is completely unnecessary once I add collision; for now it just keeps the
                                // planets from being glitchy. Performance is not significantly improved by
                                // removing this IF.
                                if (Math.Sqrt(distance) > (e.Value.Texture.Width / 2 + e2.Value.Texture.Width / 2))
                                {
                                    // Calculate the magnitude of Fg (I'm using my own gravitational constant (G)
                                    // for the sake of time; I know it's 1 at the moment, but I've been changing it).
                                    mult = 1.0f * ((e.Value.Mass * e2.Value.Mass) / distance);

                                    // Calculate the direction of the force. Simply subtracting the positions and
                                    // normalizing works; this fixes diagonal vectors from having a larger value,
                                    // and basically makes VecForce a direction.
                                    VecForce = e2.Value.Position - e.Value.Position;
                                    VecForce.Normalize();

                                    // Add the vector for each planet in the second loop to a force var.
                                    Force = Vector2.Add(Force, VecForce * mult);
                                    // I have tried Force += VecForce * mult, and have not noticed much of an increase in speed.
                                }
                            }
                        }

                        // Add that force to the first loop's planet's position (later on I'll instead add to
                        // acceleration, to account for inertia).
                        e.Value.Position += Force;
                    }
                }
            }
        }

    I have used various tips (about gravity optimizing, not threading) from THIS question (that I made yesterday). I've made this gravity method (Gravity.Update) as efficient as I know how to make it, but this O(N^2) algorithm still seems to be eating up all of my CPU power. Here is a LINK (Google Drive; go to File download, keep the .exe with the content folder; you will need the XNA Framework 4.0 Redistributable if you don't already have it) to the current version of my game. Left click makes a planet, right click removes the last planet, the mouse moves the camera, and the scroll wheel zooms in and out. Watch the FPS and planet count to see what I mean about performance issues past 70 planets. (All 70 planets must be moving; I've had 100 stationary planets and only 5 or so moving ones while still having 300 FPS. The issue arises when 70+ are moving around.) After 70 planets are made, performance tanks exponentially. With fewer than 70 planets I get 330 FPS (I have it capped at 300). At 90 planets the FPS is about 2, and with more than that it sticks around 0 FPS. Strangely enough, when all planets are stationary the FPS climbs back up to around 300, but as soon as something moves it goes right back down to what it was; I have no systems in place to make this happen, it just does. I considered multithreading, but that previous question taught me a thing or two, and I see now that it's not a viable option. I've also thought maybe I could do the calculations on my GPU instead, though I don't think it should be necessary; I also do not know how to do this, it is not a simple concept, and I want to avoid it unless someone knows a really noob-friendly, simple way to do it that will work for an n-body gravity calculation. (I have an NVidia GTX 660.) Lastly, I've considered using a quadtree-type system (Barnes-Hut simulation). I've been told (in the previous question) that this is a good method that is commonly used, and it seems logical and straightforward, but the implementation is way over my head and I haven't found a good tutorial for C# yet that explains it in a way I can understand or uses code I can eventually figure out. So my question is this: how can I make my gravity method more efficient, allowing me to use more than 100 objects (I can render 1000 planets with a constant 300+ FPS without gravity calculations), and if I can't do much to improve performance (including some kind of quadtree system), could I use my GPU to do the calculations?
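
    As a hedged illustration of one incremental fix (not the asker's code: the Body class, its fields, and the List container below are stand-ins for the project's Entity dictionary), the sketch visits each unordered pair exactly once and applies equal and opposite forces, which halves the pair count and removes the extra square root hidden in Normalize:

        using System.Collections.Generic;
        using Microsoft.Xna.Framework; // Vector2

        // Illustrative stand-in for the project's Entity type.
        public class Body
        {
            public Vector2 Position;
            public Vector2 Force;
            public float Mass;
            public float Radius;
        }

        public static class PairwiseGravity
        {
            const float G = 1.0f; // same ad-hoc gravitational constant the question uses

            public static void Update(List<Body> bodies)
            {
                foreach (Body b in bodies)
                    b.Force = Vector2.Zero;

                // Each unordered pair (i, j) is visited once; Newton's third law
                // gives the opposite force for free.
                for (int i = 0; i < bodies.Count; i++)
                {
                    for (int j = i + 1; j < bodies.Count; j++)
                    {
                        Body a = bodies[i];
                        Body b = bodies[j];

                        Vector2 delta = b.Position - a.Position;
                        float distSq = delta.LengthSquared();

                        float minDist = a.Radius + b.Radius;
                        if (distSq <= minDist * minDist)
                            continue; // touching: skip, as in the original code

                        // F = G * m1 * m2 / r^2; the direction is delta / r,
                        // so the force vector is delta * (F / r).
                        float invDist = 1.0f / (float)System.Math.Sqrt(distSq);
                        float magnitude = G * a.Mass * b.Mass / distSq;
                        Vector2 force = delta * (magnitude * invDist);

                        a.Force += force;
                        b.Force -= force; // equal and opposite
                    }
                }

                foreach (Body b in bodies)
                    b.Position += b.Force; // the question adds force straight to position for now
            }
        }

    This only buys a constant factor, though; past a few hundred moving bodies, the Barnes-Hut quadtree the question mentions (or moving the loop to the GPU) is the usual next step, since it cuts the cost from O(N^2) to roughly O(N log N).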

    Read the article

  • Is there any guarantee about the graphical output of different GPUs in DirectX?

    - by cloudraven
    Let's say that I run the same game on two different computers with different GPUs, and, for example, both are certified for DirectX 10. Is there a guarantee that the output of a given program (game) is going to be the same regardless of the manufacturer or model of the GPU? I am assuming the configurable settings are exactly the same in both cases. I heard that this is not the case for DirectX 9 and older, but that it is true for DirectX 10. If someone could provide a source confirming or denying it, that would be great. Also, what guarantee is offered? Will the output be exactly the same, or just perceptually the same to the human eye?

    Read the article

  • How do I turn off nVidia high performance mode?

    - by gpen06
    I am new to Ubuntu 11.10 and never touched Linux until today. I installed Ubuntu alongside Windows 7. I have an issue with my laptop overheating and freezing when it uses the high-performance GPU graphics mode (hybrid card). It is an easy fix on Windows 7: I simply set the graphics mode manually to the low-performance integrated graphics (save power) mode using the advanced power settings. In Ubuntu, I'm stumped at how to do this. When using Ubuntu, my laptop shows the same symptoms as it did without the fix on Windows. I have the nVidia graphics driver installed for Windows, which allows me to see which mode I am in; I downloaded it off of my laptop's website (ASUS). They do not offer driver downloads for Linux. Screenshot of power settings in Windows 7

    Read the article

  • Problems booting Ubuntu 10.10 with Nvidia GeForce 6600GT

    - by SlyrNemesis
    I am quite new to Ubuntu and I have already stumbled upon a problem. I am running Ubuntu 10.10 Maverick Meerkat now. The install went fine, but when trying to boot, the screen went black and I got this message: "gpu lockup - switching to software fbcon." When setting the display to onboard VGA through the BIOS, Ubuntu has no problem at all, but when I switch back to my AGP card (which is an Nvidia GeForce 6600GT) I get that message again. I do not use VGA for that card; I use my DVI cable for it. Does anyone have an answer for me to make this work? Thank you

    Read the article

  • Dual monitors won't accept new settings

    - by mschulze
    I'm trying to use dual monitors, but every time I try to change the display settings (the external monitor is on the right of the laptop, not the left, etc.) my screens go black; then my laptop screen comes back "normally" but my external monitor just flashes different shades of red or black. I cannot interact with anything on my laptop screen even though I can move my mouse around. Even when the monitors attempt to revert, the external monitor continues to flicker instead of reverting. The only way to stop it is to hard-boot, but the new settings aren't saved. Also, when the dual monitors are both working, my laptop screen has two vertical black bars along its sides. It's like Natty just decided I couldn't use the outer inch and a half of my laptop screen. My external monitor doesn't have this problem, and my laptop uses its full screen size when the external monitor is not hooked up. Does anyone know what I'm doing wrong? I have an HP Pavillion dv7 and an HP w2338h external monitor. Thanks!

    Read the article

  • Problems with Dual Boot

    - by user104108
    A few months ago I decided to install Ubuntu 12.04 on my PC alongside my Windows 7 partition. In order to do that and avoid any mistake, I followed the steps of this tutorial: http://www.linuxbsdos.com/2012/05/17/how-to-dual-boot-ubuntu-12-04-and-windows-7/2/ Everything was going well until I decided to update to the 12.10 release. I don't know what happened, but after I updated my Ubuntu it stopped working; it didn't even launch. When I turned on my PC and chose to run "Ubuntu 12.04" on the GRUB screen, a weird message appeared. So I decided to install Ubuntu 12.10 and forget about the 12.04 partition, no problem. I erased the partitions used for Ubuntu 12.04 with EaseUS Partition Manager. However, when I start my PC, there is still the option of "Ubuntu 12.04" to choose; is that bad? And what about now: can I use the Windows installer for Ubuntu ( http://www.ubuntu.com/download/help/install-ubuntu-with-windows ) to install Ubuntu 12.10? What should I do to have Ubuntu 12.10 and Windows 7 in dual boot again? Thanks; Thales.

    Read the article

  • Nvidia dual monitor configuration gets lost every time I reboot

    - by sunwukung
    I've recently updated (well, borked then completely reinstalled) to 12.04. I'm running a dual monitor setup, with a Dell U2410 / Dell 2007WFP combination on an HP EliteBook 8560W. The graphics card is an NVIDIA GF108 [Quadro 1000M]. My problem is as follows: I can get the dual monitor setup working fine, but every time I reboot, my machine appears to lose the settings (specifically, the U2410 is disabled and the mouse pointer is locked in the launcher). I have to reapply the settings after every launch. I've tried running nvidia-settings as sudo and I've saved the changes to my xorg.conf file (see below), but nothing seems to be sticking. Has anyone had similar issues, or does anyone know of a fix? Conf file follows:

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 295.33 (buildd@allspice) Fri Mar 30 15:25:24 UTC 2012

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "DELL 2007WFP"
            HorizSync 30.0 - 83.0
            VertRefresh 56.0 - 76.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "Quadro 1000M"
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinView" "1"
            Option "TwinViewXineramaInfoOrder" "DFP-1"
            Option "metamodes" "CRT: 1680x1050 +1920+0, DFP-1: 1920x1200 +0+0; CRT: nvidia-auto-select +0+0, DFP-1: NULL"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    The error message I'm getting is this:

        none of the selected modes were compatible with the possible modes:
        Trying modes for CRTC 642:
        CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 0)
        CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 0)
        CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 0)
        CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 1)
        CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 1)
        CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 1)

    Read the article

  • Forking GPL dual-licensed software with business-owned copyrights

    - by Eric
    After receiving threats from the copyright holder of a dual-licensed piece of software (GPL2 and commercial) demanding that I buy the commercial version for projects in production, I am thinking of making a fork. In the case of software dual-licensed under GPL2 and a commercial license, with business-owned copyrights, is forking the GPL2 version an option? Also, is forking a good way to deal with such cases? Background information: the software is a web CMS released in two versions, a free open source GPL2 edition and a commercial edition including technical support and extra functionality. The problem is that now, basing its argument on the "distribution" definition of the GPL2, the company holding the copyrights argues that delivering the software and some extensions to a client is considered a "distribution", and that such a "distribution" falls under the GPL2 obligation to release the custom-made extension code. The custom-made extensions are mainly designs, templates, and very specific functionality. Basically they give me three choices: buying the commercially licensed edition for GPL-based projects in production; deleting all the projects in production based on the GPL2 version; or releasing all the extensions as GPL2 code. The first two options are not realistic for finished projects. The third option could be fine, but as most of the extensions are very specific, cleaning up the code to make it usable by other users means a lot of work, and I am also not sure the clients will appreciate having their website designs and specific functionality released publicly. The copyright-holding company even contacted some clients directly, giving them the "choice". I know that this is a very corporate interpretation of the GPL2, and such an action is nothing close to legal, but as an independent developer I don't want to risk getting involved in long and tiring legal procedures. P.S. This question was first asked on Stack Overflow, where it fell out of scope and was closed; after reading the present site's FAQ, discussing software licensing seems fine here.

    Read the article

  • Dual Monitor in Ubuntu 11.10 is resetting the theme

    - by Mengu
    I'm experiencing a strange problem with dual monitors. When I set up the dual monitor via nvidia-settings and save the setting to xorg.conf, the default Unity theme and icons revert to the GTK default. I also get an error telling me "could not apply the stored configuration for monitors". Here is my xorg.conf:

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 280.13 (buildd@yellow) Fri Aug 5 12:31:28 UTC 2011

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "Chi Mei Optoelectronics corp."
            HorizSync 30.0 - 75.0
            VertRefresh 60.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce GT 540M"
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinView" "1"
            Option "TwinViewXineramaInfoOrder" "DFP-0"
            Option "metamodes" "DFP: nvidia-auto-select +0+0, CRT: 1680x1050 +1920+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    Here is an example of what I'm talking about: http://i.stack.imgur.com/vrlW1.png How can I fix this? Thanks in advance.

    Read the article

  • T-Mobile G1 (MSM7200) GPU Memory

    - by Reflog
    Hello. I'm trying to find some information regarding the available GPU memory (for OpenGL) on the T-Mobile G1. This phone has an MSM7200 Qualcomm chip inside with an ATI Imageon GPU. Unfortunately I am not able to dig up any info regarding the specifics of GPU memory usage. How much memory is available in total for textures? Is the memory shared with the CPU memory? Thanks in advance, Eli

    Read the article

  • Dual nVidia GPUs (3 monitors) not working in 11.10

    - by jasonmclose
    After searching, I have not found a solution. I have two Nvidia Quadro 295 cards with three monitors, but I cannot extend TwinView across multiple GPUs. I have the most recent Nvidia proprietary drivers installed, and they work fine for a single GPU with dual monitors. I tried using Xinerama, but without success. I don't mind switching to the nouveau drivers if that would handle my multiple monitors, although I would like to continue to use Unity and Compiz if I can.

    Read the article

  • Dedicated GPU in Dell PowerEdge C1100

    - by Eli Gundry
    We recently purchased a Dell PowerEdge C1100 off lease with the intention of using it for graphics processing. We installed an AMD HD 7000 series GPU in it that runs off of board power, and it sends video to the display. That said, the video is very choppy, leading us to believe that the onboard video is doing all the processing and sending it to the card. Is there any way to either disable the onboard VGA on this server or tell the OS to use only the dedicated card? More info: the server is running RHEL 6.4; the graphics card is running the proprietary AMD drivers; the video card only works in the OS and does not show the BIOS on boot (we know that it's impossible to change this). Any ideas, guys? Update: We are now thinking that the GPU is doing the graphics processing but not working at the full speed of the PCI bus, which is odd, because it is an x16 slot, but it is probably optimized for a RAID card (if that makes any sense). Is there any way to remedy the choppy graphics on this server?

    Read the article
