Search Results

Search found 1980 results on 80 pages for 'nvidia nforce'.


  • Enabling hardware acceleration and Xinerama for multi-monitor/multi-GPU in Linux

    - by mynameiscoffey
    My current setup is three monitors connected as follows (listed from left to right):

    GPU0 (nVidia GTX 280):
    - Dell 2405FPW (1920x1200)
    - Dell U2410 (1920x1200)

    GPU1 (nVidia 210):
    - Dell 2405FPW (1920x1200)

    This works like a charm in Windows 7, not so much in Linux. I seem to have only three real options:

    1. Run all three monitors as separate X screens. I get hardware acceleration, but as they are all independent X sessions I cannot move windows between them and can only have Firefox open on one at any given time.
    2. Run the two on GPU0 in TwinView mode and have GPU1 as a separate X screen. Same limitation as 1, but at least two monitors work together OK. I did occasionally have an issue where Linux saw both monitors on GPU0 as a single large monitor, however.
    3. Enable Xinerama and have everything work as I want it to, but hardware acceleration is gone and the display is Windows 95 style choppy.

    My ideal solution would be to have all screens working as they do under Xinerama without the limitation of having hardware acceleration disabled. I don't even care if that means rendering all three on GPU0 and somehow farming out the display of the third monitor to GPU1; whatever works. My question is this: is there any way to accomplish this? I don't feel my use case is so unusual that there shouldn't be at least some form of support (beyond the three limited options above), or is my best option just to suck it up and pick up a better card to replace both, one that can handle three outputs by itself?
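
    For reference, a minimal xorg.conf sketch of the Xinerama layout described in option 3. The identifiers, BusIDs and monitor positions are hypothetical and would have to match the actual hardware (lspci shows the PCI addresses); this is only an illustration of the structure, not a verified configuration for these exact cards:

        Section "ServerLayout"
            Identifier "ThreeHeads"
            Screen 0 "Screen-GTX280" 0 0
            Screen 1 "Screen-210" RightOf "Screen-GTX280"
            Option "Xinerama" "1"
        EndSection

        Section "Device"
            Identifier "GTX280"
            Driver     "nvidia"
            BusID      "PCI:1:0:0"
        EndSection

        Section "Device"
            Identifier "GF210"
            Driver     "nvidia"
            BusID      "PCI:2:0:0"
        EndSection

        Section "Screen"
            Identifier "Screen-GTX280"
            Device     "GTX280"
            Option     "TwinView" "1"
        EndSection

        Section "Screen"
            Identifier "Screen-210"
            Device     "GF210"
        EndSection

    As the poster observes, hardware acceleration is typically lost in this Xinerama configuration, so a layout like this reproduces the limitation rather than fixing it.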

    Read the article

  • Screen occasionally flashes black when under load, sometimes does not recover

    - by Oak
    I've built a brand new machine, but to my horror my monitor occasionally flashes black for around a second before returning to normal. This happens under load (watching videos / playing games), but only sometimes; e.g. it doesn't occur in "Batman: Arkham City" but does in "XCOM: Enemy Unknown". When watching videos it also occurs when not watching them full-screen, and it sometimes even occurs when the machine isn't doing anything, just sitting at the desktop while I move the mouse around. Has anyone run into this problem and does anyone know of a solution? Additionally, sometimes after the black screen the display won't return to normal, instead turning completely corrupt. In these cases even quitting the application doesn't help, but physically disconnecting and reconnecting the monitor fixes the problem. This problem did not occur on my earlier machine, which used the same physical monitor. Additional details: Windows Server 2012, configured as Windows 8, with the latest updates installed; NVIDIA GeForce GTX 660 Ti with the latest driver installed; ample CPU and RAM for playing the above games and for watching videos. I've read about similar problems elsewhere but could not find a working solution: http://www.youtube.com/watch?v=Zt00C-HXFbA&noredirect=1 http://www.sevenforums.com/hardware-devices/59126-monitor-flashing-black.html https://eu.battle.net/d3/en/forum/topic/4079098908?page=4 http://www.tomshardware.com/forum/347422-33-screen-flickering-black-nvidia-driver-update

    Read the article

  • Configure a MacBook Pro to use an external monitor at boot (Debian Linux)

    - by Eric
    In the spirit of reuse, I've installed Debian (version 6.0.5 "squeeze") on my wife's old MacBook Pro (circa 2009 or so) to repurpose it for various tasks. The catch is that the display is flaky. It lasts a random amount of time, between 2 minutes and 2 hours, before freezing and graying out. This is a known issue with that generation of MBP. Fortunately it's no problem for me, as I plan to use it with an external monitor anyway. Which brings us to the problem: how do I configure this thing to output to the external display by default, and hopefully disable the built-in LCD? The ideal solution would be to modify a setting in the EFI (BIOS), but I'm not holding out much hope for that. Next best thing would be a kernel option I can pass to the NVIDIA driver. What won't work is a solution that doesn't give me a display until X starts. I need to have console access, especially given that the built-in LCD is dying and any day now might give out completely. So far I haven't been able to find anything online. lspci says I've got an NVIDIA GeForce 9400M. Help is much appreciated! Eric. PS: if this question is better suited to the Unix & Linux area, please advise and I will move it.
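
    One hedged possibility, assuming the console is driven by the nouveau KMS driver rather than the proprietary NVIDIA module: disable the internal panel and force the external output on the kernel command line. The connector names below are only examples; the real ones are listed under /sys/class/drm/.

        # /etc/default/grub (sketch; connector names must match /sys/class/drm/card0-*)
        GRUB_CMDLINE_LINUX_DEFAULT="quiet video=LVDS-1:d video=DVI-I-1:e"

        # then regenerate the GRUB config
        sudo update-grub

    Since the proprietary NVIDIA driver does not provide a KMS console, this sketch only covers the pre-X console that the poster says is the hard requirement.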

    Read the article

  • Monitor flickers at native resolution

    - by ptikobj
    With my new Samsung SyncMaster BX2450 I have the following problem: in Windows XP (SP2), all resolutions above 1440x900 show either strange pixel errors or extreme flickering. The effect seems to worsen at higher resolutions. In particular, I would like to run the monitor at its native resolution (1920x1080), but I can't look at it for more than 5 seconds because of the flickering... My graphics card is a GeForce FX 5200 with the most up-to-date driver (according to Nvidia.com: ForceWare 175.19), and the monitor is connected to its DVI output. The strange thing is that under Ubuntu 10.04 all resolutions work just perfectly, so the display itself must be alright. Edit: it seems to be a driver problem... if I use the proprietary NVIDIA drivers in Ubuntu, I have the same problem as in Windows. So I would like to reformulate my question: is there a modified/alternative GeForce FX 5200 driver for Windows (as there is in Ubuntu) that allows me to use 1920x1080 without problems? I already tried the Omega drivers; unfortunately, it still looks poor at the native resolution.

    Read the article

  • Computer randomly restarting, both in game and out of game

    - by eric
    First, my specs:

    - AMD Phenom II X4 955 processor, 3.2 GHz
    - 20 GB DDR3 RAM
    - 4 GB Nvidia GeForce GTX 770
    - 850 W Corsair TX850W PSU
    - Gigabyte UD3 motherboard
    - Windows 7 Professional

    I recently upgraded my video card to the GTX 770 and upgraded my PSU to the 850 W unit that's in it now. I did a reformat with the installation of the new GPU and PSU, started fresh, and only have a couple of programs installed (Diablo 3, Nvidia Control Panel, WoW, and Steam). All drivers are up to date and everything is hooked up correctly. The problem is that it will randomly shut down. No blue screen; it just turns itself straight off and reboots after a couple of seconds. Occasionally I have to unplug the power cable from the PSU for a few minutes, then reconnect it, and it will start up. It seems pretty random: sometimes it happens when the PC is just sitting at the home screen, sometimes during games, and sometimes it doesn't happen for days at a time. I noticed the PSU felt hot, so I put an extra fan blowing straight onto both the PSU and the GPU, and neither feels overly hot after it shuts down now. Could it just be a PSU problem? The PSU was taken from another machine but wasn't having this problem in that machine. I have seen a few articles online about the GTX 770 doing the same thing, but I haven't found any answers or solutions. Any help will be appreciated. I'm sure the 850 W is enough to power my machine; I'm just stumped and have run out of ideas to fix it. I have even exchanged the video card for another, thinking it might have been an issue with that particular card, but I'm still getting the same problem.

    Read the article

  • Hyper-V Blue Screens with Nvidia GeForce 8400 GS Graphics Card

    - by Mahmoud Saleh
    I am using Windows Server 2008 R2 Enterprise x64. After installing the Hyper-V role and restarting the machine, I get a blue screen error and an immediate reboot. I have Googled the issue and tracked it down to the graphics card, so I uninstalled it, and then Windows loads fine. However, after installing the graphics driver again, the Blue Screen returns. The graphics card is an Nvidia GeForce 8400 GS. Does anyone know how I can resolve this issue?

    Read the article

  • Intel and Nvidia sign an agreement to share some of their technologies, putting an end to a two-year-old lawsuit

    Intel and Nvidia sign an agreement to share some of their technologies, putting an end to a two-year-old lawsuit. Intel has just agreed to pay Nvidia $1.5 billion in an out-of-court settlement. Why? To close a dispute that began in February 2009 with a complaint filed by Intel against Nvidia, claiming that its competitor did not hold the licence required to manufacture motherboard chipsets for its latest processors. The case continued with a counter-suit from Nvidia, which withdrew Intel's access to some of its graphics-processor patents while alleging breach of contract. All of this had, of course, escalated by way of ...

    Read the article

  • Problem on 12.10 64-bit with both the open-source and proprietary NVIDIA graphics drivers: can't change resolution in 3D games

    - by digitalcrow
    I have a problem with both the open-source and the proprietary NVIDIA graphics drivers: I can't change resolution in full-screen 3D games. There is also a bug where no kernel sources are installed, so Jockey can't install the proprietary drivers. I tried to install the proprietary driver and couldn't, and on top of that I couldn't log in to Ubuntu: it showed only the desktop wallpaper, no Dash, nothing. I then installed the kernel sources, blacklisted and removed the Nouveau drivers, and installed the proprietary drivers, I hope successfully. The problem remains the same: I can't change resolution in full-screen 3D games. Here is what I got when a 3D game exited while I tried to change resolution:

        X Error of failed request: BadValue (integer parameter out of range for operation)
        Major opcode of failed request: 150 (XFree86-VidModeExtension)
        Minor opcode of failed request: 10 (XF86VidModeSwitchToMode)
        Value in failed request: 0x25b
        Serial number of failed request: 497
        Current serial number in output stream: 499

    Some more details about my system: an NVIDIA GeForce GTS 250, a 3.4 GHz quad-core AMD Phenom II, and 8 GB of RAM. The output of

        sudo lshw -C display; lsb_release -a; uname -a

    is the following:

        *-display
            description: VGA compatible controller
            product: G92 [GeForce GTS 250]
            vendor: NVIDIA Corporation
            physical id: 0
            bus info: pci@0000:01:00.0
            version: a2
            width: 64 bits
            clock: 33MHz
            capabilities: pm msi pciexpress vga_controller bus_master cap_list rom
            configuration: driver=nvidia latency=0
            resources: irq:18 memory:fa000000-faffffff memory:d0000000-dfffffff memory:f8000000-f9ffffff ioport:ef00(size=128) memory:fb000000-fb01ffff
        No LSB modules are available.
        Distributor ID: Ubuntu
        Description:    Ubuntu 12.10
        Release:        12.10
        Codename:       quantal
        Linux darkpc 3.5.0-17-generic #28-Ubuntu SMP Tue Oct 9 19:31:23 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

    I like Ubuntu's style and user interface, but I hate the poor quality of the work they do.
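
    The BadValue error on XF86VidModeSwitchToMode generally means the game asked for a video mode the running X server does not expose. A hedged sketch of how to check and, with the open-source driver, add a missing mode via xrandr; the output name DVI-I-1 and the 1280x720 timings are only examples and must match the real output name from xrandr and the mode the game wants:

        # list the modes the server currently exposes
        xrandr -q

        # generate CVT timings and register the mode (example: 1280x720 at 60 Hz)
        cvt 1280 720 60
        xrandr --newmode "1280x720_60.00" 74.50 1280 1344 1472 1664 720 723 728 748 -hsync +vsync
        xrandr --addmode DVI-I-1 "1280x720_60.00"
        xrandr --output DVI-I-1 --mode "1280x720_60.00"

    With the proprietary NVIDIA driver, the modes games can switch to via XF86VidMode correspond to the configured MetaModes, so the equivalent step there would be adding the desired resolutions as MetaModes (for example through nvidia-settings) rather than via xrandr.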

    Read the article

  • NVIDIA Parallel Nsight available free of charge for Visual Studio, to harness GPUs for complex computations

    NVIDIA Parallel Nsight available free of charge for Visual Studio, to harness graphics cards for complex computations. NVIDIA has just made version 1.51 of its Parallel Nsight tool available to developers using Visual Studio. Released by NVIDIA last August, Parallel Nsight makes it possible to develop CUDA C/C++ or DirectCompute applications that tap the computing power of graphics cards. The new version of Parallel Nsight Professional Edition can be used with Visual Studio 2008 or 2010. To make installation simpler, NVIDIA has also removed the licence key and ...

    Read the article

  • CGMiner "configure: error: No mining configured in"

    - by Jorma
    Nvidia GT 630, CUDA 5.5; the CUDA examples run fine, but CGMiner does not build. Should CGMiner work, or are there limitations to it?

        sudo ./autogen.sh --disable-cpumining --enable-opencl && make

        Configuration Options Summary:
          libcurl(GBT+getwork).: Enabled: -lcurl
          curses.TUI...........: FOUND: -lncurses
          Avalon.ASICs.........: Disabled
          BlackArrow.ASICs.....: Disabled
          BFL.ASICs............: Disabled
          BitForce.FPGAs.......: Disabled
          BitFury.ASICs........: Disabled
          Hashfast.ASICs.......: Disabled
          Icarus.ASICs/FPGAs...: Disabled
          Klondike.ASICs.......: Disabled
          KnC.ASICs............: Disabled
          ModMiner.FPGAs.......: Disabled

        configure: error: No mining configured in
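
    A hedged note on that error: in recent CGMiner releases GPU/OpenCL mining was removed, so --enable-opencl no longer enables any device and configure aborts because every driver in the summary is "Disabled". Two sketches of ways around it (the repository URL and the 3.7.2 tag are assumptions about the usual upstream source, not verified against this exact checkout):

        # Option 1: configure for a device current CGMiner still supports,
        # e.g. BitForce FPGAs, so at least one driver is enabled
        ./autogen.sh --disable-cpumining --enable-bitforce && make

        # Option 2: build the last GPU-capable release line instead
        # (assumes the source came from github.com/ckolivas/cgminer)
        git checkout v3.7.2
        ./autogen.sh --disable-cpumining && make   # OpenCL/GPU support is auto-detected in that branch

    Either way, OpenCL mining also needs the OpenCL headers and the NVIDIA ICD installed before configure will pick the GPU up.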

    Read the article

  • Hardware for 4 Monitors

    - by Simon
    Looking to build several systems to output to 4 monitors over DVI. I notice most of the recent Nvidia and ATI/AMD cards have dual-DVI ports. Can I simply install two of these cards to get four monitors - or are only some cards capable of running side-by-side with another? How can I tell before purchasing? Simon.

    Read the article

  • How can I set the screen resolution to 1920x1080 in Windows Developer Preview

    - by Eugenio Miró
    I have a new monitor which I'm sharing between two computers: a laptop with Windows 7 (Win7 from now on) and a home-assembled desktop with Windows Developer Preview (WinDev from now on). Win7 recognizes the monitor's resolution of 1920x1080, and many others. WinDev doesn't; the best resolution I get on that OS is 1600x1200.

    Win7 has:
    - Mobile Intel 965 Express video
    - Generic PnP Monitor as monitor #2

    WinDev has:
    - MSI NVidia 9600GT with 1 GB RAM video
    - Generic Non-PnP Monitor

    I'm using a Belkin Flip USB to share the monitor between the boxes.

    Read the article

  • Cheapest dedicated PhysX card?

    - by davr
    Say I have a nice powerful ATI Radeon GPU but I would like to have PhysX support as well. What is the cheapest NVidia GPU I can buy to add on to my system that will support PhysX? (I already know about the driver hacks required to get both cards running at once)

    Read the article

  • Better graphics or better CPU on a budget laptop?

    - by jones
    Which would have better overall performance in a cheap (~$600) laptop: an Intel Atom 330 with Nvidia Ion, or an Intel Pentium/Celeron with Intel graphics? I don't need 8-hour battery life, and will hopefully be using this for programming, web browsing and occasionally light gaming.

    Read the article

  • How to configure multiple video cards in linux?

    - by Jader Dias
    In Ubuntu Lucid Lynx RC, I got NVidia's TwinView to work with 2 monitors on a single video card. But when I use the same monitors split between the two video cards, I can't make TwinView work and it starts an X server for each monitor. I want the same effect I had with one video card.
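
    TwinView only spans outputs on a single GPU, so spanning monitors across two cards with the NVIDIA driver generally means one X screen per GPU joined with Xinerama. A hedged sketch using nvidia-xconfig, assuming the proprietary driver and its nvidia-xconfig tool are installed (it rewrites /etc/X11/xorg.conf, so keep a backup; and a Xinerama desktop typically loses compositing/3D acceleration, as described in the multi-GPU question earlier on this page):

        # back up the current config, then generate one X screen per GPU and join them with Xinerama
        sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.bak
        sudo nvidia-xconfig --enable-all-gpus --xinerama

    Log out and back in (restarting X) for the new layout to take effect.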

    Read the article

  • Video memory bus width vs. video memory bandwidth

    - by Mixxiphoid
    My current video card (9600GT) is dying and I'm searching for a new one. Between acquiring my current card and now, I have learned a lot more about hardware, and I want to use that knowledge to pick the new card. So I decided not to just buy some popular card blindly, but to look for a card able to handle my requirements.

    I searched the specs on the NVidia site for the GT 640 and was confused by the memory section, and some questions came up. My current card's memory bus width is 256-bit and it has 1 GB of memory. I asked Google about the importance of bus width, and all the links basically said the same thing: "the higher the number, the more traffic can potentially be transferred simultaneously". This was already clear to me, yet there are currently a lot of new cards that are considered better than my current one while having a lower bus width.

    To go into more detail, here is the memory info I copied from the NVidia site:

                                    GT 640 (DDR3)    GT 640 (GDDR5)
        Memory Clock                1.8 Gbps         5.0 Gbps
        Standard Memory Config      2048 MB          1024 MB
        Memory Interface            DDR3             GDDR5
        Memory Interface Width      128-bit          64-bit
        Memory Bandwidth (GB/sec)   28.5             40.0

    What puzzled me is that the memory bandwidth seems to me the most important figure, yet the variant with the lower bus width has the higher 'performance'. Is this because the GDDR5 memory interface allows a higher memory clock speed (5 Gbps)? If I am to buy a new video card, should I check the bus width? The memory clock? The bandwidth? The amount of memory? My current card has 1 GB of memory, so I was searching for a 2 GB card, but now I'm not so sure any more whether that is really 'better'.

    My main question: to me it seems that memory performance is made up of the combination of bus width and frequency. Is this true? If yes, why are there so many sites telling me I need to get a card with a high bus width? If no, then what IS important when it comes to memory performance on a video card?

    NOTE: the memory bandwidth is (almost) never displayed on vendor sites. How can I determine which card is better without knowing the bandwidth?
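
    For what it's worth, the numbers in that table are consistent with a simple back-of-the-envelope relation (an illustration, not a formula taken from the spec page): bandwidth is roughly the interface width in bytes times the effective data rate per pin.

        128 bit / 8 = 16 bytes per transfer; 16 B x 1.8 GT/s = 28.8 GB/s  (listed: 28.5)
         64 bit / 8 =  8 bytes per transfer;  8 B x 5.0 GT/s = 40.0 GB/s  (listed: 40.0)

    So a narrower bus clocked faster can indeed deliver more total bandwidth, which is why the bandwidth figure, rather than the bus width on its own, is the number worth comparing.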

    Read the article

  • Workstation Card

    - by david
    I am going to buy an HP EliteBook 8740w. The problem is that it comes with either an Nvidia Quadro 2800M or an ATI 7820M FirePro 3D; please advise which is better.

    Read the article

  • Forcedeth - too many iterations (6) in nv_nic_irq

    - by RyanC
    Hey, I'm having trouble with an onboard nvidia gigabit NIC: under heavy load on the network, this error gets logged: "too many iterations (6) in nv_nic_irq". I'm running Hadoop DFS over these NICs and I see checksum errors build up until the whole thing just fails. I'm running the 2.6.26-2-amd64 kernel, and my initial research seems to imply it's a problem with the forcedeth driver. Has anyone run into this problem before? Thanks in advance if anyone can help! Ryan
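
    One commonly suggested mitigation, offered here as an assumption based on the forcedeth module's documented parameters rather than something verified on this exact kernel: raise the per-interrupt work limit that the "too many iterations" message is complaining about, then reload the driver.

        # /etc/modprobe.d/forcedeth.conf (sketch)
        options forcedeth max_interrupt_work=20

        # reload the driver (this drops the link briefly) or reboot
        sudo modprobe -r forcedeth && sudo modprobe forcedeth

    If the checksum errors persist, turning off the NIC's checksum/segmentation offloads with ethtool is another common experiment when forcedeth misbehaves under load.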

    Read the article

  • Looking for an Ubuntu 10.10 driver for the GeForce GT 425M GPU

    - by Fantomas
    It came with my Sony VAIO VPCF133FX/H 16.4" notebook. Ubuntu does suggest an NVIDIA driver for me, but when I install it, I cannot boot back in normally. I have to boot into failsafe mode, reset the graphics settings to default, and reboot again. Right now I am stuck in 800x600 mode, but I would like to do better and take advantage of my 1 GB of graphics memory :(. Please let me know if you have questions.

    Read the article
