Search Results

Search found 955 results on 39 pages for 'gpu acceleration'.

Page 7 of 39

  • How to have many Ubuntu workstations centrally managed?

    - by Richard Zak
    I have about a dozen Alienware workstations that are used for CUDA development and for execution of MPI jobs. What is the best way to manage them? I'd like to have something like an apt-get but for several systems, and a way to reimage a system simply and centrally. It seems that a combination of Landscape and Canonical's MAAS would be a good fit, but I need an open source solution. Any thoughts?

  • HLSL - Creating Shadows in 2D

    - by richard
    I create shadows using the following technique: http://www.catalinzima.com/2010/07/my-technique-for-the-shader-based-dynamic-2d-shadows/ But I have questions about HLSL. The way I currently do it is: I have a black and white image, where black means 'object' and white means 'nothing'. I then distort the image as in the tutorial. I do this with a pixel shader, but instead of rendering to the screen, I render to a texture and read it back into my application. I then take this, create the shadows, and send it back to the graphics card to undo the distortion after the shadow has been added; this comes back and I have a stencil of the shadow. I can put this on top of the original image and send them back to the graphics card, which then puts them on the screen. To me this is a lot of back and forth. Is there a way I can avoid this? The problem I am having is that I basically need to go through all positions in the texture 3 times, using the new texture each time instead of the original one. I tried to read up on passes, but I don't think I am heading in the right direction there. Help?
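
    A minimal sketch of the usual fix for all that back and forth: keep every pass on the GPU by ping-ponging between two render targets, so nothing is read back to the application until the final composite. The question is about XNA/HLSL, but the idea is shown here in OpenGL/C++; the function names, the three shader program handles and the drawFullscreenQuad helper are assumptions for illustration, and a GL context with GLEW initialized is presumed:

        // Two textures and two framebuffers used as alternating source/target.
        #include <GL/glew.h>

        GLuint tex[2], fbo[2];

        void createPingPongTargets(int w, int h) {
            glGenTextures(2, tex);
            glGenFramebuffers(2, fbo);
            for (int i = 0; i < 2; ++i) {
                glBindTexture(GL_TEXTURE_2D, tex[i]);
                glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                             GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
                glBindFramebuffer(GL_FRAMEBUFFER, fbo[i]);
                glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                       GL_TEXTURE_2D, tex[i], 0);
            }
        }

        // Each pass samples one texture and renders into the other; the result
        // never leaves video memory between passes.
        void runPasses(GLuint distortProg, GLuint shadowProg, GLuint undistortProg,
                       void (*drawFullscreenQuad)()) {
            GLuint progs[3] = { distortProg, shadowProg, undistortProg };
            int src = 0;
            for (int pass = 0; pass < 3; ++pass) {
                glBindFramebuffer(GL_FRAMEBUFFER, fbo[1 - src]);
                glUseProgram(progs[pass]);
                glBindTexture(GL_TEXTURE_2D, tex[src]);
                drawFullscreenQuad();   // assumed helper: draws a screen-aligned quad
                src = 1 - src;
            }
            glBindFramebuffer(GL_FRAMEBUFFER, 0);   // final composite goes to the screen
        }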

  • How do game engines stop pixel seams from appearing at adjacent mesh boundaries due to FP imprecision?

    - by ufomorace
    Graphics cards are mathematically imprecise. So when some meshes are joined at their borders, the graphics card often makes mistakes and decides that some pixels at the seam represent neither object, and unwanted pixels appear. It's a natural behaviour on all graphics cards. How are such artifacts avoided in professional games? Batching? Shaders? Different tangent vectors? Merging? Overlapping seams? Dark backgrounds? Extra vertices at borders? Z precision? Camera distance tweaks? (Screencap of a fix that ended up not working.)
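
    The most common concrete fix behind several of those guesses ("merging", "extra vertices at borders") is to make the adjoining meshes share bit-identical border vertices: the rasterizer then computes identical edge equations for both meshes and cannot drop pixels between them. A minimal sketch of that idea, under the assumption that welding happens as a CPU preprocessing step (the function name and epsilon are illustrative, not from any particular engine):

        // Snap vertex coordinates to a grid so near-coincident border vertices
        // from different meshes collapse to exactly the same float values.
        #include <cmath>
        #include <vector>

        struct Vec3 { float x, y, z; };

        std::vector<Vec3> weldVertices(std::vector<Vec3> verts, float cell = 1e-4f) {
            for (Vec3 &v : verts) {
                v.x = std::round(v.x / cell) * cell;
                v.y = std::round(v.y / cell) * cell;
                v.z = std::round(v.z / cell) * cell;
            }
            return verts;
        }

    Run the same quantization over both meshes at export time. Avoiding T-junctions (a vertex of one mesh lying in the middle of the other mesh's edge) matters just as much, since no amount of precision fixes those.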

  • Where are OpenGL commands executed?

    - by Lucas
    Hi, I'm programming a simple OpenGL program on a multi-core computer that has a GPU. The GPU is a simple GeForce with PhysX, CUDA and OpenGL 2.1 support. When I run this program, is it the host CPU that executes the OpenGL-specific commands, or are they transferred directly to the GPU?
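
    Briefly: the OpenGL calls themselves execute on the host CPU, inside the driver, which batches the commands into a buffer that the GPU consumes asynchronously; the actual rendering work happens on the GPU. A hedged C++ sketch (assuming a context, program and vertex data are already bound) that makes the split visible by timing submission against execution:

        // Draw calls return almost immediately: the driver only queues them.
        // glFinish() blocks until the GPU has actually executed everything.
        #include <GL/glew.h>
        #include <chrono>
        #include <cstdio>

        void timeDraw(GLsizei vertexCount) {
            using clock = std::chrono::steady_clock;
            using us = std::chrono::microseconds;

            auto t0 = clock::now();
            glDrawArrays(GL_TRIANGLES, 0, vertexCount);  // CPU cost: queueing only
            auto t1 = clock::now();
            glFinish();                                  // wait for the GPU to catch up
            auto t2 = clock::now();

            std::printf("submit: %lld us, execute: %lld us\n",
                        (long long)std::chrono::duration_cast<us>(t1 - t0).count(),
                        (long long)std::chrono::duration_cast<us>(t2 - t1).count());
        }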

  • Newly compiled (patched with -ck) 3.3 kernel won't boot up

    - by Shiki
    The screen is garbled. Really garbled, like a scrambled image from a movie; I can upload some pictures if needed. To keep the description short: here is my 3.3 + ck .config file: http://pastebin.com/aP2F9RcH and the dmesg log of the failed kernel: http://pastebin.com/esc2Rtka The computer is a ThinkPad T500 laptop. I have selected everything needed for this machine (as far as I can tell, but check the .config). The stock generic Ubuntu kernel boots and works perfectly!

  • Rendering a DOM across multiple displays

    - by meetamit
    I'm building a data-driven animation with HTML and JavaScript to run in a web browser. I would like to display it tiled across three 1080p monitors. This essentially yields a viewport that's 5760px wide and 1080px tall. Pretty large. Does anyone have experience setting up something like this? I have many questions below, but any tip would be appreciated:
    1. Is it reasonable to expect a DOM to render into such a large viewport size at close to 60fps?
    2. I might choose to use canvas, instead of SVG or HTML, but that would yield a giant canvas. Can a canvas with such high resolution be performant? Of course everything depends on the complexity of the graphics I want to render, but I'm looking to remove that factor from this question, so assume I'm asking about a canvas animation that can run at 60fps at 1920x1080 resolution. Would it run roughly as fast at 3 times the width?
    3. Would three.js and WebGL be a more proper approach at that resolution?
    4. How do you actually cause Chrome or FF to span 3 monitors at full screen? Do I need a 3rd party solution of any kind?
    Thanks!

  • Are VM-based languages becoming viable for Graphics since the move to GPU computing?

    - by skiwi
    Perhaps the title is not the clearest, so let me elaborate: I am talking about VM-based languages, by which I mean languages that run on a virtual machine, such as Java (on the JVM) or C#. I am also talking specifically about 3D graphics. Lately the trend has been that most computing is done on the GPU rather than the CPU, and the long-standing issue with programming games in a VM-based language is that garbage collection may happen at unpredictable times. So let's take a look at which component is responsible for what:
    - Showing the graphics: GPU
    - Uploading graphics to the GPU: CPU? Does this need to be done every frame?
    - Calculating physics constraints: GPU
    - Running the real game logic (deciding when to move objects independently of physics calculations, processing AI): CPU
    Is my list actually correct? And if it is, is Java, for example, becoming more viable? Or is uploading the graphics (vertices) still the most expensive operation? I would like to get more insight into this.
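
    On the upload point: static geometry is normally uploaded to the GPU once at load time, not every frame; per frame the CPU only issues draw calls and small uniform updates. A minimal sketch in C++/OpenGL for concreteness (the same calls exist in Java through bindings such as LWJGL; the function names and the assumption that a context, shader and vertex layout are already set up are mine):

        #include <GL/glew.h>

        // One-time copy into video memory, done at load time (GL_STATIC_DRAW).
        GLuint uploadStaticMesh(const float *vertices, GLsizeiptr bytes) {
            GLuint vbo;
            glGenBuffers(1, &vbo);
            glBindBuffer(GL_ARRAY_BUFFER, vbo);
            glBufferData(GL_ARRAY_BUFFER, bytes, vertices, GL_STATIC_DRAW);
            return vbo;
        }

        // Per frame: no vertex re-upload, just a handful of cheap CPU-side calls.
        void drawFrame(GLuint vbo, GLuint program, GLint mvpLocation,
                       const float *mvpMatrix, GLsizei vertexCount) {
            glUseProgram(program);
            glUniformMatrix4fv(mvpLocation, 1, GL_FALSE, mvpMatrix);
            glBindBuffer(GL_ARRAY_BUFFER, vbo);
            glDrawArrays(GL_TRIANGLES, 0, vertexCount);
        }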

  • Which driver should I use with a 9600 GT Mobile?

    - by lisalisa
    I checked the Additional Drivers offered for my card and I saw four options:
    - NVIDIA accelerated graphics drivers (version 173)
    - NVIDIA accelerated graphics drivers (post-release updates) (version 173-updates)
    - NVIDIA accelerated graphics drivers (version current) (Recommended)
    - NVIDIA accelerated graphics drivers (post-release updates) (version current-updates)
    I'm not sure which one to pick. I want a driver that's stable but also takes advantage of my card's hardware.

  • Xen VGA passthrough: is it possible to have dual GPUs with a single monitor?

    - by user489481
    I hate dual-booting: Windows when I want to play a game, Linux when I need to work. I don't fancy running Linux in a VM, and Wine almost never works for me. So I'm thinking about buying a new mainboard and CPU that are compatible with Xen's VGA passthrough and HVM virtualization. I want to be able to switch between the Linux Dom0 and a Windows DomU with ease, but the problem is that I need two GPUs and have only one monitor. Right now I have an NVIDIA GTX 260 in my PC, but I also have a Radeon 4850 sitting in my closet that I was going to sell. My question is: can I salvage the Radeon card to run the Dom0 and have the NVIDIA card run the DomU, with some kind of device or software to connect them to a single monitor? Power supply and airflow issues aside. If not, what would you suggest? Is this even possible?
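
    For the passthrough half, a hedged sketch of what the guest configuration might look like with xm/xl-style Xen tooling; the guest name, disk path and PCI address are placeholders (read the real address from lspci), and whether gfx_passthru works at all depends heavily on the board and GPU:

        # Hypothetical /etc/xen/windows.cfg -- HVM guest with the GTX 260 passed through
        builder      = "hvm"
        name         = "windows"
        memory       = 4096
        vcpus        = 2
        disk         = ['phy:/dev/vg0/windows,hda,w']   # placeholder disk
        gfx_passthru = 1              # make the passed-through card the guest's primary GPU
        pci          = ['01:00.0']    # placeholder PCI address of the NVIDIA card

    The single-monitor problem is usually solved outside Xen: connect each card to a different input on the monitor and switch inputs, or put a KVM switch between them.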

  • HTML5 - Does it have the power to handle a large 2D game with a huge world?

    - by user15858
    I have been using XNA Game Studio, but for private reasons (as well as the ability to publish anywhere, and my heavy interest in the Isogenic engine) I would like to switch to HTML5. However, I have very high 2D graphics demands for my game. The game itself will have an on-disk size of anywhere between 6GB (minimum) and 12GB (maximum) as a full game deployed offline. The individual images aren't significantly large, so streaming would be entirely possible if only the assets required were streamed as needed; the game has a massive file size because of the sheer amount of content. Some images or spritesheets would be quite massive (e.g. a very large dragon, which if animated in a spritesheet would be split into two 4096x4096 sheets or one 8192x8192 sheet). Most assets would be very small: about 7MB for a full character with 15 animations in every direction (not all animations required immediately), so on the order of a few hundred KB to download before the game loads. My question, however, is whether HTML5 has the graphical power to animate several characters on screen at once while flipping through frames quite rapidly. All my sprites have about 25 frames per animation and 5 directions (a spritesheet for each direction and animation), and run at 30fps. On changing direction or animation, or when a new character enters, spritesheets would change and be constantly loading and unloading. If I pack all directions into a single sheet, it would be about 2048x2048 per sheet. Most frameworks have no problem with this, but from what I read I am afraid that HTML5's graphical capabilities will limit me. Since it takes significant time simply to animate characters in any language, I'd like a quick answer.

  • How to set up multiple GPUs (12.04)?

    - by Brother Erryn
    I have two GPUs: one Intel i915 integrated and one NVIDIA 560 Ti. This is NOT a hybrid setup, nor a laptop. In Windows 7, each card is connected to a different monitor, with the NVIDIA doing any "heavy lifting". For the life of me I cannot get Ubuntu to recognize the i915, yet when logging off or rebooting, the shutdown screen actually appears on the i915. lshw lists both cards. I'm running the "current" NVIDIA drivers (not the experimental ones), but Displays calls its monitor a "Laptop". Is this even possible under Ubuntu? The only things that even looked like potential solutions were for laptop hybrid setups and used Bumblebee, but that doesn't seem to apply here.
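
    One avenue worth sketching (a hedged example, not a verified fix for this exact board): declare both cards explicitly in /etc/X11/xorg.conf with their PCI bus IDs, so X cannot silently ignore the second one. The BusID values below are placeholders; read the real ones from lspci:

        # Hypothetical xorg.conf fragment: one X server driving two cards
        Section "Device"
            Identifier "NvidiaCard"
            Driver     "nvidia"
            BusID      "PCI:1:0:0"     # placeholder; take from lspci
        EndSection

        Section "Device"
            Identifier "IntelCard"
            Driver     "intel"
            BusID      "PCI:0:2:0"     # placeholder; take from lspci
        EndSection

        Section "Screen"
            Identifier "NvidiaScreen"
            Device     "NvidiaCard"
        EndSection

        Section "Screen"
            Identifier "IntelScreen"
            Device     "IntelCard"
        EndSection

        Section "ServerLayout"
            Identifier "DualGPU"
            Screen  0  "NvidiaScreen"
            Screen  1  "IntelScreen" RightOf "NvidiaScreen"
        EndSection

    Since lshw already lists both cards, the missing piece is likely just that X never configures the Intel one.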

  • No video signal after install

    - by tW4r
    Today I tried to install Ubuntu 12.10 on my machine, but I kept encountering the same problem. I boot successfully from the Ubuntu 12.04 DVD, and a purple background comes up with the accessibility and keyboard icons at the bottom of the screen. Then, after a while, the monitor reports "No signal" (keep in mind that this message only appears when there is a cable in the monitor's HDMI plug) and nothing happens; a little later the disc drive stops flashing, meaning the disc has finished being read. Even if you reset the computer you still get no signal; you have to turn the power off and on again and unplug and replug the HDMI cable. My info: Graphics card: ATI/AMD Radeon HD 4850. Connection to monitor: HDMI. Monitor: Samsung SyncMaster T220HD.

  • What's the maximum safe temperature for a HD Radeon 6870?

    - by Adrian Grigore
    I'm running a passively cooled HD Radeon 6870 in my PC. While using 3D acceleration, the temperature climbs up to 95 degrees Celsius according to SpeedFan. That seems a bit hot, but on the other hand I've seen other GPUs specified to run at up to 120 degrees Celsius. The system is very stable, but Battlefield 3 crashes every few hours or so. Then again, that might be the game's fault and not related to the GPU temperature at all. Does anyone know where I can find manufacturer specs on the maximum allowed temperature for this GPU? Thanks, Adrian

  • Is this a graphics card failure?

    - by Alexander Lozada
    I own an older 32-bit Dell XPS 410 that I intended to use for gaming. Currently I have an ATI Radeon 4000-series card installed, along with a Core 2 Duo, running Ubuntu 12.10. I'm fairly certain that my problem is a GPU failure, but before I spend money on a new one, I want to make sure other components aren't the problem, and whether it would be cheaper just to buy a new PC. Here are my symptoms:
    - The computer's power button remains orange (it is usually green after a successful start).
    - A fan gets increasingly faster until the machine is powered off.
    - The monitor remains black when the machine is started, receiving no signal.
    - When powered on, the computer will power itself off almost instantly, then turn on again.
    - If startup is successful, sometimes the screen becomes jittered and unusable until restarted.
    Is this just a GPU failure, or something more extensive like the motherboard?

  • Graphics card failure, anything I could try...

    - by ILMV
    My gaming PC has decided to die. It's not the first time, but usually a quick ATX reset brings it back to life; today it didn't. I disconnected all unnecessary devices, so I've only got the case button/LED cables, GPU, CPU, RAM and power connected, and the computer still didn't turn on. I've not got a speaker on my motherboard, so I found a spare one I keep for testing, and when the machine starts up I get one long beep and two short beeps from my Award BIOS, which apparently means a video card error. I swapped in the GPU from another machine and all works well. Q: So I have a faulty graphics card (an NVIDIA 8800GT OC); is there anything I can try to resurrect it?

  • Lag spikes at full CPU usage, maybe video card

    - by Roberts
    I am posting this thread in a hurry, so a few things may be missing (I will update tomorrow). My PC specs: Motherboard: Gigabyte GA-945PL-S3. CPU: dual-core Intel Core 2 Duo E4300, 1800 MHz (9 x 200). OS: Microsoft Windows 7 Ultimate, 32-bit, version 6.1.7601. I bought a new video card one month ago, a GeForce 210, and I didn't have any problems with it. I wanted to overclock it, in other words to "play with it", so I installed Gigabyte EasyBoost from the CD, overclocked the GPU to 590 + 110 MHz and the memory from 800 MHz to its maximum of 960 MHz. Benchmarks showed a slightly higher score. Then I overclocked the shader clock from 1405 to [...] (don't really remember). I was playing Modern Warfare 2 when, all of a sudden, the computer froze as I was about to select a team; I had been AFK just before that. I had to reset the CMOS. After that I had problems with Skype: unread messages and no sound. Then I figured out that whenever I open EasyBoost, Skype starts to glitch again. Now I use EVGA Precision X. A month later I cleaned the computer and closed the case, which had been open all this time. I started to overclock the GPU clock only (just a bit), because nothing had happened that would stop me. Now, sometimes under heavy CPU load, the graphics start to lag; dragging a window is painful to watch too. Sometimes the screen freezes for 5 to 10 seconds (I can see that hard disk activity is at its maximum). You may say the CPU is at fault, but sometimes the lag spikes start randomly when CPU load is at its maximum. All three benchmark programs (PerformanceTest, NovaBench and MSI Kombustor) show that the performance of my video card has dropped by about 25%. BUT! The CPU score is lower too. I ignored these problems, but when I refreshed the Windows Experience Index I was shocked. Month before (screenshot in Latvian, but not so hard to understand). Now, after upgrading the RAM (screenshot). This happened when I tried to capture Minecraft with Fraps with the GPU underclocked to 580 MHz (default: 590 MHz) (screenshot). All drivers are up to date. Average CPU temperature is 55°C to 75°C (at around 70°C these lag spikes sometimes start). The video card's temperatures are 45°C to 60°C (it is very hard to reach 60°C). So my hope is that the video card is fine, because it is very new, and I want to upgrade the CPU anyway. Apologies for my vocabulary mistakes (I am trying to type this as fast as I can).

  • Can't install CUDA drivers for GeForce GT555M

    - by saeed
    I've just bought a new Asus N55 laptop. It has two graphics adapters, one Intel and one NVIDIA. But when I try to install CUDA's developer driver for my GPU I get this error: "This graphics driver could not find compatible graphics hardware". I have downloaded both of the following files, but both give the same error:
    - Developer Drivers for WinVista and Win7 (270.81)
    - Notebook Developer Drivers for WinVista and Win7 (275.33)
    How can I fix this problem? More to the point, how can I develop CUDA programs on my NVIDIA GPU?
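
    Once a driver does install, a minimal sanity check that the CUDA toolkit can actually see the NVIDIA GPU, using the standard CUDA runtime API (build with nvcc and link against cudart):

        // Lists every CUDA-capable device the runtime can find.
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            int count = 0;
            cudaError_t err = cudaGetDeviceCount(&count);
            if (err != cudaSuccess) {
                // On Optimus-style dual-GPU laptops this often fails until
                // the NVIDIA GPU is active rather than the Intel one.
                std::printf("CUDA error: %s\n", cudaGetErrorString(err));
                return 1;
            }
            for (int i = 0; i < count; ++i) {
                cudaDeviceProp prop;
                cudaGetDeviceProperties(&prop, i);
                std::printf("Device %d: %s (compute capability %d.%d)\n",
                            i, prop.name, prop.major, prop.minor);
            }
            return 0;
        }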

  • System lags/freezes under high usage

    - by tom
    I am not sure whether it's my GPU, my memory or my hard drive that's failing. For example, if I'm running more than one instance of Chrome alongside an application that takes up a lot of resources, my system starts to lag and freeze. When I launch Photoshop, the GPU feature disables itself automatically, and Photoshop also lags when I click on menus and when working on documents. I really don't know where to start: should I buy a new graphics card, test the memory, or could it be my OS drive? System: Windows 7 64-bit, ATI Radeon 5850, Corsair 2x4GB. http://i.stack.imgur.com/qqkLZ.jpg

  • Toshiba Satellite A305-S6861 Display Problems

    - by brock029
    Well, this is the first laptop I have ever worked on that has a dedicated video card. There is no video going to the laptop's monitor or to an external one. I ripped it apart, found the GPU, and now I'm stuck: I can't decide whether it's the GPU that has gone out or the motherboard. Does anyone have any suggestions? If it were a desktop I would throw in one of my spare video cards. Mainly I don't want to order the video card and eat the $50 if it's the motherboard.

  • Does a dedicated video card improve HTML5 websites, Skype or Flash games performance?

    - by Kiewic
    I have read that having a dedicated video card (GPU) improves performance if you use your computer to play video games. I guess that to make this happen, games or apps must use special libraries designed to share the workload with the GPU, maybe DirectX or OpenGL; I don't know. Am I wrong? So, can HTML5 websites, Adobe Illustrator, Flash games (Zynga games), Skype or Netflix benefit from a dedicated video card? I usually do these activities simultaneously. Should I consider changing from an integrated video card to a dedicated card if I want to improve performance? Thanks.

  • Ubuntu 11.10 doesn't detect Intel integrated graphics (i7-2670QM CPU)

    - by Telmo Marques
    The laptop I'm using is an MSI GT683DX-847PT, which comes with an NVIDIA GeForce GTX570M discrete GPU and an Intel Core i7-2670QM CPU. According to Intel's description of the Core i7-2670QM, it has an HD Graphics 3000 integrated GPU. The problem is that the Intel integrated GPU doesn't show up in either lspci or lshw; only the NVIDIA GPU appears. Here is the output of both commands: sudo lspci: http://pastebin.com/raw.php?i=9AZg8bJy sudo lshw: http://pastebin.com/raw.php?i=6cAMFQsY I was counting on having two GPUs so I could run CUDA programs on the discrete NVIDIA GPU while X was handled by the integrated Intel GPU, to prevent kernel execution timeouts. Why doesn't the Intel HD Graphics 3000 GPU show up? Are there any tests I could run to verify the presence of the integrated GPU?

  • Resolving a BSOD/CPU/GPU issue...

    - by Christian Sciberras
    Hello all, I'm getting a BSOD / system crash (sometimes the PC just quits without a BSOD).
    Hardware specifications:
    - CPU: i7 920 at 2666 MHz, 4 cores / 8 threads (not overclocked, as far as I know)
    - Motherboard: Asus P6T SE
    - RAM: 2x Corsair CM3X2G1333C9 (64-bit DDR3 667 MHz)
    - Graphics: ATI Radeon HD 5970 1GB (XFX HD5970 BE)
    - OS: Windows 7 Ultimate 64-bit (legit)
    All BIOS, firmware and drivers are up to date (as of today).
    Symptoms: Sometimes the PC runs smoothly, sometimes I get this BSOD. The BSOD always happens when I'm doing something related to graphics, such as viewing a video or playing a game. I get about 10 seconds' warning of the imminent BSOD: the PC starts freezing occasionally, increasing in frequency and length of lag (I noticed processor usage increased in Process Monitor). I've tweaked BIOS settings occasionally but, as far as I can tell, in vain; a day or so ago I reset them to factory settings.
    BSOD contents: The computer has rebooted from a bugcheck. The bugcheck was: 0x00000101 (0x0000000000000019, 0x0000000000000000, 0xfffff88001f35180, 0x0000000000000004).
    - 15-12-2010: A fatal hardware error has occurred. Reported by component: Processor Core. Error Source: Machine Check Exception. Error Type: Internal Timer Error. Processor ID: 4.
    - 23-12-2010: A fatal hardware error has occurred. Reported by component: Processor Core. Error Source: Machine Check Exception. Error Type: Internal Timer Error. Processor ID: 2.
    Important: although the event log (and the BSOD screen) blame a "secondary processor", Windows Action Center sometimes blamed the GFX driver for the same error. It is also interesting to note that after hibernating my PC, I always get the BSOD.
