Search Results

Search found 4224 results on 169 pages for 'dual gpu'.

Page 86 of 169

  • Black frame around screen after HDMI connection failure

    - by Wolter Hellmund
    I was trying to watch a movie from my computer on the TV, so I connected the two with an HDMI cable. I never got a working setup (the colors on the TV were all wrong and the screen size was incorrect). I tried many resolutions with the nvidia-settings application, and somehow my laptop screen ended up framed by a black border that I have been unable to remove since, even after restarting the computer with the HDMI cable disconnected. I am using Ubuntu 11.10 amd64, my GPU is an nVidia GeForce 8600M GT, and I am on proprietary driver version 280. The problem appears to be tied to my account's settings only: when I log in to the guest session the resolution is correct. My desktop also "thinks" the resolution is right (i.e. 1280x800), but it must be rendering at a smaller scale, because part of the pixel area is taken up by the black frame.
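
    Since the guest session displays correctly, one hedged first step is to clear the per-account display state and let it be regenerated at the next login. A minimal sketch; the two paths are the usual defaults for nvidia-settings and GNOME on 11.10, offered as an assumption rather than verified for this machine:

        rm ~/.nvidia-settings-rc     # settings written by nvidia-settings
        rm ~/.config/monitors.xml    # per-user monitor layout
        # log out and back in so fresh defaults are written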

    Read the article

  • Bumblebee optirun appears to depend on Intel

    - by user206398
    I have a Lenovo T420 with Intel and Nvidia graphics. On upgrading to Ubuntu Saucy, I had to purge and reinstall bumblebee-nvidia to get past optirun failing to find a GPU driver. Now "optirun glxgears" and "optirun sol" succeed, but optirun fails on two Virtual Life viewers that it supported in the past, Cool VL (CoolVLViewer-1.26.8.34-Linux-x86) and Imprudence (Imprudence 1.4.0 beta2). In both cases the error output is huge, but it starts with:

        libGL error: failed to load driver: i965
        libGL error: failed to load driver: swrast

    From the little I can discover, i965 is an Intel graphics driver, which should not be invoked at all; I haven't found anything definite about swrast. I suspect that some of the X configuration associated with Bumblebee has an Intel dependence that is triggered by certain library calls but not others, though I haven't found solid information along this line. The Cool VL Viewer runs without optirun, but complains that the Intel graphics are insufficient.
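
    Both failing viewers are 32-bit builds (Linux-x86), while glxgears and sol are 64-bit, so one guess is that the 64-bit Bumblebee stack works but its 32-bit client libraries are missing, leaving the loader to fall back to i965 and then swrast. An unverified sketch, assuming multiarch package names on 13.10 and a hypothetical launcher name:

        sudo apt-get install primus-libs:i386 libgl1-mesa-glx:i386
        optirun -b primus ./cool_vl_viewer   # force the primus bridge explicitly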

    Read the article

  • XBMC is slow and sluggish in Ubuntu 13.04; how can I speed it up?

    - by Dreamdealer
    I have a Zotac ZBOX ID84 with an Intel D2550 processor, an Nvidia GT520M GPU, 2GB of memory and a 320GB hard disk. I tried XBMCbuntu first and it ran perfectly for a few months, until I started to mess around with the terminal. After a re-install I couldn't get the HDMI sound to work again, so I installed the latest version of Ubuntu (13.04) instead. That worked fine: the sound and everything else worked right away, but the Ubuntu interface is sluggish and XBMC doesn't run as well as it did under XBMCbuntu. Video playback is slow and stutters, speeding up and slowing down with the complexity of the video. So the PC is more than capable of running XBMC and playing the videos, but something in Ubuntu (the GUI?) slows it down to an unusable pace. Can anybody tell me what I can do to speed things up? Since I'm new to Ubuntu I have no clue where to start looking.
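
    One hedged first step is to check that the proprietary Nvidia driver is actually in use; on a stock 13.04 install, XBMC may be software-decoding on the Atom CPU, while the GT 520M can offload playback through VDPAU once the binary driver is present. A minimal sketch, assuming the stock repositories:

        sudo apt-get install nvidia-current
        sudo reboot
        # then enable hardware acceleration (VDPAU) in XBMC's video settings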

    Read the article

  • How many BasicEffects do you have in a Game? What is the best way to render multiple objects/shapes at once?

    - by Deukalion
    I'm trying to understand 3D rendering, and it seems that every time you render a new object (a 3D cube or something) you need a new BasicEffect, unless you want the exact same texture? So if I have over a hundred boxes, each with a different texture, I need at least as many BasicEffects? Won't that eventually be too much for the CPU/GPU, or cause lag? Is there a good way to render multiple objects (cubes or other shapes) at the same time? I've tried changing BasicEffect.Texture between cubes, but it resulted in changing the first cube's texture too. Any suggestions would be really appreciated; I'm really new to 3D in XNA, so I'm trying to wrap my head around the best methods to, for example, render a map of objects (of various shapes).
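
    For what it's worth, a single shared BasicEffect is normally enough: BasicEffect.Texture is just a parameter, but the change only reaches the GPU when the pass is applied again, which would explain every cube showing the first texture. A hedged XNA 4.0 sketch (the cube fields and vertex counts are hypothetical, not the poster's code):

        foreach (Cube cube in cubes)
        {
            basicEffect.TextureEnabled = true;
            basicEffect.Texture = cube.Texture;    // per-object texture
            basicEffect.World = cube.WorldMatrix;  // per-object transform

            foreach (EffectPass pass in basicEffect.CurrentTechnique.Passes)
            {
                pass.Apply(); // re-apply so the new texture is actually bound
                GraphicsDevice.SetVertexBuffer(cube.VertexBuffer);
                GraphicsDevice.DrawPrimitives(PrimitiveType.TriangleList, 0, 12);
            }
        }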

    Read the article

  • PowerXpress error with Driver Catalyst. How can I fix it?

    - by J03Bukowski
    I have installed Ubuntu 11.10 64-bit on my HP Dv6-3150el. My notebook has two graphics cards:

        lspci | grep VGA
        00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 02)
        01:00.0 VGA compatible controller: ATI Technologies Inc Madison [AMD Radeon HD 5000M Series]

    I tried to install the proprietary ''fglrx'' driver available in "Additional Drivers", which does not give me 3D graphics acceleration (and I can't install the post-release updates variant). I then tried downloading and installing from the website (I tried Catalyst 11.8 and 11.12). The installation goes perfectly (I followed this guide and others), except when I configure the xorg.conf file:

        sudo aticonfig --initial
        PowerXpress error: Cannot stat '/usr/lib64/fglrx': No such file or directory
        Failed to initialize libglx for discrete GPU
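
    The message suggests the installer put the libraries under /usr/lib (Ubuntu's layout on amd64) while aticonfig looks in /usr/lib64. A commonly reported workaround, offered as an unverified sketch, is to symlink the expected path and re-run the configuration:

        sudo ln -s /usr/lib/fglrx /usr/lib64/fglrx
        sudo aticonfig --initial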

    Read the article

  • How to configure screens in console or create screen configuration profiles?

    - by uncle Lem
    I have two monitors and an integrated GPU (Intel® HD Graphics 4600). It works fine for work or movies, but if I launch games in fullscreen mode I get artifacts, glitches and so on. Temporarily disabling the second monitor solves the problem, but then I have to re-enable it and set its properties manually (by default the additional screen attaches its top-left corner to the main monitor's top-right corner, but I need the bottom-left and bottom-right corners to meet). So I need some kind of automation here. The best option would be a tool to create and swap between configuration profiles; some console commands I could put into script files would be fine too. (Ubuntu 13.04, if it matters)
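
    xrandr covers the console route, and a small script gives the profiles. A sketch with hypothetical output names (check yours with xrandr -q); the 56-pixel y-offset aligns the bottom edges of a 1920x1080 main panel and a 1280x1024 secondary (1080 - 1024 = 56):

        #!/bin/sh
        case "$1" in
          game)  # single-screen profile for fullscreen games
            xrandr --output HDMI1 --off
            ;;
          desk)  # two screens, bottom corners touching
            xrandr --output LVDS1 --mode 1920x1080 --pos 0x0 \
                   --output HDMI1 --mode 1280x1024 --pos 1920x56
            ;;
        esac

    Saved as, say, ~/bin/screens.sh, the profiles become "screens.sh game" before playing and "screens.sh desk" afterwards.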

    Read the article

  • Installed Ubuntu 14.04; how do I get it working with my new AMD R9 280X? When I run it I get garbage on the display

    - by user289455
    I installed Ubuntu 14.04 on my Gigabyte A88X FM2+ motherboard with an A10 7700K processor. Everything worked perfectly. The system is dual-booted with Windows 8.1 for gaming, and I subsequently installed an AMD R9 280X GPU. Windows was easy: it kept working with the new card installed, and I just needed to download and install the driver, reboot, and it was done. Ubuntu boots into a blank display. I can enter my password and log in; the menu on the left and the bar at the top are visible, but when I try to open an app I get blocks and rubbish on the display. My preference is not to remove the card. So, my question is: what procedure could I follow to log in at a text-based terminal and download and install the AMD drivers?
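
    A hedged sketch of the text-console route (package names as in the stock 14.04 repositories; a card this new may also need the fglrx-updates variant): switch to a virtual terminal with Ctrl+Alt+F1, log in, then:

        sudo apt-get update
        sudo apt-get install fglrx fglrx-amdcccle
        sudo aticonfig --initial
        sudo reboot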

    Read the article

  • Developing Games for Samsung Smart TV

    - by Caner Öncü
    We are planning to develop a game for Samsung Smart TVs. Although those TVs support Flash and HTML5, the rest of their specs fall short of supporting a game engine; for example, using an engine that needs the GPU is not possible on a default Samsung Smart TV set. Also, WebGL is supported as of Samsung SDK 4.1, but we don't know whether SDK 4.1 is available for the Smart TV series between 7000 and 9000. We have tried to contact Samsung, but they don't really seem to respond. Has anyone here developed a game for Samsung Smart TVs? If so, can you name the game engines that work with those TVs?

    Read the article

  • Computer crashing after a few minutes of logging in

    - by user88612
    I have a fresh install of 12.04.1 on my machine, and it constantly crashes shortly after logging in. I noticed it crashed repeatedly when using Minitube, and it would also crash in Chrome; it seems that anything graphics-related triggers it. I've booted the system 7 times, and each time it crashed within 5-10 minutes of logging in. I have no swap, and it's a dual-boot system with Windows 7. My system specs: AMD Phenom II X4 925, 8GB DDR3 (new), AMD Radeon HD 7770 GPU (new), Seagate 1TB Barracuda HDD (new), Catalyst drivers.

    Read the article

  • How can I make zaz save its profile data?

    - by RolandiXor - The Ice Man
    I've been playing Zaz recently as a time waster and stress beater, but it seems not to be actively maintained, and it does not save profile data under Ubuntu 12.10. It's getting to be more stressful than fun, because it keeps crashing, and under Unity, GNOME Shell, or KDE (in other words, any OpenGL-enabled WM) it makes the GPU lock up. How can I make it save the profile data, or create a profile where I can manually enter my level info? I'm tired of playing the same levels over and over without being able to start from the ones I've already passed. I have yet to find any info on fixing this. Any clues?

    Read the article

  • Y Axis inverted on vertex output

    - by Yonathan Klijnsma
    I've got my project running, and somehow my vertex Y components seem to be inverted: +10 on the Y axis goes down and -10 on the Y axis goes up. I can't find anything wrong with the initialization, and I am not doing any negative scaling in the view matrix. I've never had something like this happen before; does anyone have tips on what to look for? This is how I am sending vertices to the GPU (currently immediate mode): glVertex3f( x_pos_n, 10, z_pos ); I am using Cg in the project, but even without shaders the Y axis seems to be inverted.
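
    With the vertex data ruled out, the usual culprit is the projection matrix: swapping the bottom and top arguments of glOrtho (or an equivalent sign flip in a custom projection) inverts Y for every vertex that follows. A small fixed-function sketch of the two orientations, assuming no shaders are bound:

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        /* conventional GL: +Y goes up */
        glOrtho(0.0, width, 0.0, height, -1.0, 1.0);
        /* screen-style: +Y goes down -- reproduces the reported inversion */
        /* glOrtho(0.0, width, height, 0.0, -1.0, 1.0); */
        glMatrixMode(GL_MODELVIEW);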

    Read the article

  • What techniques can I use to render very large numbers of objects more efficiently in OpenGL?

    - by Luke
    You can think of my application as drawing a very large ball-and-stick diagram (or graph). At times this graph gets so large that the number of elements even outnumbers the pixels on the screen. Currently I simply pass all of my textures (as GL_POINTS) and lines to the graphics card using VBOs. When the number of elements outnumbers the number of pixels, is this the most efficient approach? Or should I do some calculations on the CPU side before handing everything over to the GPU? If it matters, I use GL_DEPTH_TEST and GL_ALPHA_TEST, and I do some alpha blending, but probably not enough to make a huge performance difference. My scene can be static at times, but the user has control of a typical arc-ball camera and can pan, rotate, and zoom. It is during these operations that the performance degradation is noticeable.
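
    When primitives outnumber pixels, it usually pays to cull (and, further out, aggregate) on the CPU before uploading, so the GPU only rasterizes what could possibly contribute; since the scene is static apart from the camera, the result can be cached until the camera moves. A sketch of per-node bounding-sphere culling; Node, staging and frustum_contains() are hypothetical stand-ins for the application's own types:

        /* Compact the visible nodes, then upload and draw only those. */
        size_t visible = 0;
        for (size_t i = 0; i < node_count; ++i) {
            if (!frustum_contains(&camera_frustum, nodes[i].center, nodes[i].radius))
                continue;                  /* off-screen: never reaches the GPU */
            staging[visible++] = nodes[i];
        }
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferSubData(GL_ARRAY_BUFFER, 0, visible * sizeof(Node), staging);
        glDrawArrays(GL_POINTS, 0, (GLsizei)visible);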

    Read the article

  • 12.04 Software "RAID 0" on desktop replacement, 2 HDD?

    - by gregzeng
    Hardware: HP Pavilion DV7 notebook: 8GB DDR3, 2x 750GB SATA2 HDD, i7 CPU, Radeon GPU, eSATA, Blu-ray, etc. It currently multiboots Win7-64 plus a choice of five 64-bit 'buntu flavours. I prefer the Xubuntu 64-bit alternate installer, but I have not been able to set up software RAID 0 on the last active partition of both HDDs, and I have tried many variants (a real boot partition, etc.). All my Linux operating systems boot successfully from extended partitions on both drives, but without RAID of any kind. In theory it works; but has anyone really succeeded with software RAID 0 on 12.04?
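
    For a non-boot data array, mdadm from the running system tends to be more reliable than the alternate installer. A minimal sketch, assuming /dev/sda5 and /dev/sdb5 are the two spare partitions (substitute your own):

        sudo apt-get install mdadm
        sudo mdadm --create /dev/md0 --level=0 --raid-devices=2 /dev/sda5 /dev/sdb5
        sudo mkfs.ext4 /dev/md0
        sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
        sudo update-initramfs -u   # so the array assembles at boot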

    Read the article

  • Write depth buffer to texture

    - by innochenti
    I need to read the depth buffer from the GPU and write it to a texture. How can this be done? Here is how the texture for the depth buffer is created:

        depthBufferDesc.Width = screenWidth;
        depthBufferDesc.Height = screenHeight;
        depthBufferDesc.MipLevels = 1;
        depthBufferDesc.ArraySize = 1;
        depthBufferDesc.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
        depthBufferDesc.SampleDesc.Count = 1;
        depthBufferDesc.SampleDesc.Quality = 0;
        depthBufferDesc.Usage = D3D10_USAGE_DEFAULT;
        depthBufferDesc.BindFlags = D3D10_BIND_DEPTH_STENCIL;
        depthBufferDesc.CPUAccessFlags = 0;
        depthBufferDesc.MiscFlags = 0;
        m_device->CreateTexture2D(&depthBufferDesc, NULL, &m_depthStencilBuffer);

    Also, a second question: is it possible to bind the depth buffer texture as a sampler in the pixel shader?
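
    Both questions have the same D3D10 answer: create the depth texture with a typeless format, then give the depth-stencil view and the shader-resource view concrete "sibling" formats. The SRV can then be bound as a pixel-shader sampler input, as long as the resource is not bound as a DSV at the same time. A sketch along those lines, reusing the desc above (error handling omitted):

        // Typeless storage so the same texture can back two differently-typed views.
        D3D10_TEXTURE2D_DESC desc = depthBufferDesc;
        desc.Format = DXGI_FORMAT_R24G8_TYPELESS;
        desc.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
        ID3D10Texture2D* depthTex = NULL;
        m_device->CreateTexture2D(&desc, NULL, &depthTex);

        // Depth-stencil view: the concrete depth format.
        D3D10_DEPTH_STENCIL_VIEW_DESC dsvDesc = {};
        dsvDesc.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
        dsvDesc.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2D;
        ID3D10DepthStencilView* dsv = NULL;
        m_device->CreateDepthStencilView(depthTex, &dsvDesc, &dsv);

        // Shader-resource view: depth becomes readable as the R channel.
        D3D10_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
        srvDesc.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
        srvDesc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2D;
        srvDesc.Texture2D.MipLevels = 1;
        ID3D10ShaderResourceView* srv = NULL;
        m_device->CreateShaderResourceView(depthTex, &srvDesc, &srv);

        // Unbind the DSV before sampling: OMSetRenderTargets(1, &rtv, NULL),
        // then PSSetShaderResources(0, 1, &srv) exposes depth to the shader.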

    Read the article

  • Problem installing libva (VAAPI), vainfo fails

    - by satuon
    I'm following a wiki on how to make VLC use the GPU of the Core i3's integrated graphics chipset, but I'm stuck at one of the first steps: installing VAAPI. I installed libva1 and vainfo with "sudo apt-get install libva1 vainfo", but when I run vainfo it says:

        libva: libva version 0.32.0
        libva: va_getDriverName() returns 0
        libva: Trying to open /usr/lib/dri/i965_drv_video.so
        libva: va_openDriver() returns -1
        vaInitialize failed with error code -1 (unknown libva error), exit

    According to the wiki it should report:

        VAProfileH264High : VAEntrypointVLD
        VAProfileVC1Advanced : VAEntrypointVLD

    /usr/lib/dri/i965_drv_video.so doesn't exist on my system, even though I installed libva1 with apt-get.
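
    libva1 only ships the dispatcher; the backend that provides /usr/lib/dri/i965_drv_video.so comes in a separate package. A minimal sketch, assuming the usual Ubuntu package name:

        sudo apt-get install i965-va-driver
        vainfo   # should now list the VAProfile/VAEntrypoint pairs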

    Read the article

  • Is this CPU usage normal for Xorg?

    - by Samuaz
    I checked System Monitor to see whether my CPU frequency increases without my doing anything, and saw that Xorg is always using 10-40% of the CPU, even when nothing much is happening on the desktop or I'm simply surfing the Internet. Is this normal? If not, how can I fix it? I have: a white MacBook 4,1, Core 2 Duo running at 2.10 GHz, an Intel GMA X3100 GPU, 4GB of RAM, and Ubuntu 11.04. I am running Unity and I do not have many effects enabled; I have only activated the Compiz animations, scale, desktop, and some shadows.

    Read the article

  • W520 External monitor setup with Ubuntu 12.10

    - by user108372
    I just installed a fresh Ubuntu 12.10 64-bit desktop on my Lenovo W520. It looks like there are a lot of challenges in making it work with the out-of-the-box Nouveau drivers, the proprietary Nvidia drivers, or the Intel GPU. I looked at a couple of notes on how to make it work with Bumblebee and Nvidia Optimus, but none of them seems to work for 12.10. Does anybody have a solid answer for this? A lot of people seem to be suffering from it. Here is my xrandr output; let me know if you need any additional information.

        Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 8192 x 8192
        LVDS1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
           1920x1080      60.0*+   59.9     50.0
           1680x1050      60.0     59.9
           1600x1024      60.2
           1400x1050      60.0
           1280x1024      60.0
           1440x900       59.9
           1280x960       60.0
           1360x768       59.8     60.0
           1152x864       60.0
           1024x768       60.0
           800x600        60.3     56.2
           640x480        59.9
        VGA1 disconnected (normal left inverted right x axis y axis)

    Thanks, Sef
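
    Two routes are commonly reported for the W520 on 12.10, neither verified here: the Bumblebee PPA (sketched below), or selecting "Discrete Graphics" in the BIOS display settings and running the plain Nvidia driver, at the cost of battery life:

        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get update
        sudo apt-get install bumblebee bumblebee-nvidia primus
        sudo reboot
        optirun glxgears   # sanity-check the discrete GPU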

    Read the article

  • Nvidia Powermizer Performance Levels

    - by jeffrey
    Is there any way to configure the Nvidia PowerMizer performance levels? My current setup has three power levels, with the lowest at 50MHz. The problem is that Compiz lags whenever the card drops to the lowest level (level 0): minimizing, maximizing, and dragging windows are all sluggish. Once PowerMizer leaves level 0, everything is very smooth and runs great. Is there any way for me to remove level 0 and run only the two higher levels (1/2)? I don't want to completely disable PowerMizer, but I can't stand the lag once it drops into level 0. Setting the "prefer maximum performance" option fixes the problem because it disables PowerMizer, but at stock speeds (850MHz) the GPU is overkill for most desktop use. intel i5 2500k, asus gene-z z68, evga 560ti fpb (driver 295.40), ubuntu 12.04 LTS x64
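
    There is no official switch for deleting a level, but a community-reported approach pins or biases PowerMizer through the driver's RegistryDwords option in xorg.conf. The values below are folklore rather than NVIDIA documentation, so treat this as an unverified sketch:

        Section "Device"
            Identifier "Nvidia Card"
            Driver     "nvidia"
            # community-reported: lock the performance-level source on AC/battery
            Option     "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefaultAC=0x2"
        EndSection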

    Read the article

  • Unreal Engine 3 runs on Windows 8 RT; Epic Games' engine tries to take market share from Unity

    Unreal Engine 3 runs on Windows 8 RT. A logical response from Epic Games to Unity's recent announcement: after Unity 3D announced support for Windows 8 and Windows Phone 8, NVIDIA has released a video showing Unreal Engine 3's flagship mobile demo, Epic Citadel, running on the ASUS Vivo Tab RT tablet, which carries an NVIDIA Tegra. As a reminder, this ARM-based processor combines the CPU and GPU on a single chip. One of the points ...

    Read the article

  • When should I clear an auxiliary render target?

    - by Raptormeat
    I'm using a few different render targets in my game in addition to the back buffer. These other render targets are only used in a few places, for specific tasks. I'm wondering when I should clear them. Right now I clear all of my render targets at the beginning of the frame, and it seems like I'm waiting for all the textures to clear before the rest of the drawing gets underway. Would it be more efficient to clear these textures later in the frame, when they aren't being used? Is there any hope of the GPU clearing them "on the side" while unrelated rendering happens, or are these tasks always sequential, so that I will always have to wait for the clears?
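
    A hedged rule of thumb is to issue each clear immediately before the pass that renders into that target; the driver is then free to schedule it against unrelated work instead of serializing a block of clears at the top of the frame. A sketch in D3D10 style (variable names hypothetical):

        // Main pass: clear at the point of use, then draw.
        device->OMSetRenderTargets(1, &backBufferRTV, depthDSV);
        device->ClearRenderTargetView(backBufferRTV, clearColor);
        DrawScene();

        // Auxiliary pass, later in the frame: its clear happens only now.
        device->OMSetRenderTargets(1, &auxRTV, NULL);
        device->ClearRenderTargetView(auxRTV, clearColor);
        DrawAuxiliaryPass();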

    Read the article

  • Kinect Fusion soon available to the public; the real-time 3D scanning system will be integrated into the next Kinect for Windows SDK

    Kinect Fusion soon available to developers: the real-time 3D scanning system will be integrated into the next Kinect for Windows SDK. The next update of the Kinect for Windows SDK will include Kinect Fusion. Kinect Fusion uses a moving Kinect sensor to capture depth data and build high-quality 3D models, for example a model of a room and its contents. The implementation relies on the GPU for camera tracking, while surface reconstruction runs interactively and in real time, enabling augmented-reality applications and human-machine interaction. K...

    Read the article

  • nVidia GT 220 not working properly with Ubuntu 12.10

    - by Glaedr
    On every previous Ubuntu release I used to enable the proprietary nVidia drivers to get the card working properly (otherwise I was forced to a very low resolution with no graphics acceleration), and everything worked fine. In particular I noticed, though I still don't know why, that on every OS the GPU fan is noisy until the video drivers are loaded (runlevel 5, it seems, on Linux), and then it slows down to a normal speed. Today I installed 12.10. Running the Live CD, surprisingly, everything worked fine: full resolution, acceleration, a silent fan, and so on; the running driver was nvidia-current (GT 216). After installing and booting I found the fan overrunning again; the installed driver is nouveau. I've tried installing nvidia-current or other proprietary drivers, even installing the kernel headers and source first and then the drivers (as suggested here), but all I get with the proprietary drivers is, ironically, low resolution, a noisy fan and no acceleration (with Unity and Compiz consequently refusing to start). Does anybody know a way out?
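
    An unverified sketch of a clean swap: purge all Nvidia packages, blacklist nouveau so it cannot claim the card first, rebuild the initramfs, and reinstall the packaged driver:

        sudo apt-get purge 'nvidia*'
        echo "blacklist nouveau" | sudo tee /etc/modprobe.d/blacklist-nouveau.conf
        sudo update-initramfs -u
        sudo apt-get install nvidia-current
        sudo reboot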

    Read the article

  • Xbox Surface: a 7-inch tablet dedicated to gaming coming soon? Microsoft is reportedly working on it

    Xbox Surface: a 7-inch gaming tablet coming soon? Microsoft is reportedly working on it. After Apple and the iPad Mini, it may be Microsoft's turn to build a small (7-inch) tablet. Unlike other devices of this type, Microsoft's tablet would be specially optimized for gaming. According to an article in The Verge citing sources close to Microsoft, development of the project is already well under way. The Xbox Surface tablet would rest on a specific ARM architecture with substantial memory bandwidth, in order to meet the needs of a GPU more powerful than the one available ...

    Read the article

  • Intel is working on a 48-core processor for mobile devices; the chip could be available within 5 years

    Intel is working on a 48-core processor for mobile devices; the chip could be available within five years. Intel is best known for its PC processors, but in the smartphone and tablet market the company is lagging behind, a situation it wants to change by offering innovative alternatives to current solutions. The company's researchers are currently working on better ways to use and manage a large number of cores in a mobile device. Today's mobile devices use dual-core or at most quad-core processors alongside several GPUs. Intel's work could lead ...

    Read the article

  • Recommended Books for OpenGL [closed]

    - by TheBlueCat
    I'm fairly new to OpenGL and I have been researching books that would be beneficial. These have been suggested to me (I've finished reading the OpenGL book online): Real-Time Rendering, GPU Gems 3, and the OpenGL SuperBible. Does anyone know other books they've found useful in the past, even ones covering higher-level algorithms? Also, can anyone suggest an IDE/text editor for Linux? I'm using Komodo and it's super buggy; I booted into Windows today, tried Visual Studio, and loved it. Is there anything similar for Linux? (Although the books I've been reading say not to use IDEs, partly because of the reliance you place on them.) I use Eclipse a lot for my Java programming; can I use C and OpenGL with that? Lastly, do you think it would be more beneficial to stay on Windows and program in C/OpenGL there? I do like Linux, but I found Visual Studio to be pretty good in some respects.

    Read the article
