Search Results

Search found 955 results on 39 pages for 'gpu'.

Page 21/39 | < Previous Page | 17 18 19 20 21 22 23 24 25 26 27 28  | Next Page >

  • Brightness not working; HP Pavilion Dv6; ATI Radeon HD6770M

    - by Yogesh Dhamija
    I am new to Ubuntu, but so far I am loving it. I have never been able to change my brightness since I installed Ubuntu, but I figured that installing the latest ATI driver for my graphics card would fix it. I did, but I still can't change the brightness. The slider goes up and down, but the brightness stays the same (on full). I have switchable graphics: an ATI Radeon HD 6770M and an Intel integrated GPU. Since I am new to Linux, I am not familiar with the terminal, so you will have to spell everything out for me, including if you need more information and how to get it. Thanks.
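
    A hedged first check, since the slider moves but nothing changes: see whether writing directly to the kernel's backlight interface has any effect. The directory name below is only an example - on this hardware it may be acpi_video0, radeon_bl0 or similar, and the value 5 is just a sample level below the maximum.
        $ ls /sys/class/backlight/
        $ cat /sys/class/backlight/acpi_video0/max_brightness
        $ echo 5 | sudo tee /sys/class/backlight/acpi_video0/brightness
    If that does dim the panel, the backlight itself works and only the hotkey/slider wiring is broken; if it has no effect, a kernel boot option such as acpi_backlight=vendor is the usual next thing to try.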

    Read the article

  • Ubuntu 14.04LTS - runtime video card configuration through Radeon driver

    - by RJVB
    How does one configure Radeon video cards when using the open-source Radeon driver - power profile, vsync, etc.? When I try the widely documented solution (against overheating) that worked for me under LMDE (confirmed with kernels up to 3.12.6), I get the following error:
        $ sudo cat /sys/class/drm/card0/device/power_profile
        default
        $ sudo sh -c "echo mid > /sys/class/drm/card0/device/power_profile"
        sh: echo: I/O error
        Exit 1
    And when I try the suggestions from Arch's ATI wiki, my modifications are simply ignored:
        $ sudo cat /sys/class/drm/card0/device/power_dpm_force_performance_level
        auto
        $ sudo sh -c "echo high > /sys/class/drm/card0/device/power_dpm_force_performance_level"
        $ sudo cat /sys/class/drm/card0/device/power_dpm_force_performance_level
        auto
    Is this something Ubuntu-specific, or something introduced with the 3.13 version of the Radeon driver? I'm encountering this on two laptops, one with a Radeon HD 6290 (integrated GPU), the other with a discrete RV710 card. The RV710 needs a specific power setting to prevent overheating under LMDE; fortunately it doesn't seem to overheat with the Ubuntu default setting.
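
    A hedged first check, assuming card0 is the Radeon in question: recent kernels enable dynamic power management (DPM) by default on many Radeon chips, and while DPM is active the old power_profile file rejects writes, which would explain the I/O error above.
        $ cat /sys/class/drm/card0/device/power_method
        dpm                     # example output - possible values are profile, dynpm and dpm
    If it reports dpm, only the power_dpm_* files apply; to get the old profile interface back, boot with radeon.dpm=0 (for instance, set GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.dpm=0" in /etc/default/grub and run sudo update-grub).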

    Read the article

  • The practical cost of swapping effects

    - by sebf
    Hello, I use XNA for my projects, and on these forums I sometimes see references to the fact that swapping an effect for a mesh has a relatively high cost, which surprises me, as I thought swapping an effect was simply a case of copying the replacement shader program to the GPU along with the appropriate parameters. I wondered if someone could explain exactly what is costly about this process, and, if possible, put 'relatively' into context? For example, say I wanted to use a short shader to help with picking, I would:
    - change the effect on every object, calculating a unique color to identify it and providing it to the shader;
    - draw all the objects to a render target in memory;
    - get the color from the target and use it to look up the selected object.
    What portion of the total time taken to complete that process would be spent swapping the shaders? My instinct says that rendering the scene again, no matter how simple the shader, would be an order of magnitude slower than any other part of the process, so why all the concern over effects?

    Read the article

  • Getting vga_switcheroo with ATI Mobility Radeon 5650 HD to work

    - by stevejb
    Hello! I have a new HP dv7 laptop with an ATI Mobility Radeon HD 5650 graphics card and also Intel graphics (switchable). I have done the following and want to understand what is going on with my graphics driver:
    1. Resized Windows 7 and did a fresh install of 10.10.
    2. Booted into 10.10; things seemed to be working okay.
    3. Enabled the ATI graphics, and it was clearly running on the ATI rather than the Intel GPU (the desktop cube worked).
    4. Rebooted and got an error that modprobe could not load modules.dep, and also something about i915 symbols.
    5. Rebooted into recovery mode and modified xorg.conf to remove the mention of fglrx.
    6. Rebooted; the errors still show, but then X starts, clearly on the Intel graphics.
    I would ideally like to be able to switch between the ATI and Intel graphics, a la vga_switcheroo. My first problem seems to be that the folder /sys/kernel/debug/vgaswitcheroo does not exist, hinting at some kind of kernel issue. What can I do to get this available? Thanks!
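
    A hedged check, assuming the open-source radeon and i915 drivers are meant to be in use (the proprietary fglrx driver does not expose vga_switcheroo, so leftover fglrx pieces would hide it):
        $ sudo mount -t debugfs none /sys/kernel/debug        # only needed if debugfs is not already mounted
        $ sudo cat /sys/kernel/debug/vgaswitcheroo/switch
        $ sudo sh -c "echo IGD > /sys/kernel/debug/vgaswitcheroo/switch"    # select the integrated GPU
        $ sudo sh -c "echo OFF > /sys/kernel/debug/vgaswitcheroo/switch"    # power down whichever GPU is unused
    The vgaswitcheroo directory only appears when both GPU drivers have loaded with kernel mode setting enabled, so its absence usually points at one of the two drivers (often fglrx remnants) rather than at debugfs itself.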

    Read the article

  • Black screen after upgrading from 13.04 to 13.10

    - by Harri
    I just upgraded from 13.04 to 13.10 and all I got was a black screen. The hardware I'm running is an Asus Zenbook UX31A (Intel GPU). I do hear the login screen drums play, so the system does boot to the login screen. When I try to boot the 3.11.0-12 kernel in recovery mode, it tells me "initctl: event failed". Then if I go on and press Ctrl+Alt+F2, log in and run startx, it dies with "Fatal server error: no screens found". Here are some logs from /var/log/Xorg.0.log: http://pastebin.com/ZQasUKJx Kernel 3.8.0-31 works OK, as did things before the upgrade.
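
    A hedged recovery sketch from the Ctrl+Alt+F2 console, on the assumption that a stale X configuration or a half-upgraded driver stack is behind the "no screens found" error:
        $ sudo mv /etc/X11/xorg.conf /etc/X11/xorg.conf.bak        # only if that file exists
        $ sudo apt-get install --reinstall xserver-xorg-core xserver-xorg-video-intel
        $ sudo service lightdm restart
    If lightdm then comes up, the old xorg.conf (or a partially configured driver) was the culprit; if not, the Xorg.0.log linked above is still the place to look.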

    Read the article

  • Would like some help in understanding rendering geometry vs textures

    - by Anon
    So I was just pondering whether it is more taxing on the GPU to render geometry or a texture. What I'm trying to see is whether there is a huge difference in rendering two scenes with the same setup:
    - Scene 1: example object: a dirt road (nothing else). Geometry: a detailed road, with all the bumps, cracks and so forth done in the mesh.
    - Scene 2: example object: a dirt road (nothing else). Geometry: a simple mesh in the form of a road, with maps and textures simulating the cracks, bumps, etc.
    So of these two, which one is likely to tax the hardware more? Or is it not a like-for-like comparison? What would be the best way of doing something like this? Go heavy on the textures? Or have a blend of both?

    Read the article

  • My ASUS U32U with fresh Xubuntu install shows a black screen 50-80% of the startups

    - by Jona Ekenberg
    I have recently installed Ubuntu 12.10 with the Xubuntu package on my ASUS U32U notebook (Radeon HD 6320 GPU). The issue I have is that, more often than not, after the GRUB selection screen I get a black screen, and white lines (kind of) flash very quickly a total of three times (with maybe 5 seconds between each flash). I'm not even able to get to the login screen (nor the Xubuntu loading screen). At first I thought I had simply installed something dumb or messed up some settings, but even after reformatting the partition and installing Ubuntu again, the problem remains. Before I formatted, xfce4's window manager wouldn't start either, but it does now (when I am able to see anything). I can access the virtual consoles (Ctrl+Alt+F1) but I can't see anything; still, I've managed to shut down the computer from one (sudo shutdown -h now).
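
    A hedged workaround to test from the GRUB menu, assuming the radeon kernel mode setting on the HD 6320 is what blanks the screen: press 'e' on the Ubuntu entry, append nomodeset (or radeon.modeset=0) to the line ending in quiet splash, and boot with F10. If the machine then reaches the login screen reliably, make it permanent:
        $ sudo nano /etc/default/grub        # set GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
        $ sudo update-grub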

    Read the article

  • OpenGL behaviour depending on the graphics card?

    - by Dan
    This is something that has never happened to me before. I have OpenGL code that uses GLSL shaders to texture a 3D model. The code involves a lot of GPU texture processing, blending, etc. I wanted to check how the performance of my code improves with a faster graphics card (both the new and the old one are NVIDIA, always using the NVIDIA development drivers). But now I have found that once I run the code on the new graphics card, it behaves completely differently (the final render looks wrong), probably because some blending effect is not performed correctly. I haven't really looked into what has changed, but I am guessing that some OpenGL states are, by default, set differently. Is this possible? Have you ever found different OpenGL/GLSL behaviour using different graphics cards? Any "fast" solution? (So far I've thought of plugging the old one back in, pushing all OpenGL default states explicitly, and comparing with the ones I initially get using the new card.)

    Read the article

  • Frequent GUI pauses in Ubuntu 13.04 / Unity / Intel HD4000

    - by Simon
    I'm experiencing very frequent (and regular) GUI pauses on my system. Every 30 seconds (pretty much exactly) the GUI will freeze for maybe 0.25 to 0.5 seconds. The mouse stops moving, keys stop echoing and a stopwatch timer briefly pauses. I'm using the Intel graphics driver available from https://download.01.org/gfx/ubuntu/13.04/main. I've looked in a few places and tried a few things for a solution:
    - I've checked cron and anacron for scheduled processes.
    - I've disabled background processes (e.g. mysql, postgres, apache) - not that these were doing anything anyway.
    - I've checked the following posts and tried the suggestions there: "Unity GUI pauses/freezes for less than a few seconds" and "How to go about troubleshooting frequent system pauses".
    - I've watched the system using top and System Monitor and there are no spikes (or even blips) of CPU usage when the pauses occur.
    - There are no obvious error messages in dmesg or syslog.
    - There is loads of free RAM (8GB+) and no swap usage.
    If it helps, it's a ZooStorm i5 laptop with an HD4000 GPU, 16GB RAM and an SSD. Any help / suggestions would be very gratefully received.

    Read the article

  • What do the different hardware temperatures listed in psensor, Sensors Viewer etc. refer to?

    - by cipricus
    I have installed psensor and see a list of temperatures, but they are listed as "Temperature 1", 2, 3 etc. I can only guess which one is the processor: but who's who for sure? The same question stands for Sensors Viewer. I can also type sensors in a terminal, but I get no more than what acpi -t gives:
        Thermal 0: ok, 65.0 degrees C
        Thermal 1: ok, 37.9 degrees C
        Thermal 2: ok, 56.0 degrees C
        Thermal 3: active, 71.0 degrees C
    Considering psensor, I know for a fact that:
    - the temperature that varies most with CPU use is Temp1, and it is one of the two highest;
    - the other high temperature is Temp4, and it goes through the ceiling when using YouTube/Flash;
    - Temp2 is very stable at a medium level of 50-60 degrees Celsius;
    - Temp3 is by far the lowest and most immobile.
    So, I guess Temp1 is the CPU temperature and Temp4 is the GPU temperature. Temp2 and 3 must be the motherboard and the hdd. Does anybody know for sure?
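
    One way to put names on the numbered sensors is to read them straight from the kernel's hwmon entries (a small sketch - the chip names mentioned below are typical, not guaranteed for this machine):
        $ for d in /sys/class/hwmon/hwmon*; do echo "$d: $(cat $d/name)"; done
        $ sensors        # lm-sensors groups the same readings by chip name
    Typically coretemp is the CPU cores, radeon or nouveau is the GPU, and acpitz is the ACPI (motherboard) thermal zone; disk temperature usually comes from hddtemp (sudo hddtemp /dev/sda) rather than from hwmon.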

    Read the article

  • Android: Layouts and views or a single full screen custom view?

    - by futlib
    I'm developing an Android game, and I'm making it so that it can run on low-end devices without a GPU, so I'm using the 2D API. So far I have tried to use Android's mechanisms such as layouts and activities where possible, but I'm beginning to wonder if it isn't easier to just create a single custom view (or one per activity) and do all the work there. Here's an example of how I currently do things: I'm using a layout to display the game's background as an image view, with the square game area, which is a custom view, centered in the middle. What would you say? Should I continue to use layouts where possible, or is it more common/reasonable to just use one large custom view? I'm thinking that this would probably also make it easier to port my code to other platforms.

    Read the article

  • pwmconfig: "There are no pwm-capable sensor modules installed"

    - by Sman789
    I'm trying to reduce my fan speed with fancontrol and pwmconfig because, despite the temperatures being the same, the fans are much louder on Linux (Ubuntu GNOME 14.04) than on Windows. I've followed the instructions in the first answer here, but when running pwmconfig I get: "There are no pwm-capable sensor modules installed". I know that my system has working thermal sensors because psensor has no trouble telling me my CPU temp and GPU temp. I would appreciate any help in reducing my fan speed to the Windows level (Windows uses the ASUS AI Suite 3 software which came with the Z87-A motherboard, if that's relevant).
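
    A hedged sketch: pwmconfig needs a Super I/O hwmon driver loaded, and sensors-detect does not always probe for it. Many ASUS Z87 boards use a Nuvoton chip handled by the nct6775 module (an assumption - confirm against your own board's sensor chip):
        $ sudo sensors-detect        # answer yes to the Super I/O probes
        $ sudo modprobe nct6775
        $ sensors                    # fan RPMs and pwm outputs should now be listed
        $ sudo pwmconfig             # re-run to generate /etc/fancontrol
    The Intel digital thermal sensor that sensors-detect already finds (coretemp) only reports temperatures; it has no PWM outputs, which is why pwmconfig rejects it.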

    Read the article

  • What is the situation about OpenGL under Ubuntu Unity and Gnome3?

    - by user827992
    In a GNU/Linux distribution, Xorg is usually installed as the main graphical server. It operates with a client-server logic: a special window is designated as the desktop environment, and this special window handles all the eye-candy stuff like decorations, icons and effects. The problem is that the latest UIs rely heavily on hardware acceleration: Unity is an overlay on Compiz, and the GNOME Shell also requires an active driver for the GPU to work well. The problem is:
    - on the same OS I can find multiple implementations of OpenGL, so who is handling my OpenGL buffer?
    - how is the OpenGL buffer managed compared to the other windows?
    - how can I be sure that my OpenGL implementation is glued to the hardware and not routed through the client-server logic of Xorg?
    For example, I have tried the clutter library and have only experienced problems under both Unity and GTK/GNOME, and no problems under other OSes.
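
    A quick way to see which OpenGL implementation a client actually gets, and whether it bypasses the X client-server path (a small sketch; glxinfo ships in the mesa-utils package):
        $ glxinfo | grep -E "direct rendering|OpenGL vendor|OpenGL renderer|OpenGL version"
    "direct rendering: Yes" means the application talks to the GPU driver through DRI rather than pushing commands through the X server, and the renderer/vendor strings name the driver that owns your buffers; the compositor (Compiz or GNOME Shell) then only composites the finished buffer.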

    Read the article

  • downgrade ppa packages to versions available at a previous point in time

    - by Will
    The backstory is that the normal Intel GPU drivers don't do the various OpenGL extensions that my hobby coding and some games want, so I have to install xorg-edgers and then it's happy. However, last Wednesday or so there was an update to xorg-edgers - lots of packages - and it broke badly; the drivers lock up and take the whole computer with them, hard reset required. So how can you downgrade, i.e. select package versions in a PPA that represent a point in the past, ignoring versions newer than that?
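
    A hedged sketch of the usual options (the package name and version string below are placeholders, not taken from the question):
        $ apt-cache policy xserver-xorg-video-intel          # lists every version apt can still see
        $ sudo apt-get install xserver-xorg-video-intel=2:2.99.904-0ubuntu2    # hypothetical version string
        $ sudo apt-get install ppa-purge
        $ sudo ppa-purge ppa:xorg-edgers/ppa                 # roll every xorg-edgers package back to the Ubuntu archive versions
    The catch is that apt can only install versions still present in some configured archive, and PPAs normally publish only their latest build, so "the PPA as it was last Wednesday" is usually not reachable; ppa-purge back to the stock packages (or a locally kept copy of the old .debs) tends to be the practical answer.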

    Read the article

  • How can I generate signed distance fields (2D) in real time, fast?

    - by heishe
    In a previous question, it was suggested that signed distance fields can be precomputed, loaded at runtime and then used from there. For reasons I will explain at the end of this question (for people interested), I need to create the distance fields in real time. There are some papers out there on different methods which are supposed to be viable in real-time environments, such as methods for Chamfer distance transforms and Voronoi-diagram-approximation based transforms (as suggested in this presentation by the PixelJunk Shooter dev guy), but I (and, it can be assumed, a lot of other people) have a very hard time actually putting them to use, since they're usually long, largely bloated with math and not very algorithmic in their explanation. What algorithm would you suggest for creating the distance fields in real time (preferably on the GPU), especially considering the resulting quality of the distance fields? Since I'm looking for an actual explanation/tutorial as opposed to a link to just another paper or slide, this question will receive a bounty once it's eligible for one :-). Here's why I need to do it in real time: ... There's something else: ...

    Read the article

  • Moonlight 4 beta gets closer to Silverlight 4; the open-source implementation adds hardware acceleration and H.264 support

    The Moonlight 4 beta gets closer to Silverlight 4. Its open-source implementation now offers hardware acceleration and H.264 support. Moonlight 4 has just been released in beta. The open-source implementation of Silverlight now provides hardware acceleration (GPU handling of video and 3D) as well as support for the H.264 codec. With this development version, Moonlight incorporates several new features from Silverlight 4, notably support for the Silverlight 3 and 4 APIs. It also makes it possible to build and run applications "outside the browser". However, this beta does not yet offer all the functi...

    Read the article

  • Loud fans despite cool system under Linux (but not Windows)

    - by Sman789
    My new desktop computer runs almost silently under Windows, but the fans seem to run at a constantly high speed under Linux. Psensor shows that the GPU (with the NVIDIA drivers) is at thirty-something degrees and the CPU is about the same, so it's not just a case of Linux somehow being more processor-intensive. I've read that the BIOS controls the fans under Linux, which makes sense given the high fan speeds in the BIOS as well. It's under Windows, when the ASUS AI Suite 3 software takes control, that the system runs more quietly and only speeds the fans up when required. So is there a Linux app which offers similar dynamic control of the fans, or a setting hidden somewhere in the ASUS BIOS which does the same regardless of the OS? EDIT - I've tried using lm-sensors and fancontrol, but pwmconfig tells me "There are no pwm-capable sensor modules installed". This is after sensors-detect does find an 'Intel digital thermal sensor', and despite the sensors working fine in apps like psensor. Help getting this to work would likely solve the problem.

    Read the article

  • OpenGL Vertex Attributes - Normalisation

    - by Daniel
    Alas, I have searched and have found no definitive answer. When would you normalize the vertex data in OpenGL using the following command: glVertexAttribPointer(index, size, type, normalize, stride, pointer); i.e. when would normalize == GL_TRUE? In what situations, and why would you choose to let the GPU do the conversion instead of preprocessing it? All the examples I have ever seen have this set to GL_FALSE, and I cannot personally see a use for it. But Khronos aren't stupid, so it must be there for something useful (and probably common).

    Read the article

  • Firefox 4: beta 12 released, with improvements to Flash support and hardware acceleration

    Firefox 4: beta 12 released. Improvements to Flash support and hardware acceleration. Update of 28/02/11. The twelfth - and presumably last - beta of Firefox 4 came out this weekend. It fixes 7,000 bugs and improves (Flash) video playback. The integration of hardware acceleration (offloading specific computing tasks to the GPU rather than the CPU) has also been reworked, all of which makes the browser more stable. Unfortunately, it does not yet include the "miracle" patches that halve its startup time (read elsewh...

    Read the article

  • What is the ideal laptop for creative coding applications?

    - by Jason
    Hi, I am a creative coder using C++ (Cinder and openFrameworks). I am looking to upgrade from my MacBook, which slowed down to about 3 fps this morning. My project involves particle systems and fluids reacting to audio analysis data and computer vision data in real time. SD or HD? No biggie. I have asked many people what computer I need. Ideally, I want a MacBook Pro, but is that enough power?
    - I've been told that I need a desktop for what I am doing, though I'd rather stay portable.
    - I've been told that I should go PC/Linux to get the most power, but I'd rather stay on a Mac.
    - I've been told that RAM is more of a bottleneck than processor speed.
    - I've been told that the graphics card is more important than the CPU, and that code optimizations such as using trees over lists, proper threading, and sending tasks to the GPU make a bigger difference than the hardware!!!
    What's true?! What do I need? Any suggestions are greatly appreciated.

    Read the article

  • internal error message pops up each time the system is rebooted

    - by Biju
    I installed Ubuntu 12.04 using Wubi, but each time I boot the system an internal error message pops up, as shown below:
        ExecutablePath: /usr/share/apport/apport-gpu-error-intel.py
        Package: xserver-xorg-video-intel 2:2.17.0-1ubuntu4
        ProblemType: Crash
        ApportVersion: 2.0.1-0ubuntu7
    and so on. I had earlier upgraded to Ubuntu 12.04 from Ubuntu 11.10 and encountered the same issue, so I uninstalled the OS and reinstalled using Wubi. I posted the same query on ubuntu.com/support (Question Number: 195525) but couldn't find a solution. I am using a Dell Inspiron with an Intel Pentium. I need your help in resolving this issue. Thank you, Biju
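
    A hedged workaround, assuming the pop-up is apport replaying crash reports that are still sitting in /var/crash rather than a new crash on every boot:
        $ ls /var/crash/
        $ sudo rm /var/crash/*.crash
        $ sudo sed -i 's/enabled=1/enabled=0/' /etc/default/apport    # optional: silence apport entirely
    Clearing the old reports stops the dialog if the crash is stale; if a fresh apport-gpu-error-intel report reappears on every boot, the Intel driver is still hitting a real GPU error and the report contents are worth reading before deleting.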

    Read the article

  • What are the factors that determine the default frequency of a shader call?

    - by user827992
    After playing for some days with various vertex and fragment shaders, it seems clear to me that these programs are called by the GPU at each and every rendering cycle. The problem is that I can't really quantify this frequency, and I can't tell whether it is based on some default values or not, because I don't have a big collection of hardware right now to do extensive tests. For all I know the answer could be really trivial, like "it's the same as the refresh rate of your monitor", but I would like some good answers to be clear on this. For instance, it looks really odd to me that all the techniques used to control the FPS that I have seen until now use a call to the GLUT function glutGet(GLUT_ELAPSED_TIME) to retrieve a value in ms for when rendering started, and then I have to rely on the CPU to do the math. Why can't I set an FPS value in OpenGL, if OpenGL clearly has a counter and a timer/clock? PS: I'm referring to OpenGL 3.0+.

    Read the article

  • Avoiding lag when rendering Texture2D for first time

    - by Emir Lima
    I have found a similar question here, but it is about playing sounds. I am using 2048 x 2048 textures for sprite sheets, and every time I call spriteBatch.Draw using a sheet for the first time in a game run, it causes considerable lag. The lag doesn't appear the next times. Has someone faced this problem before? What can I do to overcome it? Update: I inserted code at the end of the content-load routine that draws EVERY Texture2D loaded into the ContentManager before moving on to the game screen. This works well: no lag occurs when different textures are rendered over time, EXCEPT if IsFullScreen is changed. Apparently, changing this property makes the textures loaded on the GPU go away. Is that correct?

    Read the article

  • 12.10 live dvd no video input

    - by mark kirby
    Hi, I have been trying to install Ubuntu 12.10, but as soon as it gets past my BIOS and to the screen with the blinking cursor in the top left, I get a "no video input" message on my TV (like when you turn the TV on with nothing connected). I have used live DVDs of the betas, alphas and daily builds, all with exactly the same result. Has anyone else had this? Is there a fix? Does this mean I can never upgrade my Ubuntu again? (12.04 works - I've been using it since beta.) My PC, while old, should run this fine:
    - CPU = 2x Intel P4 HT @ 3 GHz
    - GPU = Nvidia GeForce 310 via HDMI
    - RAM = 2 GB DDR2
    - HDD = 2 x 7200 rpm SATA
    Please help me - I use Ubuntu exclusively on my PC and would like to keep doing so.
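
    A hedged thing to try with the live DVD, assuming the nouveau kernel mode setting on the GeForce 310 is what drops the HDMI signal: at the purple boot splash press any key to get the menu, hit F6 and tick "nomodeset", then choose "Try Ubuntu". If the desktop appears, install as usual, keep nomodeset on the installed system's kernel line, and switch to the proprietary driver afterwards:
        $ sudo apt-get install nvidia-current        # the binary driver replaces nouveau after a reboot
    (nvidia-current is the package name used on the 12.x releases; it may differ on other versions.)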

    Read the article

  • How to solve dual monitor issue, which happens only during X start?

    - by tamashumi
    When the system is loading and two monitors are connected, instead of a login screen I see an error dialog (screenshot not included in this snippet); after clicking OK, a selection screen appears (screenshot not included). I then go to a console login, disconnect the secondary monitor cable by hand, and restart lightdm with
        $ sudo service lightdm restart
    ...and voila! The system loads fine. If I disconnect the cable before boot, X loads fine too. It's not a nice 'feature' to have to disconnect the cable on every boot or X restart. I tried deleting monitors.xml, but it didn't help. The situation occurs on my notebook with an Intel integrated GPU, and the same happens with two different pairs of monitors: at the office and at home. How can I fix this? Ubuntu 12.04 x64 Desktop with the default Unity GUI.
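
    A hedged workaround: have lightdm run an xrandr script before the greeter starts, so both monitors get an explicit configuration instead of whatever is failing at X start. The output names below (LVDS1, HDMI1) are examples - check yours with xrandr -q from a working session:
        $ cat /etc/lightdm/display-setup.sh
        #!/bin/sh
        xrandr --output LVDS1 --auto --primary --output HDMI1 --auto --right-of LVDS1
        $ sudo chmod +x /etc/lightdm/display-setup.sh
    Then point lightdm at it by adding display-setup-script=/etc/lightdm/display-setup.sh under the [SeatDefaults] section of /etc/lightdm/lightdm.conf.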

    Read the article
