Search Results

Search found 8498 results on 340 pages for 'integrated graphics'.


  • Ubuntu won't fit 10" netbook's native display

    - by Daniel
    I recently removed Windows 7 Starter from my netbook and replaced it with Ubuntu 12.10. The problem is that parts of the system don't fit the display's native resolution of 1024x600: the bottom of the Ubuntu desktop is hidden beneath the screen, and the only two available resolutions are the default 1024x768 and 800x600. I've also thought about replacing Ubuntu with Lubuntu or Puppy Linux, as the system runs a bit slow, but I can't, because then I wouldn't be able to reach the taskbar and application menu, which would be hidden below the screen. Only Ubuntu with Unity is currently usable, since I can still see the Unity Launcher. My netbook is an HP Mini 210-1004sa, which comes with an Intel Graphics Media Accelerator 3150 and a 10.1" active-matrix colour TFT display at 1024x600. I was able to define a custom 1024x600 resolution using the Q&A "How set my monitor resolution?", but when I set that resolution the desktop area is shifted downwards, with part of it hidden beneath the screen and a black band left at the top. I had to revert to the old 1024x768 setting to push the desktop back up and remove the black space.
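    For reference, a typical way to add a missing 1024x600 mode by hand is cvt plus xrandr. This is only a sketch: the output name LVDS1 below is an assumption (run xrandr first to see the real name), and the modeline numbers should be taken from your own cvt output.

      cvt 1024 600                              # prints a Modeline for 1024x600 at 60 Hz
      xrandr --newmode "1024x600_60.00" 49.00 1024 1072 1168 1312 600 603 613 624 -hsync +vsync
      xrandr --addmode LVDS1 1024x600_60.00     # LVDS1 is assumed; substitute your panel's output
      xrandr --output LVDS1 --mode 1024x600_60.00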

    Read the article

  • Cannot use second display with 12.04 and Intel 2000/3000

    - by Carolyn Marenger
    I am unable to get anything to display on my second monitor, or even to get the system to recognize that there is a second screen. I am running Ubuntu 12.04 on a Gigabyte GA-H61M-S2PV, revision 2.0 box. The integrated chipset is an Intel 2000/3000, and the motherboard has both D-Sub and DVI-D display ports. This is the first operating system I have installed on this system. I have a second monitor plugged into the DVI-D port via a DVI-D to D-Sub adapter. I cannot verify that the motherboard or adapter were/are working, short of installing Windows to test the theory. When I go into the "System Settings - Displays" control window, it shows one display. I have rebooted with the second monitor attached, and I have perused the BIOS settings in case it might have been disabled. So far, I have had no indication that the second monitor is recognized, not even a flicker at power on. If I swap monitors and cables between the DVI-D and D-Sub ports, the other monitor lights up, so I know the monitor and video cable are not the issue. Any suggestions would be appreciated. Thanks, Carolyn
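    As a quick check of what the Intel driver actually detects, xrandr from a terminal is usually more informative than the Displays dialog. A sketch only; the output names below (VGA1, HDMI1) are examples and will differ depending on how the driver labels the connectors:

      xrandr --query                                 # lists every connector the driver knows about and its status
      xrandr --output HDMI1 --auto --right-of VGA1   # only works if the second output shows up as "connected"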

    Read the article

  • Who can change the View in MVC?

    - by Luke
    I'm working on a thick-client graph display and manipulation application, and I'm trying to apply the MVC pattern to our 3D visualization component. Here is what I have for the Model, View, and Controller: Model - the graph and its metadata. This includes vertices, edges, and the attributes of each. It does not contain position information, icons, colors, or anything display-related. View - what would commonly be called a scene graph. It includes the 3D display information, texture information, color information, and anything else related specifically to the visualization of the model. Controller - the controller takes the view and displays it in a window using OpenGL (but it could potentially be any 3D graphics package). The application has various "layouts" that change the position of the vertices in the display. For instance, one layout may arrange the vertices in a circle. Is it common for these layouts to access and change the View directly? Should they go through the Controller to access the View? If they go through the Controller, should they just ask for direct access to the View, or should each change go through the Controller? I realize this is a bit different from the standard MVC example where there are a finite number of Views. In this case, the View can change in an infinite number of ways. Perhaps I'm shattering some basic principle of MVC here. Thanks in advance!
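    One common arrangement (only a sketch with hypothetical names, and certainly not the only valid reading of MVC) is to give the Controller a narrow layout entry point: layouts compute positions and report them to the View, but they never own the View themselves, so the Controller stays the single coordinator of View changes.

      #include <vector>
      #include <cstddef>
      #include <cmath>

      struct Vertex { int id; };                 // stands in for the real model vertex type

      class View {                               // wraps the scene graph; display-only state lives here
      public:
          void setPosition(int vertexId, float x, float y, float z) {
              (void)vertexId; (void)x; (void)y; (void)z;   // stand-in for a real scene-graph update
          }
      };

      class Layout {                             // a layout only computes positions
      public:
          virtual ~Layout() {}
          virtual void apply(const std::vector<Vertex>& model, View& view) = 0;
      };

      class CircleLayout : public Layout {       // e.g. the "arrange vertices in a circle" layout
      public:
          void apply(const std::vector<Vertex>& model, View& view) {
              if (model.empty()) return;
              for (std::size_t i = 0; i < model.size(); ++i) {
                  float angle = 6.2831853f * i / model.size();
                  view.setPosition(model[i].id, std::cos(angle), std::sin(angle), 0.0f);
              }
          }
      };

      class Controller {
      public:
          explicit Controller(View& v) : view(v) {}
          void runLayout(Layout& layout, const std::vector<Vertex>& model) {
              layout.apply(model, view);         // the Controller decides when and which View gets touched
          }
      private:
          View& view;
      };

    Whether the Controller hands the View to the layout (as here) or translates layout results into its own View calls is mostly a question of how much of the View's interface you want to shield.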

    Read the article

  • LOD in modern games

    - by Firas Assaad
    I'm currently working on my master's thesis about LOD and mesh simplification, and I've been reading many academic papers and articles on the subject. However, I can't find enough information about how LOD is actually used in modern games. I know many games use some sort of dynamic LOD for terrain, but what about elsewhere? Level of Detail for 3D Graphics, for example, points out that discrete LOD (where artists prepare several models in advance) is widely used because of the performance overhead of continuous LOD. That book was published in 2002, however, and I'm wondering whether things are different now. There has been some research into performing dynamic LOD using the geometry shader (this paper, for example, with its implementation in ShaderX6); would that be used in a modern game? To summarize, my question is about the state of LOD in modern video games: what algorithms are used, and why? In particular, is view-dependent continuous simplification used, or does the runtime overhead make discrete models with proper blending and impostors a more attractive solution? If discrete models are used, is an algorithm (e.g. vertex clustering) used to generate them offline, do artists manually create the models, or is a combination of both methods used?

    Read the article

  • How can I improve the "smoothness" of a 2D side-scrolling iPhone game?

    - by MrDatabase
    I'm working on a relatively simple 2D side-scrolling iPhone game. The controls are tilt-based. I use OpenGL ES 1.1 for the graphics. The game state is updated at a rate of 30 Hz, and the drawing is updated at a rate of 30 fps (via NSTimer). The smoothness of the drawing is OK, but not quite as smooth as a game like iFighter. What can I do to improve the smoothness of the game? Here are the potential issues I've briefly considered:
    - I'm varying the opacity of up to 15 "small" (20x20 pixel) textures at a time; apparently varying the opacity in this manner can degrade drawing performance.
    - I'm rendering at only 30 fps (via NSTimer); perhaps 2D games like iFighter are rendered at a higher frame rate?
    - Perhaps the game state could be updated at a faster rate? Note that the acceleration values are updated at 100 Hz, so I could potentially update part of the game state at 100 Hz.
    - All of my textures are PNG24; perhaps PNG8 would help (due to smaller size etc.).

    Read the article

  • Ubuntu won't load, freezes on purple screen

    - by kara
    Last time I restarted my computer, I could not get Ubuntu to load; the screen would either go black or would hang at the purple screen indefinitely. I have had some graphics problems in the past, but had put 'nomodeset' after 'quiet splash' in the GRUB command line, which at least let Ubuntu load. That doesn't work now, and neither does removing it. I looked up some answers, such as this one: Purple start screen - no splash screen. However, when I enter the root shell in recovery mode from GRUB, I always get errors when I run those commands and it won't let me modify the files. Also, if I run in recovery mode and then choose 'resume normal boot', it will continue, but instead of getting the usual interface I get a black screen that asks for my username and password. I enter these, and it tells me I'm in Ubuntu 12.04, but I'm still on a black screen with text. It also informs me that there are updates to install. When I use the command 'sudo apt-get update', it starts to retrieve the information, but then the screen goes blank after a couple of seconds and I can't do anything anymore. Any ideas?
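    On the "won't let me modify the files" point: the recovery-mode root shell usually mounts the root filesystem read-only, so one common sequence (a sketch only, assuming the stock GRUB layout) is to remount it read-write and then make nomodeset permanent:

      mount -o remount,rw /          # recovery mode mounts / read-only by default
      nano /etc/default/grub         # set GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
      update-grub                    # regenerate grub.cfg so the change sticks on the next boot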

    Read the article

  • Creating a 2D perspective in a 3D game

    - by Accatyyc
    I'm new to XNA and 3D game development in general. I'm creating a puzzle game somewhat similar to Tetris, built out of blocks. I decided to build the game in 3D since I can do some cool animations and transitions when using 3D blocks with physics etc. However, I really do want the game to look "2D". My blocks are made up of 3D models, but I don't want that to be visible when they're not animating. I have followed some XNA tutorials and set up my scene like this:

      this.view = Matrix.CreateLookAt(cameraPosition, Vector3.Zero, Vector3.Up);
      this.aspectRatio = graphics.GraphicsDevice.Viewport.AspectRatio;
      this.projection = Matrix.CreatePerspectiveFieldOfView(
          MathHelper.ToRadians(45.0f), aspectRatio, 1.0f, 10000.0f);

    ...and it gives me a very 3D-ish look. For example, the blocks in the center of the screen look exactly how I want them, but closer to the edges of the screen I can see their rotation and sides. My guess is that I'm not after a perspective field of view, but any help on which field of view/settings to use to get a "flat" look when the blocks aren't rotated would be great!
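    If a flat look is the goal, an orthographic projection removes the perspective convergence entirely. A minimal sketch; the one-world-unit-per-pixel mapping below is an assumption, so block sizes may need rescaling to match:

      // Orthographic: parallel projection, so blocks near the screen edges
      // show the same face as blocks in the center.
      this.projection = Matrix.CreateOrthographic(
          graphics.GraphicsDevice.Viewport.Width,
          graphics.GraphicsDevice.Viewport.Height,
          1.0f, 10000.0f);

    Alternatively, keeping CreatePerspectiveFieldOfView but using a much smaller angle and moving the camera further back gives a similar, flatter look.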

    Read the article

  • Rendering a DOM across multiple displays

    - by meetamit
    I'm building a data-driven animation with HTML and JavaScript to run in a web browser. I would like to display it tiled across three 1080p monitors, which essentially yields a viewport 5760px wide and 1080px tall. Pretty large. Does anyone have experience setting up something like this? I have many questions below, but any tip would be appreciated:
    - Is it reasonable to expect a DOM to render into such a large viewport at close to 60 fps?
    - I might choose to use canvas instead of SVG or HTML, but that would yield a giant canvas. Can a canvas with such a high resolution be performant? Of course everything depends on the complexity of the graphics I want to render, but I'm looking to remove that factor from this question, so assume I'm asking about a canvas animation that can run at 60 fps at 1920x1080 resolution. Would it run roughly as fast at three times the width?
    - Would three.js and WebGL be a more appropriate approach at that resolution?
    - How do you actually cause Chrome or Firefox to span 3 monitors at full screen? Do I need a third-party solution of any kind?
    Thanks!
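    On the last point, Chrome can at least be forced to a specific position and size from the command line. A sketch only; it assumes the OS presents the three monitors as one continuous desktop, and the URL is a placeholder:

      google-chrome --window-position=0,0 --window-size=5760,1080 --app=http://localhost/animation
      # --app gives a chromeless window sized explicitly; --kiosk is the fullscreen alternative,
      # but fullscreen is typically confined to a single monitor, so the explicit size is safer here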

    Read the article

  • How can I get my monitor's maximum resolution without the proprietary AMD graphics driver installed?

    - by Venki
    I am using Ubuntu 14.04 and have an AMD Radeon HD 5570 graphics card. The default open-source REDWOOD driver isn't letting me choose my monitor's maximum resolution (1366x768); only two resolutions are offered, 1024x768 and 800x600. If I run the command xrandr -s 1366x768, the output is: Size 1366x768 not found in available modes. So, just to get the 1366x768 resolution, I am forced to install the proprietary graphics driver that AMD provides on its site. But if I install it (which is itself quite a problem-prone process), I run into a lot of 'inconvenience'. Sometimes after an OS update the driver crashes Unity, and then I have to uninstall it from a tty and google around for a solution. I also encounter screen-tearing problems occasionally, and in addition I can't see my login screen (see this question, which describes that particular problem). The main problem is that AMD does not update its driver as quickly as Ubuntu updates its OS, which is quite irritating. So: I want the maximum resolution (and performance) that my graphics card and monitor can give me without installing AMD's 'problematic' proprietary driver. Is this possible? Suggestions please. Thanks in advance. PS: more system specs: Intel i3 2100 processor, AMD P8H61-M PLUS2 motherboard, AMD Radeon HD 5570 graphics card, DELL monitor. (BTW, thank you for reading through my elaborate description!)
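    With the open-source driver it is sometimes possible to add the missing mode manually with cvt and xrandr. A sketch only: cvt tends to round 1366 up to 1368, VGA-0 below is an assumed output name (check xrandr for the real one), and the modeline numbers must be copied from cvt's own output:

      cvt 1366 768                                   # prints a Modeline (likely named "1368x768_60.00")
      xrandr --newmode "1368x768_60.00" <numbers exactly as printed by cvt>
      xrandr --addmode VGA-0 1368x768_60.00
      xrandr --output VGA-0 --mode 1368x768_60.00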

    Read the article

  • How to disable discrete GPU using NVIDIA drivers?

    - by penzoiders
    I have a Dell Studio XPS 13 (aka 1340). As of 12.04 most things run smoothly out of the box, but I have some power-drain and warmth issues (if not to be called terrible heat issues). The system came with an NVIDIA GeForce 9500M (which has Hybrid SLI), and it shows up in lspci as these two cards:

      02:00.0 VGA compatible controller: NVIDIA Corporation G98 [GeForce 9200M GS] (rev a1)
      03:00.0 VGA compatible controller: NVIDIA Corporation C79 [GeForce 9400M G] (rev b1)

    I had to install nvidia-current instead of the nouveau driver because nouveau freezes the system after suspend. Installing nvidia-current and running nvidia-xconfig fixes the resume-after-suspend process. However, with both nvidia-current and nouveau the system drains a lot of battery and heats up a lot; I suppose this is because the discrete GPU is always on. I don't really need 3D graphics on this system, beyond the minimum needed to run Unity and Compiz for window management. So my question is: how do I disable, while using nvidia-current, the discrete 9200M GPU and use only the integrated 9400M? Notes:
    - In the BIOS I have no option to disable the discrete GPU.
    - This, I think, is not applicable because of the suspend-freeze issue (with nouveau): https://help.ubuntu.com/community/HybridGraphics
    - I've found this, but I don't know which --sli option I should choose to fit my needs: http://manpages.ubuntu.com/manpages/hardy/man1/nvidia-xconfig.1.html
    - My system has neither Optimus nor CUDA, but can anyone tell me whether Bumblebee would work for me?
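    For what it's worth, the kernel's vga_switcheroo interface can switch to the integrated GPU and power the other one down, but as far as I know it only works with the KMS/open-source stack (nouveau), not with nvidia-current, so this is only a sketch of that route:

      sudo mount -t debugfs none /sys/kernel/debug           # if debugfs isn't already mounted
      cat /sys/kernel/debug/vgaswitcheroo/switch             # shows which GPU is currently active
      echo IGD > /sys/kernel/debug/vgaswitcheroo/switch      # switch to the integrated GPU
      echo OFF > /sys/kernel/debug/vgaswitcheroo/switch      # power down whichever GPU is inactive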

    Read the article

  • What is wrong with my specular Phong shading?

    - by Thijser
    I'm sorry if this should be placed on Stack Overflow instead; however, seeing as this is graphics-related, I was hoping you guys could help me. I'm attempting to write a Phong shader and am currently working on the specular term. I came across the following formula: base*pow(dot(V,R),shininess) and attempted to implement it (V is the position of the viewer and R the reflection vector). This gave the following result and code:

      Vec3Df phongSpecular(const Vec3Df & vertexPos, Vec3Df & normal,
                           const Vec3Df & lightPos, const Vec3Df & cameraPos,
                           unsigned int index)
      {
          Vec3Df relativeLightPos = (lightPos - vertexPos);
          relativeLightPos.normalize();
          Vec3Df relativeCameraPos = (cameraPos - vertexPos);
          relativeCameraPos.normalize();
          int DotOfNormalAndLight = Vec3Df::dotProduct(normal, relativeLightPos);
          Vec3Df reflective = (relativeLightPos - (2 * DotOfNormalAndLight * normal)) * -1;
          reflective.normalize();
          float phongyness = Vec3Df::dotProduct(reflective, relativeCameraPos);
          if (phongyness < 0) {
              phongyness = 0;
          }
          float shininess = Shininess[index];
          float speculair = powf(phongyness, shininess);
          return Ks[index] * speculair;
      }

    I'm looking for something more like this:

    Read the article

  • Scale an image with unscalable parts

    - by Uko
    Brief description of the problem: imagine having one or more vector pictures with text annotations on the sides, outside of the picture(s). The task is to scale the whole composition, preserving its aspect ratio, so that it fits some viewport. The tricky part is that the text is not scalable; only the picture(s) are. The distance between the text and the image is still relative to the whole image, but the text size is always constant. Example: let's assume that our total composition is two times larger than the viewport. Then we could just scale it by 1/2, but because the text parts have a fixed font size, they will end up larger than we expect and won't fit in the viewport. One option I can think of is an iterative process where we repeatedly scale the composition until the difference between it and the viewport satisfies some tolerance. But this algorithm is quite costly, as it involves working with the graphics, and the image may be composed of many components, which leads to a lot of matrix computations. What's more, this solution seems hard to debug, extend, etc. Are there any other approaches to solving this scaling problem?
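    A possible shortcut (a sketch, assuming the annotations sit entirely outside the scaled area): if everything scales with the factor s except the text size, the composition's extent along one axis is linear in s, roughly extent(s) = s * E_scaled + E_text, where E_scaled is the scalable extent (picture plus the relative offsets) and E_text is the total fixed text extent. Setting extent(s) equal to the viewport size V gives s = (V - E_text) / E_scaled directly; computing this for width and height and taking the smaller s preserves the aspect ratio without any iteration.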

    Read the article

  • Which is a better graphics card

    - by michael
    Hi, can someone please give me advice on which of the following is the better graphics card: the Radeon HD 4650 1GB or the NVIDIA GeForce GT 240M 1GB? Or which brand is better in general, Radeon or NVIDIA? Thank you.

    Read the article

  • Integrated Windows authentication in IIS causing ADO.NET failure

    - by TrueWill
    We have a .NET 3.5 Web Service running under IIS. It must use identity impersonate="true" and Integrated Windows authentication in order to authenticate to third-party software. In addition, it connects to a SQL Server database using ADO.NET and SQL Server Authentication (specifying a fixed User ID and Password in the connection string). Everything worked fine until the database was moved to another SQL Server. Then the Web Service would throw the following exception: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server) This error only occurs if identity impersonate is true in the Web.config. Again, the connection string hasn't changed and it specifies the user. I have tested the connection string and it works, both under the impersonated account and under the service account (and from both the remote machine and the server). What needs to be changed to get this to work with impersonation?
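    One avenue worth checking (a sketch, not a confirmed diagnosis): the error names the Named Pipes provider, and named-pipe connections authenticate to the pipe with the Windows identity, which under impersonation is the caller's token rather than the service account. Forcing the TCP provider in the connection string sidesteps that, since SQL Server authentication then carries the credentials. The server, port, and database names below are placeholders:

      <connectionStrings>
        <add name="AppDb"
             connectionString="Data Source=tcp:NewDbServer,1433;Initial Catalog=AppDb;User ID=appUser;Password=...;"
             providerName="System.Data.SqlClient" />
      </connectionStrings>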

    Read the article

  • Do not allow integrated Windows authentication *for one of the domains*

    - by MK
    We have an ASP.NET web application which uses integrated Windows authentication. It is accessed by users from two domains, A and B. A is the primary domain and B is an older domain which is going away. The web application authenticates users using a group policy which only exists in domain A. Every user in domain B has an account in domain A, and the application lives in domain A. There was no trust between the domains, so users from domain A would get silently authenticated and logged into the site, while users from domain B didn't get authenticated automatically and were prompted with the IE popup, to which they responded with their domain A credentials, and everything worked. Now somebody has set up a trust between the domains, so users from domain B get authenticated to IIS silently, and then their login fails (no group policy). So the question is: can I, either programmatically or in the IIS configuration, make it so that users from domain B are still prompted even though there is trust between the domains? Is there perhaps a way to tell the server where IIS is running to ignore the trust relationship?

    Read the article

  • AS3 Embed Image Instead of drawing graphics

    - by David
    I've got a custom AS3 scrollbar that I need to modify. The scroll thumb (the draggable part of the scrollbar) is currently just drawn as a rectangle, but I need it to look more like a real scrollbar. I'm not sure how to modify the code below to import/use a scroll-thumb image:

      scrollThumb = new Sprite();
      scrollThumb.graphics.lineStyle();
      scrollThumb.graphics.beginFill(0x0066ff);
      scrollThumb.graphics.drawRect(0, 0, 1, 1);
      addChild(scrollThumb);

    I know that I would embed an image by doing something like this:

      [Embed(source="images/image1.png")]
      private static var Image1Class:Class;

    But then how do I set the scrollThumb to the image? Thanks!
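    One way to do it (a sketch: with a Flex-style [Embed] the generated class can be instantiated and cast to a Bitmap, and the sizing numbers below are assumptions) is to add the bitmap as a child of the thumb sprite instead of drawing into its graphics:

      var thumbImage:Bitmap = new Image1Class() as Bitmap;  // instantiate the embedded asset
      thumbImage.width = 16;                                // assumed thumb size; match your skin
      thumbImage.height = 64;
      scrollThumb = new Sprite();
      scrollThumb.addChild(thumbImage);                     // the sprite stays draggable as before
      addChild(scrollThumb);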

    Read the article

  • How can I make the Qt Graphics View Framework support custom layers?

    - by jnblue
    Qt's Graphics View Framework is very powerful, but I have not found a way to support custom layers. In Qt there is QGraphicsScene::ItemLayer, but QGraphicsScene renders all items in that single layer. I want to manage items in several layers, just like Illustrator and CorelDRAW: only the items in the current layer should receive events, be selectable, or get keyboard focus, and the other (non-current) layers should not receive any scene events. My main reason for using layers is that I could catalogue a large number of items more clearly, and if events don't have to be delivered to every layer's items, I think the framework will also be more efficient. A last question: does QGraphicsView support rendering several stacked graphics scenes at the same time? If it does, I think the "custom layers" problem could be solved that way. Thanks very much!
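    One common workaround (only a sketch, not a built-in layer feature) is to use one parent item per layer: as far as I understand the Qt docs, disabling a parent item also disables its children, so a disabled "layer" item keeps its children visible while blocking their events, and z-values on the parents give the stacking order.

      #include <QApplication>
      #include <QGraphicsScene>
      #include <QGraphicsView>
      #include <QGraphicsItem>
      #include <QGraphicsItemGroup>
      #include <QGraphicsRectItem>

      int main(int argc, char *argv[])
      {
          QApplication app(argc, argv);
          QGraphicsScene scene;

          // Each "layer" is just a parent item; items are parented to a layer.
          QGraphicsItemGroup *background = new QGraphicsItemGroup;
          QGraphicsItemGroup *foreground = new QGraphicsItemGroup;
          background->setHandlesChildEvents(false);   // let children receive their own events/selection
          foreground->setHandlesChildEvents(false);
          scene.addItem(background);
          scene.addItem(foreground);
          background->setZValue(0);                   // stacking order of the layers
          foreground->setZValue(1);

          // Put an item on a layer by making the layer its parent.
          QGraphicsRectItem *rect = new QGraphicsRectItem(0, 0, 100, 50, background);
          rect->setFlag(QGraphicsItem::ItemIsSelectable, true);

          // Only the "current" layer is interactive: the disabled parent's children
          // ignore mouse/key events but stay visible.
          background->setEnabled(false);
          foreground->setEnabled(true);

          QGraphicsView view(&scene);
          view.show();
          return app.exec();
      }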

    Read the article

  • Playing with Graphics in Flex

    - by Anoop
    Hi all, I was just going through some code used to draw a chart. This code is written in the updateDisplayList() of a column chart's item renderer. I am not good at the graphics part of Flex. Can anybody please explain to me what this code is doing? I can see the final output, but I am not sure how it is achieved.

      var rc:Rectangle = new Rectangle(0, 0, width, height);  // the renderer's full area
      var g:Graphics = graphics;
      g.clear();                       // erase anything drawn previously
      g.moveTo(rc.left, rc.top);       // start at the top-left corner
      g.beginFill(fill);               // begin a solid fill with the 'fill' colour
      g.lineTo(rc.right, rc.top);      // top edge
      g.lineTo(rc.right, rc.bottom);   // right edge
      g.lineTo(rc.left, rc.bottom);    // bottom edge
      g.lineTo(rc.left, rc.top);       // left edge, back to the start
      g.endFill();                     // close the path; the rectangle is filled

    Regards, PK

    Read the article

  • AS3 this.graphics calls do nothing

    - by zzz
    class A:

      [SWF(width='800', height='600', frameRate='24')]
      public class A extends MovieClip
      {
          private var b:B;

          public function A() {
              super();
              b = new B();
              addChild(b);
              addEventListener(Event.ENTER_FRAME, update);
          }

          private function update(e:Event):void {
              b.draw();
          }
      }

    class B:

      public class B extends MovieClip
      {
          public function draw():void {
              //! the following code works fine if put in the constructor, but not here
              this.graphics.beginFill(0xff0000);
              this.graphics.drawCircle(200, 200, 50);
          }
      }

    The this.graphics calls do nothing in the draw() method, but work fine inside B's constructor. What am I doing wrong?

    Read the article

  • Why is my second monitor not working?

    - by StampedeXV
    Since I got my new computer, I have a very weird problem. Facts:

    New computer:
      Motherboard: ASRock Z77 Pro 3
      Graphics card: Asus 1GB D5 X EN GTX560 DCII OC/2DI R
      CPU: Intel i5-3570
      Windows 7 64-bit
      500W beQuiet special edition (92% efficiency)
      8GB 1333MHz DDR3 Corsair RAM (CL9)
      Scythe Mugen 2
      2 magnetic HDDs + 1 SSD
      1 DVD-R

    Old computer:
      Motherboard: Asus P55 something
      Graphics card: Asus 1GB D5 X EN GTX560 DCII OC/2DI R
      CPU: Intel i7-870
      Windows 7 64-bit
      550W Corsair
      8GB 1333MHz DDR3 Corsair RAM (CL9)
      Scythe Mugen 3
      2 magnetic HDDs + 1 SSD
      1 DVD-R

    On the old computer this worked fine with two monitors. Moving to the new one (I kept the same graphics card), it only works with one, and the weird thing is that it doesn't matter which one: if I connect both, only one of them is available. There is no reaction at startup (where normally, at least if I remember correctly, the monitor briefly goes from "standby" to "on"). Windows does not recognize a second monitor in Device Manager. I have the latest drivers for the motherboard and graphics card, and the latest BIOS. I am out of ideas. Edit: completed computer setup

    Read the article

  • If I run two monitors from two different graphics cards, can I still have TwinView?

    - by rumtscho
    I am planning to get a second 2560x1440 monitor for home. The trouble is, I only have 1x DVI and 1x VGA on my graphics card (a 250 GT). I don't want to buy a new graphics card until the prices for the 500 series have stabilized, so probably not before summer (or will it happen earlier? I don't remember how it went for other series, and I couldn't find a long-term price history for video cards). The solution I had in mind is to take the 7600 GS from my old PC, which also has 1x DVI and 1x VGA, and run each monitor on a separate card. I have never done that, and I was wondering: 1. whether I will be able to run the monitors in TwinView then, or whether I will be stuck with separate X sessions, and 2. whether there are other disadvantages compared to a single-head graphics card. (I am using the proprietary driver because I need Compiz.) As an aside, how do I find out whether the DVI port on the old graphics card is dual-link?
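    As far as I know, TwinView is a single-GPU feature, so two separate cards usually mean either separate X screens or Xinerama. A rough xorg.conf sketch of the Xinerama route follows; the BusIDs and identifiers are placeholders (take the real values from lspci), and note that Compiz historically does not run under Xinerama because it disables compositing:

      Section "Device"
          Identifier "GPU0"
          Driver     "nvidia"
          BusID      "PCI:1:0:0"        # placeholder
      EndSection

      Section "Device"
          Identifier "GPU1"
          Driver     "nvidia"
          BusID      "PCI:2:0:0"        # placeholder
      EndSection

      Section "Screen"
          Identifier "Screen0"
          Device     "GPU0"
      EndSection

      Section "Screen"
          Identifier "Screen1"
          Device     "GPU1"
      EndSection

      Section "ServerLayout"
          Identifier "Layout0"
          Screen 0 "Screen0"
          Screen 1 "Screen1" RightOf "Screen0"
          Option "Xinerama" "1"
      EndSection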

    Read the article

  • Unity desktop "smears" (doesn't refresh) and shows no wallpaper

    - by Cedric Reichenbach
    For a couple of days now, my Unity desktop background has been smearing everything, just like what old Windows versions were famous for. Of course I tried rebooting a couple of times. I also switched the graphics driver and tried changing the wallpaper and theme, but none of that solved the problem. What could be causing it, and where can I search for its source?

    Information update: I'm using Ubuntu 13.04 (not yet updated to 13.10). The following commands were all run from Cinnamon (on the same Ubuntu installation).

    sudo lsb_release -a:
      No LSB modules are available.
      Distributor ID: Ubuntu
      Description: Ubuntu 13.04
      Release: 13.04
      Codename: raring

    sudo uname -a:
      Linux cedric-MacBookPro 3.8.0-32-generic #47-Ubuntu SMP Tue Oct 1 22:35:23 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux

    sudo dpkg -l | grep xserver-xorg-video:
      ii xserver-xorg-video-all 1:7.7+1ubuntu4 amd64 X.Org X server -- output driver metapackage
      ii xserver-xorg-video-ati 1:7.1.0-0ubuntu2 amd64 X.Org X server -- AMD/ATI display driver wrapper
      ii xserver-xorg-video-cirrus 1:1.5.2-0ubuntu1 amd64 X.Org X server -- Cirrus display driver
      ii xserver-xorg-video-fbdev 1:0.4.3-0ubuntu1 amd64 X.Org X server -- fbdev display driver
      ii xserver-xorg-video-intel 2:2.21.6-0ubuntu4.3 amd64 X.Org X server -- Intel i8xx, i9xx display driver
      ii xserver-xorg-video-mach64 6.9.3-0ubuntu1 amd64 X.Org X server -- ATI Mach64 display driver
      ii xserver-xorg-video-mga 1:1.6.2-0ubuntu1 amd64 X.Org X server -- MGA display driver
      ii xserver-xorg-video-modesetting 0.7.0-0ubuntu2 amd64 X.Org X server -- Generic modesetting driver
      ii xserver-xorg-video-neomagic 1:1.2.7-0ubuntu1 amd64 X.Org X server -- Neomagic display driver
      ii xserver-xorg-video-nouveau 1:1.0.7-0ubuntu1 amd64 X.Org X server -- Nouveau display driver
      ii xserver-xorg-video-openchrome 1:0.3.1-0ubuntu1.13.04.1 amd64 X.Org X server -- VIA display driver
      ii xserver-xorg-video-qxl 0.1.0-0ubuntu3 amd64 X.Org X server -- QXL display driver
      ii xserver-xorg-video-r128 6.9.1-0ubuntu1 amd64 X.Org X server -- ATI r128 display driver
      ii xserver-xorg-video-radeon 1:7.1.0-0ubuntu2 amd64 X.Org X server -- AMD/ATI Radeon display driver
      ii xserver-xorg-video-s3 1:0.6.5-0ubuntu3 amd64 X.Org X server -- legacy S3 display driver
      ii xserver-xorg-video-savage 1:2.3.6-0ubuntu1 amd64 X.Org X server -- Savage display driver
      ii xserver-xorg-video-siliconmotion 1:1.7.7-0ubuntu1 amd64 X.Org X server -- SiliconMotion display driver
      ii xserver-xorg-video-sis 1:0.10.7-0ubuntu1 amd64 X.Org X server -- SiS display driver
      ii xserver-xorg-video-sisusb 1:0.9.6-0ubuntu1 amd64 X.Org X server -- SiS USB display driver
      ii xserver-xorg-video-tdfx 1:1.4.5-0ubuntu1 amd64 X.Org X server -- tdfx display driver
      ii xserver-xorg-video-trident 1:1.3.6-0ubuntu2 amd64 X.Org X server -- Trident display driver
      ii xserver-xorg-video-vesa 1:2.3.2-0ubuntu1 amd64 X.Org X server -- VESA display driver
      ii xserver-xorg-video-vmware 1:12.0.2+git.e5ac80d8-0ubuntu1 amd64 X.Org X server -- VMware display driver

    sudo lspci | grep VGA:
      01:00.0 VGA compatible controller: NVIDIA Corporation GT216M [GeForce GT 330M] (rev a2)

    Read the article

  • GPU hung when switching graphics cards

    - by Lie Ryan
    I have a laptop (Dell Inspiron N4110) with switchable graphics.

      $ lspci | grep VGA
      00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
      01:00.0 VGA compatible controller: ATI Technologies Inc NI Whistler [AMD Radeon HD 6600M Series] (rev ff)

    Normally my laptop starts with both graphics cards enabled, which makes it run very hot and the fan very noisy, so I have been using a small script to disable the Radeon card. For some time I was quite happy with this arrangement. However, I have been having some issues with the Intel card (IGD): it often randomly hangs when running OpenGL apps, so I want to give the Radeon card (DIS) another chance. I have never been able to switch to the Radeon card, but recently I found that if I request a "delayed switch" (DDIS):

      # echo "DDIS" > /sys/kernel/debug/vgaswitcheroo/switch
      root@lieryan-dell-ubuntu:/sys/kernel/debug/vgaswitcheroo# cat switch
      0:IGD:+:Pwr:0000:00:02.0
      1:DIS: :Pwr:0000:01:00.0

    and then log off (i.e. restart X), the screen switches to a pseudo-tty and then gets stuck there, frozen. At that point the mouse and keyboard stop working, so I can't switch to another tty. I tried ssh-ing in from another computer to salvage logs (dmesg at that point) and whatnot, and found that while it is frozen the active graphics card is the AMD card:

      -- this is from ssh --
      # cat switch
      0:IGD: :Off:0000:00:02.0
      1:DIS:+:Pwr:0000:01:00.0

    but the GPU is apparently hung; dmesg shows:

      ...
      [ 1411.649974] vga_switcheroo: client 0 refused switch
      [ 1411.649985] vga_switcheroo: setting delayed switch to client 1
      [ 1423.911759] vga_switcheroo: processing delayed switch to 1
      [ 1424.006564] fbcon: Remapping primary device, fb1, to tty 1-63
      [ 1424.006799] i915: switched off
      [ 1424.840351] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
      [ 1425.718088] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
      [ 1426.622377] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
      [ 1427.355683] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
      [ 1428.193549] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
      ...

    with the "invalid framebuffer id" error repeated many times over. I was able to recover by switching back to the Intel card and restarting X from ssh, which suggests that only the Radeon card has problems switching. System info:

      $ uname -a
      Linux lieryan-dell-ubuntu 3.0.0-14-generic #23-Ubuntu SMP Mon Nov 21 20:28:43 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux
      $ lsb_release -a
      No LSB modules are available.
      Distributor ID: Ubuntu
      Description: Ubuntu 11.10
      Release: 11.10
      Codename: oneiric

    The laptop's BIOS has no option to select the graphics card, and the proprietary fglrx driver has never worked either: when I installed it through Jockey ("Additional Drivers"), glxinfo showed that rendering was still done by Mesa, the /sys/kernel/debug/vgaswitcheroo directory went missing, and the driver crashes with a traceback if I use xorg.conf to tell X to use fglrx. Does anyone have any idea whether it is possible to use this AMD card with either the radeon or the fglrx driver? logs: dmesg

    Read the article

  • Cannot get 3D OpenGL support in VMware guests, how can I fix this?

    - by jjapol
    I have been working at this problem for two days now. I cannot for the life of me enable 3D support in VMware 9 guests. My specifications are:

      Hardware: Dell Latitude E5520 laptop
      Processor: Intel i7-2620M CPU @ 2.70GHz × 4
      Memory: 8GB
      Video: Intel Sandybridge Mobile x86/MMX/SSE2
      OS: Ubuntu 12.04.1 LTS, 32 bit
      VMware Workstation: 9.0.1 build-894247

    Glxgears functions fine; the frame rate is ~60 fps. The VMware guest is Windows 7. Starting the Windows 7 guest in VMware throws the following errors: "No 3D support is available from the host." and "Hardware graphics acceleration is not available." I've read through this VMware forum thread, but again the hardware in that post is different (NVIDIA). I've followed the instructions at this Ask Ubuntu post as closely as possible, as the question is nearly the same as mine although my hardware is different. Answer 1, regarding setting mks.gl.allowBlacklistedDrivers = TRUE; in my vmx configuration file, causes the VM to crash when it starts. The second answer I followed as closely as possible: I uninstalled VMware, did sudo apt-get install build-essential linux-headers-$(uname -r) at a terminal, added the PPA https://launchpad.net/~glasen/+archive/intel-driver, then ran sudo apt-get update && sudo apt-get upgrade -y, and reinstalled VMware, with the same result: no 3D in guests. I'm getting the feeling that something is awry with the Sandy Bridge driver, but I can't seem to come up with any solutions. Has anyone out there run across this problem too? By the way, the operation of the likes of SolidWorks and AutoCAD within a Windows 7 guest does appear to be improved in VMware 9 vs VMware 8, in spite of the fact that 3D support is lacking in the Windows 7 guest. I'd also add that my glxinfo output was nearly identical to the glxinfo file posted at askubuntu.com/questions/181829/…; I had a total of seven minor differences per a comparison using Meld.

    Read the article
