Search Results

Search found 8498 results on 340 pages for 'integrated graphics'.

Page 20 of 340

  • How do I enable OpenGL 2.0 and WebGL on a GMA 3150?

    - by mahmoudelbadry
    Hi, I have a Dell Mini 1012 with an Intel Atom N450 processor and GMA 3150 integrated graphics, running Ubuntu 10.10. According to Intel's website, this chip supports OpenGL 2.0 (http://software.intel.com/en-us/articles/quick-reference-guide-to-intel-integrated-graphics/#9), but when I run glxinfo in a terminal, the version reported is: OpenGL version string: 1.4 Mesa 7.9-devel. I installed the latest drivers, but it didn't help. How can I enable OpenGL 2.0 on this card? Thanks.
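
    A quick way to confirm which driver and Mesa stack are actually in use before digging further (a sketch; package names assume Ubuntu 10.10's stock Intel/Mesa packages):

      glxinfo | grep -i "opengl"      # renderer, version and GLSL strings currently in effect
      sudo apt-get update
      sudo apt-get install --reinstall xserver-xorg-video-intel libgl1-mesa-dri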

  • Graphics issue on Intel 945 chipset

    - by peeyush tiwari
    I used Intrepid (8.10) and my graphics worked fine there, with Compiz effects and a resolution better than 1024x768. I have now upgraded to Precise (12.04) and use GNOME Classic (with Compiz effects) as my desktop, but the Compiz effects no longer work; only Unity 2D does. Running lshw -c video gives:

      *-display UNCLAIMED
           description: VGA compatible controller
           product: 82945G/GZ Integrated Graphics Controller
           vendor: Intel Corporation
           physical id: 2
           bus info: pci@0000:00:02.0
           version: 02
           width: 32 bits
           clock: 33MHz
           capabilities: msi pm vga_controller bus_master cap_list
           configuration: latency=0
           resources: memory:fea80000-feafffff ioport:dc00(size=8) memory:e0000000-efffffff memory:fea40000-fea7ffff

    Sysinfo shows:

      Display Resolution: 1024x768 pixels
      OpenGL Renderer: Gallium 0.4 on llvmpipe (LLVM 0x300)
      X11 Vendor: The X.Org Foundation

    System Settings shows:

      Memory: 993.3 MiB
      Processor: Intel® Pentium(R) 4 CPU 2.40GHz
      Graphics: VESA: Intel(r) 82945G Chipset Family Graphics
      OS type: 32-bit

    glxgears reports around 100 fps, where it used to be around 900 fps in Intrepid.
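
    The UNCLAIMED flag and the llvmpipe renderer both indicate that no hardware driver has bound to the 945G, so everything is being software-rendered. A first diagnostic pass might look like this (a sketch; package names assume Ubuntu 12.04's stock Intel stack):

      lspci -nnk | grep -A3 VGA        # check whether any kernel driver claims the GPU
      dmesg | grep -iE "i915|drm"      # look for driver load errors
      sudo apt-get install --reinstall xserver-xorg-video-intel libgl1-mesa-dri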

  • Dual monitor setup with Intel graphics and Nvidia GeForce GT 425M on 12.04

    - by fo_x86
    I have a Dell XPS L401x and just installed 12.04. The laptop has a Mini DisplayPort and an HDMI output; on Windows 7 I could hook a Dell UltraSharp monitor up to each port and get a triple-monitor setup. Doing a bit of research, it seems the Mini DisplayPort is wired to the integrated graphics card, whereas the HDMI port is wired to the Nvidia card. I have also installed Bumblebee, as that seemed to be the proper way of installing the Nvidia drivers on Ubuntu. At the moment neither external monitor is recognized by Ubuntu. Is a triple-monitor (laptop display + 2 external) setup even possible? Has anyone done this successfully?
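
    A reasonable first step is to see which outputs the Intel GPU itself exposes, since under Bumblebee the X server normally runs on the Intel chip and only outputs wired to it can be driven directly (a sketch; the output names below are examples and vary by driver):

      xrandr --query                                   # list the outputs X can currently see
      xrandr --output HDMI1 --auto --right-of LVDS1    # enable an external output, if listed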

  • Trouble with 12.10 lag

    - by Brennan
    Well, basically, lately I have been having lag problems with 12.10. I will post my specs below; before the update to 12.10, the system said I had Intel graphics, but now it says Gallium. My specs:

      Memory: 3.9 GiB
      Processor: Pentium(R) Dual-Core CPU E5500 @ 2.80GHz × 2
      Graphics: Gallium 0.4 on llvmpipe (LLVM 3.2, 128 bits) (used to say Intel graphics)
      OS type: 32-bit
      Disk: 486.1 GB

    The output of the command sudo apt-cache check is:

      E: Invalid operation check

    The output of the command sudo lspci -nnk | grep -A5 VGA is:

      00:02.0 VGA compatible controller [0300]: Intel Corporation 4 Series Chipset Integrated Graphics Controller [8086:2e32] (rev 03)
              Subsystem: ASUSTeK Computer Inc. Device [1043:836d]
              Kernel driver in use: i915
      00:1b.0 Audio device [0403]: Intel Corporation NM10/ICH7 Family High Definition Audio Controller [8086:27d8] (rev 01)
              Subsystem: ASUSTeK Computer Inc. Device [1043:8445]
              Kernel driver in use: snd_hda_intel
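
    Two things stand out here: "check" is an apt-get action, not an apt-cache one (hence the error), and the llvmpipe renderer means the desktop has fallen back to software rendering even though the i915 kernel driver is loaded. A quick way to verify both (a sketch, assuming Ubuntu's stock Mesa packages):

      sudo apt-get check                                 # verifies the package database and dependencies
      glxinfo | grep -i "renderer"                       # should name the Intel chip, not llvmpipe
      sudo apt-get install --reinstall libgl1-mesa-dri   # restore the Mesa hardware DRI drivers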

  • Desktop goes blurry / grainy

    - by Bonfire
    After a few minutes of use, the desktop and the fonts in the Unity menu go blurry (see screenshot). Graphics in all applications (browser, etc.) seem to be all right. After rebooting, the problem disappears for a few minutes. Changing the wallpaper also makes the desktop clear for a while, but the fonts stay blurry. I dual-boot Ubuntu 12.04 and Windows XP. lspci shows:

      00:00.0 Host bridge: Intel Corporation 82945G/GZ/P/PL Memory Controller Hub (rev 02)
      00:02.0 VGA compatible controller: Intel Corporation 82945G/GZ Integrated Graphics Controller (rev 02)
      00:02.1 Display controller: Intel Corporation 82945G/GZ Integrated Graphics Controller (rev 02)

    lshw -c video shows:

      *-display:0
           description: VGA compatible controller
           product: 82945G/GZ Integrated Graphics Controller
           vendor: Intel Corporation
           physical id: 2
           bus info: pci@0000:00:02.0
           version: 02
           width: 32 bits
           clock: 33MHz
           capabilities: vga_controller bus_master cap_list rom
           configuration: driver=i915 latency=0
           resources: irq:16 memory:d0100000-d017ffff ioport:30c0(size=8) memory:c0000000-cfffffff memory:d0180000-d01bffff
      *-display:1 UNCLAIMED
           description: Display controller
           product: 82945G/GZ Integrated Graphics Controller
           vendor: Intel Corporation
           physical id: 2.1
           bus info: pci@0000:00:02.1
           version: 02
           width: 32 bits
           clock: 33MHz
           capabilities: cap_list
           configuration: latency=0
           resources: memory:40000000-4007ffff

    Any ideas on how to solve this? Thank you very much! Bonfire
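
    Since a redraw (reboot, wallpaper change) clears the corruption temporarily, it is worth checking the kernel log at the moment the blur appears; on 945-era chips this pattern is often at the driver level rather than in any one application (a diagnostic sketch; log paths assume a standard Ubuntu install):

      dmesg | tail -n 50                    # look for drm/i915 errors right after corruption appears
      grep -iE "drm|i915" /var/log/syslog   # any GPU hangs or mode-setting complaints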

  • Can Emscripten compile down to canvas-based JS instead of WebGL?

    - by Sebastian Scholle
    I understand that Emscripten compiles LLVM down to JS and converts OpenGL calls to WebGL, which is a fairly direct translation. Is there a way to tell Emscripten to use some other graphics library (for example Pixi.js) for its rendering code translations? Is the compiled JS code easy to modify, or would it be better to merge in your own graphics API that handles WebGL/canvas calls? That is: can we use a C++ graphics wrapper library that, when compiled to JS, simply plugs into our own JS graphics wrapper library? I'm assuming yes, but has anyone tried this? And if so, what was your technique, as my C++ skills are basic.
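
    Emscripten does support this pattern through its JS-library mechanism: you declare the rendering functions as extern in C++ and supply their bodies in a JavaScript library file, which is where calls into Pixi.js (or plain canvas) would go. A minimal sketch of the build invocation (the file names here are hypothetical):

      emcc main.cpp my_renderer.cpp -o app.html --js-library my_canvas_library.js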

  • How do I disable an Nvidia 9600GT on a MacBook Pro 5,1?

    - by Gjan
    I finally set up a dual boot with Ubuntu and Lion on my old MacBook Pro 5,1. As reported in many places, the discrete graphics card is turned on all the time, consuming a lot of power and thus heating up the laptop. Since the discrete card does not support Nvidia Optimus, the corresponding package nvidia-prime does not help in this case. So my question is: how do I manually disable the discrete Nvidia 9600GT? Preferably a switch-at-runtime solution, but set-at-boot would be totally fine!
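
    With the open-source nouveau driver, the kernel's vga_switcheroo interface can power the inactive GPU off at runtime, which is the usual switch-on-the-run approach on dual-GPU MacBooks (a sketch; assumes vga_switcheroo support in the running kernel and that debugfs is available):

      sudo mount -t debugfs debugfs /sys/kernel/debug              # only if debugfs is not already mounted
      cat /sys/kernel/debug/vgaswitcheroo/switch                   # list both GPUs and their power state
      echo OFF | sudo tee /sys/kernel/debug/vgaswitcheroo/switch   # power down the inactive card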

  • Unable to install cedarview-graphics-drivers

    - by antnchv
    $ sudo apt-get install cedarview-graphics-drivers

      Reading package lists... Done
      Building dependency tree
      Reading state information... Done
      Some packages could not be installed. This may mean that you have requested
      an impossible situation or if you are using the unstable distribution that
      some required packages have not yet been created or been moved out of Incoming.
      The following information may help to resolve the situation:

      The following packages have unmet dependencies:
       cedarview-graphics-drivers : Depends: xserver-xorg-core (= 2:1.10.99.901)
                                    Depends: xorg-video-abi-11
      E: Unable to correct problems, you have held broken packages.

    Please advise.
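
    The Cedarview driver was built against one specific X server (2:1.10.99.901, a prerelease from the 1.11 development cycle providing xorg-video-abi-11), so on any release shipping a newer X server this dependency simply cannot be satisfied; the "held broken packages" message is a red herring. This can be confirmed before hunting for held packages (a sketch):

      apt-cache policy xserver-xorg-core              # installed X server version vs. the one required
      apt-cache showpkg cedarview-graphics-drivers    # the driver's full dependency picture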

  • IIS7 Integrated Pipeline - Response.End not ending the request.

    - by MikeGurtzweiler
    I have the following bit of code that worked as expected before we upgraded to the Integrated Pipeline in IIS7:

      public void RedirectPermanently(string url, bool clearCookies)
      {
          Response.ClearContent();
          Response.StatusCode = 301;               // permanent redirect
          Response.AppendHeader("Location", url);
          if (clearCookies)
          {
              Response.Cookies.Clear();
              Response.Flush();
              Response.End();                      // expected to stop request processing here
          }
      }

    Previously, when this method executed with clearCookies set to true, the response would be sent to the client and request processing would end. Now, under the Integrated Pipeline, Response.End() does not seem to end processing: the page continues running as if the method were never called. The big question is: why, and what changed? Thanks.

  • Java Graphics not displaying on successive function calls, why?

    - by primehunter326
    Hi, I'm making a visualization for a BST implementation (I posted another question about it the other day). I've created a GUI that displays the viewing area and buttons, and added code to the BST implementation to recursively traverse the tree; the function takes in coordinates along with the Graphics object, which are initially passed in by the main GUI class. My idea was to have this function redraw the tree after every update (add, delete, etc.), first drawing a rectangle over everything to "refresh" the viewing area. This also means I could alter the BST implementation (i.e. by adding a balance operation) without affecting the visualization. The issue I'm having is that the draw function only works the first time it is called; after that it doesn't display anything. I guess I don't fully understand how the Graphics object works, since it doesn't behave the way I'd expect when passed between and called from different functions. I know the getGraphics function has something to do with it. Relevant code:

      private void draw() {
          Graphics g = vPanel.getGraphics();   // vPanel is what I'm drawing on
          tree.drawTree(g, ORIGIN, ORIGIN);
      }

      private void drawTree(Graphics g, BinaryNode<AnyType> n, int x, int y) {
          if (n != null) {
              drawTree(g, n.left, x - 10, y + 10);
              if (n.selected) {
                  g.setColor(Color.blue);
              } else {
                  g.setColor(Color.gray);
              }
              g.fillOval(x, y, 20, 20);
              g.setColor(Color.black);
              g.drawString(n.element.toString(), x, y);
              drawTree(g, n.right, x + 10, y + 10);
          }
      }

    drawTree is passed the root node when it is called by the public function. Do I have to call Graphics g = vPanel.getGraphics(); within the drawTree function? This doesn't make sense!! Thanks for your help.

  • Drawing line graphics leads Flash to spiral out of control!

    - by drpepper
    Hi, I'm having problems with some AS3 code that simply draws on a Sprite's Graphics object. The drawing happens as part of a larger procedure called on every ENTER_FRAME event of the stage. Flash neither crashes nor returns an error. Instead, it starts running at 100% CPU and grabs all the memory it can, until I kill the process manually or my computer buckles under the pressure when it gets up to around 2-3 GB. This happens at a random time, without any noticeable slowdown beforehand. WTF? Has anyone seen anything like this?
    PS: I used to do the drawing within a MOUSE_MOVE event handler, which brought this problem on even faster.
    PPS: I'm developing on Linux, but reproduced the same problem on Windows.
    UPDATE: You asked for some code, so here we are. The drawing function looks like this:

      public static function drawDashedLine(i_graphics : Graphics, i_from : Point, i_to : Point,
                                            i_on : Number, i_off : Number) : void
      {
          const vecLength : Number = Point.distance(i_from, i_to);
          i_graphics.moveTo(i_from.x, i_from.y);
          var dist : Number = 0;
          var lineIsOn : Boolean = true;
          while (dist < vecLength)
          {
              // advance by the "on" or "off" segment length, clamped to the end of the line
              dist = Math.min(vecLength, dist + (lineIsOn ? i_on : i_off));
              const p : Point = Point.interpolate(i_from, i_to, 1 - dist / vecLength);
              if (lineIsOn)
                  i_graphics.lineTo(p.x, p.y);
              else
                  i_graphics.moveTo(p.x, p.y);
              lineIsOn = !lineIsOn;
          }
      }

    and is called like this (m_graphicsLayer is a Sprite):

      m_graphicsLayer.graphics.clear();
      if (m_destinationPoint)
      {
          m_graphicsLayer.graphics.lineStyle(2, m_fixedAim ? 0xff0000 : 0x333333, 1);
          drawDashedLine(m_graphicsLayer.graphics, m_initialPos, m_destinationPoint, 10, 10);
      }

  • Integrated video card not detected after installing new video card

    - by Jaime
    Hi, I have an eMachines ET1331G-03W with a built-in integrated video card (GeForce 6150SE). I added another video card (an e-GeForce 6200 LE) in the hope of utilizing both for a dual-monitor setup. But once the additional card was installed, Windows 7 no longer recognizes the integrated video. Is it possible to use both video cards and run dual monitors? Is there a switch on the motherboard or in the BIOS that I have to set to keep the integrated video card enabled alongside the new card? Thanks!

  • 3 Monitor PCI-e Graphics card on Linux (without tremendous pain)?

    - by N Rahl
    As we are all painfully aware, the only way to get multiple monitors AND compositing (Compiz) on Linux is to use a single graphics card that can drive both (or in my case all three) screens. I bought a Radeon 5750 specifically because it claims to be able to drive 3 monitors. I can plug in 3 monitors (2 DVI, 1 HDMI) and the Catalyst Control Center shows all 3, but only 2 can be enabled at a time. The exact message is:

      The current settings cannot be applied. Possible issues may include:
      - Display(s) cannot be enabled.
      - Setting(s) cannot be applied due to insufficient video memory.

    So I'm going to assume that either the 5750 doesn't support 3 monitors, OR, more likely, ATI couldn't be bothered to add that support to their Linux drivers. So this is a multipart question: First, can anyone suggest a PCI Express graphics card that can run 3 screens on Linux without tremendous pain? I'm looking for something where you install the driver and all three screens "just work". Does such a card exist? Second, if you have a 5750, have you been able to get it to do 3 monitors? I'm running Ubuntu 10.04 at the moment.
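
    One detail worth checking before blaming the driver: on the HD 5000 series, driving a third display reportedly requires one of the outputs to be DisplayPort (or an active DisplayPort adapter), because the card only has two clock sources for DVI/HDMI heads. The connector situation is easy to inspect from X (a sketch; the output names below are hypothetical, adjust to what the query lists):

      xrandr --query                                            # every connector, and which are active
      xrandr --output DisplayPort-0 --auto --right-of DVI-0     # try enabling the third head via DP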

  • Linux: 3 Monitor PCI-e Graphics card (without tremendous pain)?

    - by N Rahl
    As we are all painfully aware, the only way to get multiple monitors AND compositing (Compiz) on Linux is to use a single graphics card that can drive both (or in my case all three) screens. I bought a Radeon 5750 specifically because it claims to be able to drive 3 monitors. I can plug in 3 monitors (2 DVI, 1 HDMI) and the Catalyst Control Center shows all 3, but only 2 can be enabled at a time. I'll post the exact error message here soon, but it's very unhelpful. So I'm going to assume that either the 5750 doesn't support 3 monitors, OR, more likely, ATI couldn't be bothered to add that support to their Linux drivers. So this is a multipart question: First, can anyone suggest a PCI Express graphics card that can run 3 screens on Linux without tremendous pain? I'm looking for something where you install the driver and all three screens "just work". Does such a card exist? Second, if you have a 5750, have you been able to get it to do 3 monitors? I'm running Ubuntu 10.04 at the moment. Thanks, Nick

  • Event taps: Varying results with CGEventPost, kCGSessionEventTap, kCGAnnotatedSessionEventTap, CGEventTapPostEvent

    - by kevingessner
    I'm running into a thorny problem with posting an event from an event tap. I'm tapping for NSSystemDefined events at kCGHIDEventTap, then replacing the event with a new one. The problem is that depending on how I post the event, it is seen by only some applications. My test applications are Opera, Firefox, Quicksilver, and Xcode. Here are the different techniques I've tried within my event tap callback, with results. I'm expecting an action (the "correct response") from each app; "system beep" means the nothing-is-bound-to-that-key system sound.

      1. Create a new event, and return it from the callback.
         Opera: no response/system beep. Firefox: no response/system beep. Quicksilver: correct response. Xcode: no response/system beep.
      2. Create a new event, post to kCGSessionEventTap with CGEventPost, return null.
         Opera: no response/system beep. Firefox: no response/system beep. Quicksilver: correct response. Xcode: no response/system beep.
      3. Create a new event, post to kCGAnnotatedSessionEventTap with CGEventPost, return null.
         Opera: correct response. Firefox: correct response. Quicksilver: no response/system beep. Xcode: no response/system beep.
      4. Create a new event, post with CGEventTapPostEvent, return null.
         Opera: no response/system beep. Firefox: no response/system beep. Quicksilver: correct response. Xcode: no response/system beep.
      5. Create a new event, post to kCGSessionEventTap with CGEventPost, and return the new event.
         Opera: no response/system beep. Firefox: no response/system beep. Quicksilver: correct response. Xcode: no response/system beep.
      6. Create a new event, post to kCGAnnotatedSessionEventTap with CGEventPost, and return the new event.
         Opera: correct response and system beep. Firefox: correct response and system beep. Quicksilver: correct response and system beep. Xcode: no response/double system beep.
      7. Create a new event, post with CGEventTapPostEvent, and return the new event.
         Opera: no response/system beep. Firefox: no response/system beep. Quicksilver: correct response. Xcode: no response/system beep.

    (6) is the best, but users are complaining about the extra system beep on correct responses, which I'm guessing comes from the event being posted twice. I'm not sure what other combinations to try, or where else to look. Can anyone offer any guidance? Is there any way to get the results of both returning the event from my callback and posting to the annotated tap, without doing both? Sorry for the lengthy question; I've been doing a lot of experimenting. Thanks in advance.
    UPDATE: this is the code I use to create the event tap:

      CFMachPortRef eventTap;
      eventTap = CGEventTapCreate(kCGHIDEventTap,
                                  kCGHeadInsertEventTap,
                                  0,
                                  CGEventMaskBit(NX_SYSDEFINED) | (1 << kCGEventKeyDown) | (1 << kCGEventKeyUp),
                                  myCGEventCallback,
                                  (void *)hidEventQueue);

  • Quartz compositions created in Snow Leopard (10.6) don't work in Leopard (10.5) despite testing in runtime

    - by adib
    Hi, I have a reasonably advanced Quartz composition (many patches and subpatches) that was created in Snow Leopard but doesn't run well in Leopard: many elements are not rendered. The composition tested OK via Quartz Composer's Test in Runtime option and works fine for both Leopard 32-bit and Leopard 64-bit (menu item "File | Test in Runtime | Leopard 32-bits"). On an actual Leopard (32-bit) system, though, a lot of elements are not rendered. Below is the log excerpt from running the composition in QuickTime Player under Leopard:

      QuickTime Player[134] *** <QCNodeManager | namespace = "com.apple.QuartzComposer" | 335 nodes>: Patch with name "/units to pixels" is missing
      QuickTime Player[134] *** Message from <QCPatch = 0x06D82880 "(null)">: Cannot create node of class "/units to pixels" and identifier "(null)"
      QuickTime Player[134] *** Message from <QCPatch = 0x06D7C130 "(null)">: Cannot create node of class "/resize image to target" and identifier "(null)"
      QuickTime Player[134] *** Message from <QCPatch = 0x06D7C130 "(null)">: Cannot create connection from ["outputValue" @ "Math_1"] to ["Target_Pixels" @ "Patch_2"]

    The patch "units to pixels" is a system patch in Snow Leopard, whereas "resize image to target" is a custom virtual patch located in my home directory. It seems we can rule out the composition referencing a missing virtual patch: I tested the composition under another user's account and it ran fine, which shows that it already embeds the "resize image to target" virtual patch from my home directory. I'm really puzzled why the composition passes the Leopard runtime test yet fails to run on an actual Leopard system. Is there a post-processing step I need to run on the composition file? Is there any way to make this composition more compatible with Leopard? Thanks in advance.
