Search Results

Search found 5806 results on 233 pages for 'graphics'.


  • Ubuntu 13.10 and Dell Inspiron 7520 SE (Intel 4000 and ATI radeon HD 7730m)

    - by pjgowtham
    I installed 13.10 and I have problems with my display. Everything else works fine, but I can't control my brightness. I thought the graphics driver was the problem, so I went to Software Updates and switched to another (proprietary) driver; the Ubuntu screen went black and I wasn't able to boot either. I can't use my laptop with such low brightness. I then reinstalled 13.10, this time with a full HDD reset, and the same problem occurred. I grabbed the Radeon HD 7730M driver for Linux x64 from the AMD site and installed it, and on the next boot the screen went black with only the "x" cursor showing. What do I need to do to make my brightness controls work and to get the graphics drivers stable, given that mine is a switchable graphics setup?
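
    Before touching the AMD driver again, one hedged thing to try for the brightness keys (a common workaround on switchable-graphics laptops, not a guaranteed fix and not from the original post) is handing backlight control to the vendor ACPI interface via a kernel parameter:

        # In /etc/default/grub, add acpi_backlight=vendor to the default kernel options
        GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi_backlight=vendor"

        # Apply the change and reboot
        sudo update-grub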

    Read the article

  • SAPPHIRE HD 7770 no audio on HDMI TV display

    - by zeroconf
    I have a SAPPHIRE HD 7770 and cannot get audio to work over HDMI. http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1159&lid=1&pid=1452&leg=0 I use Ubuntu 12.04 LTS 64-bit with all current updates. I tried the following in /etc/default/grub:

        GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.audio=1"

    It didn't help, probably because I use the proprietary driver and that option seems to apply to the open source driver. I use the driver that jockey-gtk (Additional Drivers) offered me:

        ATI/AMD proprietary FGLRX graphics driver  <---- I installed this one
        ATI/AMD proprietary FGLRX graphics driver (post-release update)

    I installed the first one because installing the second version failed. Everything went fine, but there is no sound on the TV display over HDMI. The GNOME sound mixer doesn't even show an HDMI choice. The display is a 32" Samsung B530 LCD TV - http://www.lcdbesttv.com/2010/02/samsung-b530-series-lcd-tv/ - and the motherboard is an Asus P8Z77-M - http://www.asus.com/Motherboards/Intel_Socket_1155/P8Z77M/ - which also has integrated HDMI. When I plug the HDMI cord into that port, the GNOME sound mixer does show HDMI audio, but it doesn't work either. In the BIOS I have set the SAPPHIRE HD 7770 on PCIe as the graphics device. My lspci output:

        00:00.0 Host bridge: Intel Corporation Ivy Bridge DRAM Controller (rev 09)
        00:01.0 PCI bridge: Intel Corporation Ivy Bridge PCI Express Root Port (rev 09)
        00:02.0 Display controller: Intel Corporation Ivy Bridge Graphics Controller (rev 09)
        00:14.0 USB controller: Intel Corporation Panther Point USB xHCI Host Controller (rev 04)
        00:16.0 Communication controller: Intel Corporation Panther Point MEI Controller #1 (rev 04)
        00:1a.0 USB controller: Intel Corporation Panther Point USB Enhanced Host Controller #2 (rev 04)
        00:1b.0 Audio device: Intel Corporation Panther Point High Definition Audio Controller (rev 04)
        00:1c.0 PCI bridge: Intel Corporation Panther Point PCI Express Root Port 1 (rev c4)
        00:1c.5 PCI bridge: Intel Corporation Panther Point PCI Express Root Port 6 (rev c4)
        00:1c.6 PCI bridge: Intel Corporation 82801 PCI Bridge (rev c4)
        00:1d.0 USB controller: Intel Corporation Panther Point USB Enhanced Host Controller #1 (rev 04)
        00:1f.0 ISA bridge: Intel Corporation Panther Point LPC Controller (rev 04)
        00:1f.2 SATA controller: Intel Corporation Panther Point 6 port SATA Controller [AHCI mode] (rev 04)
        00:1f.3 SMBus: Intel Corporation Panther Point SMBus Controller (rev 04)
        01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Device 683d
        01:00.1 Audio device: Advanced Micro Devices [AMD] nee ATI Device aab0
        03:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 09)
        04:00.0 PCI bridge: ASMedia Technology Inc. ASM1083/1085 PCIe to PCI Bridge (rev 03)
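
    One diagnostic worth running first (a sketch, not part of the original post): check whether ALSA exposes an HDMI device on the Radeon card at all, independently of PulseAudio:

        # List playback devices; look for an HDMI device on the AMD/ATI card
        aplay -l

        # Test that device directly. The card,device pair below is a placeholder -
        # substitute the numbers reported by "aplay -l" for the HDMI device.
        speaker-test -D plughw:1,3 -c 2 -t wav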

    Read the article

  • How can I set my screen resolution to match my TV?

    - by Scott Severance
    I have a computer in my classroom that's connected to an LG smart TV (that's actually not so smart; I wouldn't recommend buying one). For the touch interface, the TV wants a resolution of 1920x1080 at 60Hz. However, I can't seem to set the computer to that resolution. The display settings only offer 1024x768 and 640x480. The computer dual boots with Windows XP, where widescreen options are available in approximately the required size, but the exact resolution -- or even aspect ratio -- isn't available in XP either. I tried the following command:

        xrandr -s 1920x1080 -r 60

    The response was:

        Size 1920x1080 not found in available modes

    Back in the old days, the solution would be to edit xorg.conf. However, since that file no longer exists, and I haven't found up-to-date info, I don't know what else to do. If it helps, this machine will never be connected to a different display, so resolution flexibility isn't important. Here's the output of lshw:

        *-display:0
             description: VGA compatible controller
             product: 4 Series Chipset Integrated Graphics Controller
             vendor: Intel Corporation
             physical id: 2
             bus info: pci@0000:00:02.0
             version: 03
             width: 64 bits
             clock: 33MHz
             capabilities: vga_controller bus_master cap_list rom
             configuration: driver=i915 latency=0
             resources: irq:42 memory:fe800000-febfffff memory:d0000000-dfffffff ioport:ecd8(size=8)
        *-display:1 UNCLAIMED
             description: Display controller
             product: 4 Series Chipset Integrated Graphics Controller
             vendor: Intel Corporation
             physical id: 2.1
             bus info: pci@0000:00:02.1
             version: 03
             width: 64 bits
             clock: 33MHz

    According to the system settings, my graphics driver is unknown and my "experience" is standard. This is 64-bit Ubuntu 12.04 (Precise). Note: There are a number of similar questions to this one, but they didn't include any answers that helped me.

    Update: After posting this question, I noticed one in the sidebar that I hadn't found through search but which appeared to contain the answer. Based on that question, I created the /etc/X11/xorg.conf file below:

        Section "ServerLayout"
            Identifier     "X.org Configured"
            Screen      0  "Screen0" 0 0
            InputDevice    "Mouse0" "CorePointer"
            InputDevice    "Keyboard0" "CoreKeyboard"
        EndSection

        Section "Files"
            ModulePath   "/usr/lib/xorg/modules"
            FontPath     "/usr/share/fonts/X11/misc"
            FontPath     "/var/lib/defoma/x-ttcidfont-conf.d/dirs/TrueType"
            FontPath     "built-ins"
        EndSection

        Section "Module"
            Load  "glx"
            Load  "dri2"
            Load  "dbe"
            Load  "dri"
            Load  "record"
            Load  "extmod"
        EndSection

        Section "InputDevice"
            Identifier  "Keyboard0"
            Driver      "kbd"
        EndSection

        Section "InputDevice"
            Identifier  "Mouse0"
            Driver      "mouse"
            Option      "Protocol" "auto"
            Option      "Device" "/dev/input/mice"
            Option      "ZAxisMapping" "4 5 6 7"
        EndSection

        Section "Monitor"
            Identifier   "Monitor0"
            VendorName   "LG"
            ModelName    "Smart TV"
        EndSection

        Section "Device"
            ### Available Driver options are:-
            ### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
            ### <string>: "String", <freq>: "<f> Hz/kHz/MHz",
            ### <percent>: "<f>%"
            ### [arg]: arg optional
            #Option     "DRI"                 # [<bool>]
            #Option     "ColorKey"            # <i>
            #Option     "VideoKey"            # <i>
            #Option     "FallbackDebug"       # [<bool>]
            #Option     "Tiling"              # [<bool>]
            #Option     "LinearFramebuffer"   # [<bool>]
            #Option     "Shadow"              # [<bool>]
            #Option     "SwapbuffersWait"     # [<bool>]
            #Option     "TripleBuffer"        # [<bool>]
            #Option     "XvMC"                # [<bool>]
            #Option     "XvPreferOverlay"     # [<bool>]
            #Option     "DebugFlushBatches"   # [<bool>]
            #Option     "DebugFlushCaches"    # [<bool>]
            #Option     "DebugWait"           # [<bool>]
            #Option     "HotPlug"             # [<bool>]
            #Option     "RelaxedFencing"      # [<bool>]
            Identifier  "Card0"
            Driver      "intel"
            BusID       "PCI:0:2:0"
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device     "Card0"
            Monitor    "Monitor0"
            DefaultDepth 24
            #SubSection "Display"
            #    Viewport   0 0
            #    Depth     1
            #EndSubSection
            #SubSection "Display"
            #    Viewport   0 0
            #    Depth     4
            #EndSubSection
            #SubSection "Display"
            #    Viewport   0 0
            #    Depth     8
            #EndSubSection
            #SubSection "Display"
            #    Viewport   0 0
            #    Depth     15
            #EndSubSection
            #SubSection "Display"
            #    Viewport   0 0
            #    Depth     16
            #EndSubSection
            SubSection "Display"
                Viewport   0 0
                Depth     24
                Modes    "1024x768" "1920x1080"
            EndSubSection
        EndSection

    According to /var/log/Xorg.0.log, my settings aren't being applied. In fact, I wonder if the config file is even being read.

        [ 1209.083] (**) intel(0): Depth 24, (--) framebuffer bpp 32
        [ 1209.084] (==) intel(0): RGB weight 888
        [ 1209.084] (==) intel(0): Default visual is TrueColor
        [ 1209.084] (II) intel(0): Integrated Graphics Chipset: Intel(R) G41
        [ 1209.084] (--) intel(0): Chipset: "G41"
        [ 1209.084] (**) intel(0): Relaxed fencing enabled
        [ 1209.084] (**) intel(0): Wait on SwapBuffers? enabled
        [ 1209.084] (**) intel(0): Triple buffering? enabled
        [ 1209.084] (**) intel(0): Framebuffer tiled
        [ 1209.084] (**) intel(0): Pixmaps tiled
        [ 1209.084] (**) intel(0): 3D buffers tiled
        [ 1209.084] (**) intel(0): SwapBuffers wait enabled
        [ 1209.084] (==) intel(0): video overlay key set to 0x101fe
        [ 1209.172] (II) intel(0): Output VGA1 using monitor section Monitor0
        [ 1209.260] (II) intel(0): EDID for output VGA1
        [ 1209.260] (II) intel(0): Printing probed modes for output VGA1
        [ 1209.260] (II) intel(0): Modeline "1024x768"x60.0 65.00 1024 1048 1184 1344 768 771 777 806 -hsync -vsync (48.4 kHz)
        [ 1209.260] (II) intel(0): Modeline "800x600"x60.3 40.00 800 840 968 1056 600 601 605 628 +hsync +vsync (37.9 kHz)
        [ 1209.260] (II) intel(0): Modeline "800x600"x56.2 36.00 800 824 896 1024 600 601 603 625 +hsync +vsync (35.2 kHz)
        [ 1209.260] (II) intel(0): Modeline "848x480"x60.0 33.75 848 864 976 1088 480 486 494 517 +hsync +vsync (31.0 kHz)
        [ 1209.260] (II) intel(0): Modeline "640x480"x59.9 25.18 640 656 752 800 480 489 492 525 -hsync -vsync (31.5 kHz)
        [ 1209.260] (II) intel(0): Output VGA1 connected
        [ 1209.260] (II) intel(0): Using user preference for initial modes
        [ 1209.260] (II) intel(0): Output VGA1 using initial mode 1024x768
        [ 1209.260] (II) intel(0): Using default gamma of (1.0, 1.0, 1.0) unless otherwise stated.
        [ 1209.260] (II) intel(0): Kernel page flipping support detected, enabling
        [ 1209.260] (==) intel(0): DPI set to (96, 96)
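
    For reference, a sketch of the modern run-time equivalent of the old xorg.conf edit, using the VGA1 output name from the log above (the modeline comes from cvt and still has to be one the TV actually accepts on that input):

        # Generate a CVT modeline for 1920x1080 at 60 Hz
        cvt 1920 1080 60

        # Register the mode with X, attach it to the output, and switch to it
        xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
        xrandr --addmode VGA1 "1920x1080_60.00"
        xrandr --output VGA1 --mode "1920x1080_60.00"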

    Read the article

  • Changing DisplayMode seems not to update Input&Graphic Dimension

    - by coding.mof
    I'm writing a small game using Slick and Nifty-GUI. At program startup I set the DisplayMode using the following lines:

        AppGameContainer app = new ...
        app.setDisplayMode( 800, 600, false );
        app.start();

    I wrote a Nifty ScreenController for my settings dialog in which the user can select the desired DisplayMode. When I try to set the new DisplayMode within this controller class, the game window gets resized correctly, but the Graphics and Input objects aren't updated accordingly, so my rendering code only uses part of the new window. I tried setting different DisplayModes in the main method to test whether it's generally possible to invoke this method multiple times; it seems that changing the DisplayMode only works before I call app.start(). Furthermore, I tried to update the Graphics and Input objects manually, but the init and setDimensions methods are package-private. :( Does someone know what I'm doing wrong and how to change the DisplayMode correctly?

    Read the article

  • nvidia optimus laptops

    - by kellogs
        kellogs@kellogs-K52Jc ~ $ lspci
        00:00.0 Host bridge: Intel Corporation Core Processor DRAM Controller (rev 18)
        00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 18)
        00:16.0 Communication controller: Intel Corporation 5 Series/3400 Series Chipset HECI Controller (rev 06)
        00:1a.0 USB controller: Intel Corporation 5 Series/3400 Series Chipset USB2 Enhanced Host Controller (rev 06)
        00:1b.0 Audio device: Intel Corporation 5 Series/3400 Series Chipset High Definition Audio (rev 06)
        00:1c.0 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 1 (rev 06)
        00:1c.1 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 2 (rev 06)
        00:1c.5 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 6 (rev 06)
        00:1d.0 USB controller: Intel Corporation 5 Series/3400 Series Chipset USB2 Enhanced Host Controller (rev 06)
        00:1e.0 PCI bridge: Intel Corporation 82801 Mobile PCI Bridge (rev a6)
        00:1f.0 ISA bridge: Intel Corporation Mobile 5 Series Chipset LPC Interface Controller (rev 06)
        00:1f.2 SATA controller: Intel Corporation 5 Series/3400 Series Chipset 4 port SATA AHCI Controller (rev 06)
        00:1f.6 Signal processing controller: Intel Corporation 5 Series/3400 Series Chipset Thermal Subsystem (rev 06)
        02:00.0 Network controller: Atheros Communications Inc. AR9285 Wireless Network Adapter (PCI-Express) (rev 01)
        03:00.0 System peripheral: JMicron Technology Corp. SD/MMC Host Controller (rev 80)
        03:00.2 SD Host controller: JMicron Technology Corp. Standard SD Host Controller (rev 80)
        03:00.3 System peripheral: JMicron Technology Corp. MS Host Controller (rev 80)
        03:00.4 System peripheral: JMicron Technology Corp. xD Host Controller (rev 80)
        03:00.5 Ethernet controller: JMicron Technology Corp. JMC250 PCI Express Gigabit Ethernet Controller (rev 03)
        ff:00.0 Host bridge: Intel Corporation Core Processor QuickPath Architecture Generic Non-core Registers (rev 05)
        ff:00.1 Host bridge: Intel Corporation Core Processor QuickPath Architecture System Address Decoder (rev 05)
        ff:02.0 Host bridge: Intel Corporation Core Processor QPI Link 0 (rev 05)
        ff:02.1 Host bridge: Intel Corporation Core Processor QPI Physical 0 (rev 05)
        ff:02.2 Host bridge: Intel Corporation Core Processor Reserved (rev 05)
        ff:02.3 Host bridge: Intel Corporation Core Processor Reserved (rev 05)

        kellogs@kellogs-K52Jc ~ $ inxi -SGx
        System:   Host: kellogs-K52Jc Kernel: 3.5.0-17-generic x86_64 (64 bit, gcc: 4.7.2) Desktop: KDE 4.9.5 (Qt 4.8.3) Distro: Linux Mint 14 Nadia
        Graphics: Card: Intel Core Processor Integrated Graphics Controller bus-ID: 00:02.0 X.Org: 1.13.0 drivers: intel (unloaded: fbdev,vesa) Resolution: [email protected] GLX Renderer: Mesa DRI Intel Ironlake Mobile GLX Version: 2.1 Mesa 9.0.3 Direct Rendering: Yes

    The manufacturer advertises the K52Jc model, which I bought, as Optimus-enabled. However, there is no trace of it in the output above, and of course Bumblebee would not start on this machine. Should I rest assured that this is a defective / non-Optimus machine? Thanks.
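
    A quick sanity check worth doing (a sketch, not from the original post) is whether the discrete NVIDIA GPU shows up on the PCI bus at all; if it doesn't, Bumblebee has nothing to drive and the problem is the hardware or the BIOS rather than the driver stack:

        # Look for any NVIDIA VGA/3D/display controller
        lspci -nn | grep -Ei 'vga|3d|display'

        # Ask the kernel to rescan the bus in case the device was powered down, then look again
        sudo sh -c 'echo 1 > /sys/bus/pci/rescan'
        lspci -nn | grep -i nvidia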

    Read the article

  • 11.10 runs really choppy and slow, but not if I choose "Gnome Classic"

    - by Ingram
    I'd like to use Unity with Ubuntu 11.10, but everything in Unity runs really choppy and slow. I have an ATI graphics card and I have installed the drivers through "Additional Drivers". The drivers work, as I can play 3D games flawlessly. But when I drag a selection box around on the desktop with the mouse, or drag windows around, it is really choppy. I previously had Ubuntu 10.10 and everything worked fine. I installed GNOME 3 on 11.10 and it does the same thing Unity does: very choppy and slow graphics. However, if I choose Gnome Classic, everything is fine; I can drag a selection box all around with no problems, and I can drag windows around and it looks and feels great. Is this a bug in Unity? Are others experiencing this? Or is there something I can do to fix it?
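
    One hedged diagnostic (not from the original post): 11.10/12.04 ship a small tool that reports whether the driver stack provides what Unity's compositor needs; comparing its output with what the OpenGL stack reports can show whether Unity is actually getting hardware acceleration or falling back to something slow:

        # Check whether Unity thinks 3D acceleration is usable
        /usr/lib/nux/unity_support_test -p

        # Confirm which OpenGL driver is active (should report the ATI/fglrx renderer, not software rendering)
        fglrxinfo
        glxinfo | grep -i renderer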

    Read the article

  • Designing a game - Where to start?

    - by OghmaOsiris
    A friend of mine and I are planning a game to work on in our free time. It's not an extensive game, but it's not a simple one either. He's working on the story behind the game while I'm working on the graphics and code. I don't really know where to start. We know what basic type of game it's going to be and how it would be played, but I'm having a hard time actually knowing where to begin. I have Xcode open, but I don't really even know what I should be designing first. What is some advice for this kind of writer's block? Where is a good place to start with a game? Should I design all the graphics and layout before even touching Xcode? Should I program the things I know I'll have difficulty with before getting to the easy stuff?

    Read the article

  • Basics of drawing in 2d with OpenGL 3 shaders

    - by davidism
    I am new to OpenGL 3 and graphics programming, and want to create some basic 2D graphics. I have the following scenario of how I might go about drawing a basic (but general) 2D rectangle. I'm not sure if this is the correct way to think about it, or, if it is, how to implement it. In my head, here's how I imagine doing it:

    1. t = make_rectangle(width, height) -- build a general VBO, centered at 0, 0
    2. optionally: t.set_scale(2)
    3. optionally: t.set_angle(30)
    4. t.draw_at(x, y) -- calculates some sort of scale/rotate/translate matrix (or matrices), passes the VBO and the matrix to a shader program
    5. Something happens to clip the world to the view visible on screen.

    I'm really unclear on how 4 and 5 will work. The main problem is that all the tutorials I find either use the fixed-function pipeline, are for 3D, or are unclear about how to do something this "simple". Can someone provide me with either a better way to think of / do this, or some concrete code detailing performing the transformations in a shader and constructing and passing the data required for this shader transformation?

    Read the article

  • Cannot login via Unity login screen after upgrade to 12.04

    - by codesurgeon
    Logging in via the shell accessed through Ctrl+Alt+F1 works, and logging in as guest via the graphical user interface works too. When I try to log into my standard user account via the graphical interface, the screen flashes to black for a couple of seconds and bumps me back to a pristine login screen. Entering a wrong password for my user account yields the standard error message, so my user account and credential verification seem to be OK. I suppose that my individual graphics configuration causes the problem; I'm not sure how to reset that. I've tried stopping the UI via sudo service lightdm stop, executing sudo nvidia-xconfig, and restarting the UI with sudo service lightdm start, to no avail. My workstation has an Nvidia GeForce 560-448 graphics card. I've tried getting this fixed with the latest Nvidia 64-bit drivers (cURL'ed from the official website), that is 295.49, and the latest beta driver 302.07. Anybody have an idea how to get this fixed? Your help is appreciated :)
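
    One common cause of exactly this "flash back to the login screen" loop is per-user session state rather than the graphics driver; a quick hedged check from the Ctrl+Alt+F1 console (a sketch, not from the original post):

        # A root-owned ~/.Xauthority (often left behind by running GUI apps with sudo) breaks graphical login
        ls -l ~/.Xauthority ~/.xsession-errors
        tail -n 30 ~/.xsession-errors

        # If .Xauthority belongs to root, hand it back to your user and try logging in again
        sudo chown "$USER": ~/.Xauthority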

    Read the article

  • GPU On-the-fly encoding of video through Logitech HD Webcam C510

    - by Ashfame
    Originally asked here, but I edited that to move this question as a separate one on the suggestion of a fellow member. I read that I should have a Core2Duo at 2.2GHz for 720p, but I have a 2.0GHz one. Would it be possible to record first and then encode afterwards, if my processor really starts giving issues when doing on-the-fly encoding? I also have an ATI HD 4850 512MB card; can it help with encoding on the fly, or is there a chance that my graphics card alone can handle it and those specs were just for a system without a graphics card? I believe so. Also, I have no problem dealing with the console if I have to do some of these things in a terminal. Other possibly significant details: I have a dual-screen setup, 29" (1360x768) and 22" (1680x1050), which might be using some good power from the GPU, and I have 2GB of DDR2 800MHz RAM.
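
    A hedged sketch of the record-now, encode-later idea, assuming the C510 exposes an MJPEG stream (most UVC webcams in this class do): capture with a plain stream copy so the CPU does no encoding at all, then transcode to H.264 later (on releases where ffmpeg was replaced by avconv, the same options apply to avconv):

        # Capture 720p MJPEG straight from the webcam, no re-encoding
        ffmpeg -f v4l2 -input_format mjpeg -video_size 1280x720 -framerate 30 \
               -i /dev/video0 -c:v copy capture.avi

        # Encode it afterwards, when nothing else needs the CPU
        ffmpeg -i capture.avi -c:v libx264 -preset slow -crf 20 encoded.mp4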

    Read the article

  • Problem when scaling game screen in Libgdx

    - by Nicolas Martel
    Currently, I'm able to scale the screen by applying this bit of code to an OrthographicCamera:

        Camera.setToOrtho(true, Gdx.graphics.getWidth() / 2, Gdx.graphics.getHeight() / 2);

    But something quite strange is happening with this solution; take a look at this picture of my game below. Seems fine, right? But upon further investigation, many components are rendered off by one pixel, as are all the tiles. Take a closer look: I circled a couple of the errors. Note that the shadow of the warrior I circled appears fine for the other warriors. Also keep in mind that everything is rendered at pixel-perfect precision when I disable the scaling. I actually thought of a possible source of the problem as I was writing this, but I decided to still post it because I would assume somebody else might run into the same issue.

    Read the article

  • Ubuntu 13.10 - Unity menu very slow and unresponsive

    - by VukBG
    I have an HP G62 laptop with an Intel Pentium processor and ATI graphics. Since upgrading to Ubuntu 13.10, my Unity menu is very slow and unresponsive. I am using the proprietary driver for my graphics. For example, when I switch between GIMP and Chrome: if I switch to GIMP, I am stuck with the Chrome menu at the top of the screen. It is really annoying, and it is just one of many bugs that came up with 13.10. Any ideas? Should I just revert to the default driver?
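
    If reverting to the default driver turns out to be the route, a hedged sketch of the usual recipe (the exact fglrx package names can vary by release, so check what is actually installed first):

        # See which fglrx packages are present, then remove them
        dpkg -l | grep fglrx
        sudo apt-get purge fglrx fglrx-amdcccle

        # Restore the open source stack and drop any fglrx-generated xorg.conf
        sudo apt-get install --reinstall xserver-xorg-video-ati libgl1-mesa-glx libgl1-mesa-dri
        sudo rm -f /etc/X11/xorg.conf
        sudo update-initramfs -u
        sudo reboot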

    Read the article

  • glx not working, optirun works, gnome in fallback mode

    - by user26766
    It all started when I installed NVIDIA's own driver. Uninstalling it and reverting back to nvidia-current didn't solve the problem, so I have been playing with this for a while. Now nvidia-current seems to be functional, but GLX support seems to be missing and my Intel graphics is not responding; GNOME loads only in fallback mode. Here are some outputs:

        $ glxinfo
        name of display: :0.0
        Error: couldn't find RGB GLX visual or fbconfig

        $ glxgears
        Error: couldn't get an RGB, Double-buffered visual

        $ optirun glxgears
        (works fine)

        $ lspci | grep VGA
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation GF108 [GeForce GT 540M] (rev ff)

    How can I fix this? Thanks.
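
    A frequent cause of this exact symptom is a leftover NVIDIA libGL or libglx from the .run installer shadowing Mesa's copies. A hedged diagnostic and repair sketch (not from the original post):

        # See which libGL the dynamic linker resolves - anything pointing at an NVIDIA path is suspect
        ldconfig -p | grep -i libGL
        ls -l /usr/lib/xorg/modules/extensions/

        # Restore Mesa's libGL and the stock X server GLX module, then restart the session
        sudo apt-get install --reinstall libgl1-mesa-glx libgl1-mesa-dri xserver-xorg-core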

    Read the article

  • Oculus Rift with Antichamber

    - by Scott Hainline
    Antichamber runs great on Linux (Steam version), but it is not playable with the Oculus Rift at this point. The issues are:

    1. no head tracking
    2. graphics are not being split and distorted by the Oculus SDK

    My current plan is to use LD_PRELOAD to add the functionality; this seems to be the Linux equivalent of DLL injection. Antichamber appears to be using SDL, and I'm hoping this can be configured to use the head-tracking data as a joystick and to apply the graphics distortion, but I am not sure which functions I should be looking for. Is there a simpler way of getting these issues resolved? Is SDL the right choice here? I would appreciate any information on how Unreal Engine 3 works under Linux, and on library injection too.
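
    As a sketch of the LD_PRELOAD route (the library path and game binary name below are hypothetical placeholders): Steam substitutes the real game command for %command% in a title's launch options, so a hook library can be injected without touching the game files:

        # In Steam: right-click the game -> Properties -> Set Launch Options
        LD_PRELOAD="$HOME/rift/librifthook.so" %command%

        # Outside Steam, the same idea from a shell, handy for quick iteration
        LD_PRELOAD="$HOME/rift/librifthook.so" ./AntiChamber   # hypothetical binary name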

    Read the article

  • Has the Ubuntu heating problem for Sony Vaio users been solved?

    - by nischayn22
    I use a Sony Vaio VPCEA23EN with an ATI Radeon HD 5145 graphics card. I have been using Ubuntu 11.10 and recently upgraded to the 12.04 beta, but the problem of overheating (60-70 °C) still persists. I have installed the graphics driver properly. Are there some features of Ubuntu that cause this problem? I would have no problem uninstalling them. Or will using a lighter version of Ubuntu (Lubuntu) solve this problem? Right now I am using Windows 7 and would like to switch to Ubuntu as soon as possible.

    Read the article

  • Ubuntu 12.10 live-usb won't boot

    - by user109175
    I own an Aleutia "Tango" low-power PC. It uses an Intel Atom N2800 CPU @ 1.86GHz. I know there are Linux issues with the "Cedar Trail" graphics on this hardware, but I can happily run Ubuntu 12.04 using the VESA graphics driver. I wanted to try out Ubuntu 12.10, so I created a live USB under 12.04, but I am unable to get it to boot. It gets as far as the GRUB screen, but after that the monitor shuts down. I have tried the F6 "nomodeset" option, but that doesn't make any difference. Does anyone have any knowledge of this problem? Thanks in advance.

    Read the article

  • how to deal with parallel programming

    - by nkint
    Hi. I know that parallel programming is a big resource in computer graphics on modern machines, and maybe a computing model that will grow in the near future (is this trend true?). I want to know the best way to deal with it. Is there some practical, general-purpose usefulness in studying n-dimensional processor meshes or bitonic sort on PRAM machines, or is that only theory for domain-specific hardware used in particular signal-processing tasks and scientific simulations? Is this the best way to acquire the know-how needed to become acquainted with CUDA or OpenCL? (I'm interested in computer graphics applications.) And why is functional programming so important for understanding parallel computing? PS: As someone advised me, I have forked this discussion from http://stackoverflow.com/questions/4908677/how-to-deal-with-parallel-programming

    Read the article

  • how to launch grub menu for ubuntu guests in virtualbox?

    - by Ubuntuser
    I have Ubuntu 12.04 alpha installed in VirtualBox. When the virtual machine is started, it boots directly to the login screen without showing the GRUB menu. How can I get the GRUB menu to show up at start? Please note: the graphics are broken after recent updates, so I cannot log in and make the changes; it is stuck at the screen "Ubuntu is running in low graphics mode" and the mouse and other keys do not work. Screenshot: http://img862.imageshack.us/i/screenshotat20120127171.png
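
    Two things worth trying (a sketch, not from the original post): holding Shift as the VM starts should bring up GRUB even when the menu is hidden; to make the menu appear on every boot, the timeout settings in /etc/default/grub can be changed from the recovery console or the Ctrl+Alt+F1 text console:

        # /etc/default/grub - show the menu for 10 seconds on every boot
        #GRUB_HIDDEN_TIMEOUT=0            # comment this line out
        #GRUB_HIDDEN_TIMEOUT_QUIET=true   # and this one
        GRUB_TIMEOUT=10

        # Then regenerate the GRUB configuration
        sudo update-grub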

    Read the article

  • Ubuntu 12.04 nVidia TwinView tearing

    - by Andy Turfer
    I'm running Ubuntu 12.04. Today I purchased a second monitor (I have a Dell U2711 and a 42-inch Sony TV). I'm running the 295.49 proprietary nVidia graphics driver, and my graphics card is a GeForce GTX 460. I am not able to activate "Xinerama" with "separate X screens"; I'm not sure why. I am able to use TwinView. This works, although it results in severe tearing on the Sony TV. It's as though "sync to VBlank" isn't working (I have it set in Compiz and in the nVidia settings). If I disable the Dell U2711 and only use the Sony, everything works fine, with zero tearing. Is it the case that "sync to VBlank" can only work on a single monitor in Ubuntu 12.04? Is there any way to get rid of the tearing on the second monitor in TwinView mode?

    Read the article

  • problems with graphic performance in Ubuntu 12.04

    - by Falk
    In advance: my English is not perfect ;)

        Processor: Intel® Core™2 Duo CPU E8500 @ 3.16GHz × 2
        Memory:    3.9 GiB
        Graphics:  GeForce GTX 460
        OS:        Ubuntu 12.04 32-bit, freshly installed (previously 11.10 32-bit)
        Driver:    NVIDIA's accelerated driver (version current-updates)

    Problem: in Ubuntu 12.04 I have problems with graphics performance (in games, for example Volley Brawl, Neverball, Beep and Minecraft) and with window effects (always when minimizing, maximizing, and so on). There are lags in animations, and the lags in games are extremely annoying. The problems occur only in Ubuntu 12.04; everything was fine in 11.10. There are no problems with YouTube videos or HD videos. Is there a solution? I have found nothing here in the forum or on Google. Or is an update coming soon for the driver or whatever? (This bug is already registered in Launchpad.) Thank you!

    Read the article

  • How can I make KDE faster in Ubuntu 12.04. It's very slow

    - by Rizwan Rifan
    I installed the kubuntu-desktop package on Ubuntu 12.04 LTS, but the problem is that KDE responds very slowly. If I click an application's icon to run it, it appears after 10 seconds and sometimes does not appear at all. It hangs all the time. The cursor is almost impossible to follow because of the lag. I have read on the Internet that Unity uses more memory and CPU than KDE, but on my PC Unity runs smoothly and KDE does not. So what should I do to make KDE as fast, responsive and smooth as Unity? My specifications are as follows: RAM: 1.5 GB (DDR2); processor: 3 GHz dual core; graphics card: Intel HD Graphics with 256 MB memory.
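
    One thing worth ruling out (a hedged sketch, assuming the KDE 4 desktop that 12.04's kubuntu-desktop installs): KWin's desktop effects falling back to slow software compositing on the integrated graphics. Compositing can be toggled with Alt+Shift+F12, or disabled persistently:

        # Disable KWin compositing in the KDE 4 config, then restart KWin to apply it
        kwriteconfig --file kwinrc --group Compositing --key Enabled false
        kwin --replace &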

    Read the article

  • Boot splash broken by "SP5100 TCO timer: mmio address 0xyyyyyyy already in use"

    - by mogliii
    I have Ubuntu 11.04 with all the latest updates, an ATI HD 4350 graphics card, and the "ATI/AMD proprietary FGLRX graphics driver" activated. The reported behaviour does not affect functionality; it's just a cosmetic issue. When I booted from the desktop CD, the Ubuntu boot splash was shown correctly in high resolution. Now, after installation with FGLRX, the display is broken (see picture). http://img824.imageshack.us/img824/7269/tcotimer.jpg This is what can be found in dmesg:

        [    8.621803] SP5100 TCO timer: SP5100 TCO WatchDog Timer Driver v0.01
        [    8.621967] SP5100 TCO timer: mmio address 0xfec000f0 already in use
        [    8.622650] fglrx: module license 'Proprietary. (C) 2002 - ATI Technologies, Starnberg, GERMANY' taints kernel.
        [    8.622656] Disabling lock debugging due to kernel taint

    This is what MMIO means: https://en.wikipedia.org/wiki/Memory-mapped_I/O Any idea how to get back the high-res splash?
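
    A commonly suggested workaround (hedged; it may or may not apply to this particular fglrx setup) is to force a framebuffer into the initramfs and pin a GRUB graphics mode so Plymouth can draw the high-resolution splash again:

        # Tell the initramfs to set up a framebuffer for Plymouth
        echo "FRAMEBUFFER=y" | sudo tee /etc/initramfs-tools/conf.d/splash
        sudo update-initramfs -u

        # Optionally pin the GRUB/kernel graphics mode in /etc/default/grub, then regenerate
        #   GRUB_GFXMODE=1280x1024
        #   GRUB_GFXPAYLOAD_LINUX=keep
        sudo update-grub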

    Read the article

  • Android: Showing photos runs out of memory

    - by Tom Beech
    I'm using a dialog box to display images in my Android project. The first one opens fine, but when I close it and repeat the process to show a different one, the app falls over with a memory error (it's running on a Samsung Galaxy S3, so that shouldn't be an issue). Error:

        10-24 11:25:45.575: E/dalvikvm-heap(29194): Out of memory on a 31961104-byte allocation.
        10-24 11:25:45.580: E/AndroidRuntime(29194): FATAL EXCEPTION: main
        10-24 11:25:45.580: E/AndroidRuntime(29194): java.lang.OutOfMemoryError
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.graphics.BitmapFactory.nativeDecodeStream(Native Method)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.graphics.BitmapFactory.decodeStream(BitmapFactory.java:587)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.graphics.BitmapFactory.decodeFile(BitmapFactory.java:389)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.graphics.BitmapFactory.decodeFile(BitmapFactory.java:418)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.graphics.drawable.Drawable.createFromPath(Drawable.java:882)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.widget.ImageView.resolveUri(ImageView.java:569)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.widget.ImageView.setImageURI(ImageView.java:340)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at com.directenquiries.assessment.tool.AddAsset.loadPhoto(AddAsset.java:771)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at com.directenquiries.assessment.tool.AddAsset$11.onClick(AddAsset.java:748)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at com.android.internal.app.AlertController$AlertParams$3.onItemClick(AlertController.java:936)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.widget.AdapterView.performItemClick(AdapterView.java:292)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.widget.AbsListView.performItemClick(AbsListView.java:1359)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.widget.AbsListView$PerformClick.run(AbsListView.java:2988)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.widget.AbsListView$1.run(AbsListView.java:3783)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.os.Handler.handleCallback(Handler.java:605)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.os.Handler.dispatchMessage(Handler.java:92)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.os.Looper.loop(Looper.java:137)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at android.app.ActivityThread.main(ActivityThread.java:4517)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at java.lang.reflect.Method.invokeNative(Native Method)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at java.lang.reflect.Method.invoke(Method.java:511)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:993)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:760)
        10-24 11:25:45.580: E/AndroidRuntime(29194): at dalvik.system.NativeStart.main(Native Method)

    Loading code:

        public void loadPhotoList() {
            // Collect the file paths of all photos for the current station
            Cursor f = db.rawQuery("select * from stationphotos where StationObjectID = '" + checkStationObjectID + "'", null);
            final ArrayList<String> mHelperNames = new ArrayList<String>();
            if (f.getCount() != 0) {
                f.moveToFirst();
                f.moveToFirst();
                while (!f.isAfterLast()) {
                    mHelperNames.add(f.getString(f.getColumnIndex("FilePath")));
                    f.moveToNext();
                }
            }
            f.close();
            final String[] nameStrings = new String[mHelperNames.size()];
            for (int i = 0; i < mHelperNames.size(); i++)
                nameStrings[i] = mHelperNames.get(i).toString();
            // Let the user pick which photo to display
            AlertDialog.Builder builder = new AlertDialog.Builder(this);
            builder.setTitle("Select Picture");
            builder.setItems(nameStrings, new DialogInterface.OnClickListener() {
                public void onClick(DialogInterface dialog, int item) {
                    loadPhoto(mHelperNames.get(item).toString());
                }
            });
            AlertDialog alert = builder.create();
            alert.show();
        }

        public void loadPhoto(String imagepath) {
            // Show the selected photo full-size in a dialog
            Dialog dialog = new Dialog(this);
            dialog.setContentView(R.layout.activity_show_image);
            dialog.setTitle("Image");
            dialog.setCancelable(true);
            ImageView img = (ImageView) dialog.findViewById(R.id.imageView1);
            img.setImageResource(R.drawable.ico_partial);
            Uri imgUri = Uri.parse(imagepath);
            img.setImageURI(imgUri);
            dialog.show();
        }

    Read the article

  • Choppy window movement in Gnome 3.4 on Ubuntu 12.04

    - by mjrussell
    I have been using Gnome 3.2 since Ubuntu 11.10 was released and it has always been perfectly smooth and performed extremely well, much better than Unity. After doing a clean installation of Ubuntu 12.04, Gnome 3.4 performs less well. If just one window of a relatively simple application, such as Gnome Terminal, is opened and moved around, the movement is sometimes very choppy, but the rest of the time it's perfectly smooth. The times when it's choppy seem to be when part of the window goes off the bottom or right side of the screen. Also, if there are multiple windows open, it is almost always choppy. These facts suggest to me that it's something to do with the compositor. Unity works perfectly smoothly. Memory usage is only at about 500-600MB, out of 3GB, even with a few things open. The graphics card is the on-board Intel graphics on the Core i5 M450. Does anyone have any ideas what might be causing this problem? Thanks

    Read the article
