Search Results

Search found 5873 results on 235 pages for 'raster graphics'.


  • Advantages/disadvantages of browser-based interface vs. graphics

    - by Josh
    Hello everyone, I'm in the design phase for a desktop-based application. Because of the nature of this particular application, I believe it would benefit greatly from a web-based approach (i.e., allowing a user to interface with the application through a browser running in kiosk mode) in order to leverage the simplicity of HTML/CSS/JS and the availability of many great JS interface plugins. Does taking this approach (rather than coding in a native or cross-platform graphics library) come with any gotchas?

    Read the article

  • Java - FontMetrics without Graphics

    - by subSeven
    Hello! How can I get FontMetrics without using a Graphics object? I want to obtain the FontMetrics in a constructor; at the moment I do it this way: BufferedImage bi = new BufferedImage(5, 5, BufferedImage.TYPE_INT_RGB); FontMetrics fm = bi.getGraphics().getFontMetrics(font); int width = fm.stringWidth(pattern); int height = fm.getHeight();
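
    One way to avoid the throwaway BufferedImage, sketched below: java.awt.Font can measure text directly through a FontRenderContext, so no Graphics object is needed at all. The font and pattern variables mirror the ones in the question; the rest of the class is purely illustrative.

        // Minimal sketch: string width and line height without any Graphics object.
        import java.awt.Font;
        import java.awt.font.FontRenderContext;
        import java.awt.font.LineMetrics;
        import java.awt.geom.Rectangle2D;

        public class TextMeasure {
            public static void main(String[] args) {
                Font font = new Font("SansSerif", Font.PLAIN, 12);
                String pattern = "some text to measure";
                // The two booleans are the anti-aliasing and fractional-metrics hints.
                FontRenderContext frc = new FontRenderContext(null, true, true);
                Rectangle2D bounds = font.getStringBounds(pattern, frc);
                LineMetrics lm = font.getLineMetrics(pattern, frc);
                int width = (int) Math.ceil(bounds.getWidth());
                int height = (int) Math.ceil(lm.getHeight());
                System.out.println(width + " x " + height);
            }
        }

    Note that this yields measurements rather than a FontMetrics instance, but the width/height values match what the original snippet computes.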

    Read the article

  • Help with simple frame, and graphics in Java

    - by Crystal
    For hw, I'm trying to create a "CustomButton" that has a frame and in that frame, I draw two triangles, and a square over it. It's supposed to give the user the effect of a button press once it is depressed. So for starters, I am trying to set up the beginning graphics, drawing two triangles, and a square. The problem I have is although I set my frame to 200, 200, and the triangles I have drawn I think to the correct ends of my frame size, when I run the program, I have to extend my window to make the whole artwork, my "CustomButton," viewable. Is that normal? Thanks. Code: import java.awt.*; import java.awt.event.*; import javax.swing.*; public class CustomButton { public static void main(String[] args) { EventQueue.invokeLater(new Runnable() { public void run() { CustomButtonFrame frame = new CustomButtonFrame(); frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE); frame.setVisible(true); } }); } } class CustomButtonFrame extends JFrame { // constructor for CustomButtonFrame public CustomButtonFrame() { setTitle("Custom Button"); setSize(DEFAULT_WIDTH, DEFAULT_HEIGHT); CustomButtonSetup buttonSetup = new CustomButtonSetup(); this.add(buttonSetup); } private static final int DEFAULT_WIDTH = 200; private static final int DEFAULT_HEIGHT = 200; } class CustomButtonSetup extends JComponent { public void paintComponent(Graphics g) { Graphics2D g2 = (Graphics2D) g; // first triangle coords int x[] = new int[TRIANGLE_SIDES]; int y[] = new int[TRIANGLE_SIDES]; x[0] = 0; y[0] = 0; x[1] = 200; y[1] = 0; x[2] = 0; y[2] = 200; Polygon firstTriangle = new Polygon(x, y, TRIANGLE_SIDES); // second triangle coords x[0] = 0; y[0] = 200; x[1] = 200; y[1] = 200; x[2] = 200; y[2] = 0; Polygon secondTriangle = new Polygon(x, y, TRIANGLE_SIDES); g2.drawPolygon(firstTriangle); g2.setColor(Color.WHITE); g2.fillPolygon(firstTriangle); g2.drawPolygon(secondTriangle); g2.setColor(Color.GRAY); g2.fillPolygon(secondTriangle); // draw rectangle 10 pixels off border g2.drawRect(10, 10, 180, 180); } public static final int TRIANGLE_SIDES = 3; }
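
    On the sizing question itself, a small sketch follows. JFrame.setSize(200, 200) includes the title bar and window borders, so the drawable content area ends up smaller than 200x200 and the drawing gets clipped; that is why the window has to be stretched to see everything. A common fix is to give the component a preferred size and let pack() size the frame around it. The sketch reuses the CustomButtonSetup class from the question; everything else is just an assumed rearrangement, not the required solution.

        import java.awt.Dimension;
        import javax.swing.JFrame;

        class PackedButtonFrame extends JFrame {
            PackedButtonFrame() {
                setTitle("Custom Button");
                CustomButtonSetup buttonSetup = new CustomButtonSetup();
                // Ask for 200x200 of content area, not 200x200 of total window size.
                buttonSetup.setPreferredSize(new Dimension(200, 200));
                add(buttonSetup);
                pack(); // sizes the frame so the content honours the preferred size
            }
        }

    With pack() in place, the hard-coded 200s in paintComponent could also be replaced by getWidth() and getHeight() so the triangles always follow the component's actual size.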

    Read the article

  • Pseudo graphics in C++

    - by xae
    Hello, I'm looking for a pseudo-graphics (text-mode UI) library for C++ under Visual Studio, something like TurboVision in Borland Turbo C++ or ncurses. Or can I do it with the standard C++ libraries? Thanks.

    Read the article

  • My 3D Games crashing in these scenarios

    - by desaivv
    I have a situation here that I am unable to solve. I bought a PC last March; here are my specs:

    Intel Core i3 550 @ 3 GHz
    4 GB RAM @ 400 MHz
    XFX GeForce 9500 GT graphics card, 1 GB @ 550 MHz
    500 GB HDD

    Lately, as soon as I load my Skyrim save game, it crashes. I have been playing Skyrim since I joined the Gaming.SE site. By "crashes" I mean the entire scene gets red lines; I cannot Alt+Tab back or even use Ctrl+Alt+Del, and my only recourse is a hard reset via the power button. I cannot take a screenshot either. I have the latest ForceWare 296.10 drivers. This has been happening for the last two weeks. I always use Driver Sweeper to clean out my old drivers, since that is what XFX recommends before installing new ones. I recently installed MSI Afterburner to watch my GPU temperature. My GPU is at stock settings; I have never overclocked it. In MSI Afterburner I cannot adjust the fan speed: it is greyed out, and there is no fan tab in the settings. With normal Internet browsing the GPU stays at 51 °C. I ran Memtest86 overnight at level 11; it took about 13 hours but found no errors in my RAM. I even reinstalled my OS with just the 296 drivers. The GPU fan does come on. I can play Diablo 2, but I cannot get past Warcraft 3's menu screen. There WAS some dust in my machine, but I always try to keep everything clean, since dust is an issue in my home town, and I keep the whole PC cabinet cool. My friend came over with his working graphics card (we bought our PCs at the same time with exactly the same specifications) and his card did not work either: same problem, the scene freezing with red lines. I did do my research before posting here; that is how I learned about MSI Afterburner, Driver Sweeper, SpeedFan, etc. I also followed posts on Tom's Hardware from people with similar problems. One person suggested "baking the card in an oven", and someone who followed that advice reported it worked. Since I bought this machine I have played Diablo 2 for months, the StarCraft 2 campaign for months, and recently Skyrim for months; I bought ME3 as well. I am at my wits' end and do not know what else to do. I could go out and buy a new card, but my friend's card did not work either. I can use the machine for Eclipse or VS2010 development just fine, just not for 3D gaming. I originally posted this question on Gaming.SE but was directed here. I have browsed the SU database for my problem and found this, this, and this, but none of them cover my question. My machine is only one year old. Can some experienced superuser(s) shed some light on this scenario? Is it a 3D graphics card problem? Will a brand new card work? What else can I try to pinpoint the problem? Could it be the motherboard? Thanks.

    Read the article

  • How to get Nvidia graphics working on Sony Z laptop?

    - by projectshave
    I have an older Sony VAIO Z 590 laptop with switchable graphics between Intel and Nvidia GeForce 9300M. It is NOT Optimus. I did a clean install of Ubuntu 12.04. Everything works, but it's using Unity 2D with the Intel drivers. I've tried loading the Nvidia drivers from "Additional Drivers", but it says "this driver is activated but not currently in use". When I run "nvidia-settings", an error window pops up to say "You do not appear to be using the NVIDIA X drivers." "lspci" shows both graphics cards. Let me know if I should add more info. How do I get the Nvidia graphics and Unity 3D working? More info: $ lshw -short -class display H/W path Device Class Description ============================================== /0/100/1/0 display G98 [GeForce 9300M GS] /0/100/2 display Mobile 4 Series Chipset Integrated Graphics C $ glxinfo name of display: :0 Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Error: couldn't find RGB GLX visual or fbconfig Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Xlib: extension "GLX" missing on display ":0". Excerpts from Xorg.0.log: [ 16.373] (II) LoadModule: "glx" [ 16.373] (II) Loading /usr/lib/x86_64-linux-gnu/xorg/extra-modules/libglx.so [ 16.386] (II) Module glx: vendor="NVIDIA Corporation" [ 16.386] compiled for 4.0.2, module version = 1.0.0 [ 16.386] Module class: X.Org Server Extension [ 16.386] (II) NVIDIA GLX Module 295.49 Tue May 1 00:09:10 PDT 2012 [ 16.608] (II) NVIDIA dlloader X Driver 295.49 Mon Apr 30 23:48:24 PDT 2012 [ 16.608] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs [ 17.693] (EE) Failed to initialize GLX extension (Compatible NVIDIA X driver not found)
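
    For what it is worth, the log excerpt shows the NVIDIA GLX module loading and then failing ("Failed to initialize GLX extension (Compatible NVIDIA X driver not found)"), which usually means the X server is not actually running on the NVIDIA device. On a switchable (non-Optimus) machine, one thing commonly tried — offered only as a sketch, not a verified configuration for this laptop — is to flip the hardware graphics switch to the discrete GPU and point X explicitly at the NVIDIA card by its PCI bus ID in /etc/X11/xorg.conf; the bus ID below is read off the lshw output above (/0/100/1/0, i.e. PCI bus 1, device 0, function 0).

        # /etc/X11/xorg.conf -- minimal sketch, assuming the GeForce 9300M GS at PCI:1:0:0
        Section "Device"
            Identifier "NvidiaGPU"
            Driver     "nvidia"
            BusID      "PCI:1:0:0"
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device     "NvidiaGPU"
        EndSection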

    Read the article

  • Ubuntu 10.04 not detecting multiple monitors

    - by user28837
    I have 2 graphics cards, the output from the lspci: 01:00.0 VGA compatible controller: ATI Technologies Inc RV770 [Radeon HD 4850] 02:00.0 VGA compatible controller: ATI Technologies Inc RV710 [Radeon HD 4350] I have one monitor connected to the 4850 and 2 connected to the 4350. However when I go into System Preferences Monitors the only monitor shown is the one connected to the 4850. Is there something I need to enable for it to be able to use the other card? How do I get this to work. Thanks. As per request: X.Org X Server 1.7.6 Release Date: 2010-03-17 X Protocol Version 11, Revision 0 Build Operating System: Linux 2.6.24-25-server i686 Ubuntu Current Operating System: Linux jeff-desktop 2.6.32-22-generic-pae #33-Ubuntu SMP Wed Apr 28 14:57:29 UTC 2010 i686 Kernel command line: BOOT_IMAGE=/boot/vmlinuz-2.6.32-22-generic-pae root=UUID=852e1013-4ed6-40fd-a462-c29087888383 ro quiet splash Build Date: 23 April 2010 05:11:50PM xorg-server 2:1.7.6-2ubuntu7 (Bryce Harrington <[email protected]>) Current version of pixman: 0.16.4 Before reporting problems, check http://wiki.x.org to make sure that you have the latest version. Markers: (--) probed, (**) from config file, (==) default setting, (++) from command line, (!!) notice, (II) informational, (WW) warning, (EE) error, (NI) not implemented, (??) unknown. (==) Log file: "/var/log/Xorg.0.log", Time: Tue May 11 08:24:52 2010 (==) Using config file: "/etc/X11/xorg.conf" (==) Using config directory: "/usr/lib/X11/xorg.conf.d" (==) No Layout section. Using the first Screen section. (**) |-->Screen "Default Screen" (0) (**) | |-->Monitor "<default monitor>" (==) No device specified for screen "Default Screen". Using the first device section listed. (**) | |-->Device "Default Device" (==) No monitor specified for screen "Default Screen". Using a default monitor configuration. (==) Automatically adding devices (==) Automatically enabling devices (WW) The directory "/usr/share/fonts/X11/cyrillic" does not exist. Entry deleted from font path. (==) FontPath set to: /usr/share/fonts/X11/misc, /usr/share/fonts/X11/100dpi/:unscaled, /usr/share/fonts/X11/75dpi/:unscaled, /usr/share/fonts/X11/Type1, /usr/share/fonts/X11/100dpi, /usr/share/fonts/X11/75dpi, /var/lib/defoma/x-ttcidfont-conf.d/dirs/TrueType, built-ins (==) ModulePath set to "/usr/lib/xorg/extra-modules,/usr/lib/xorg/modules" (II) The server relies on udev to provide the list of input devices. If no devices become available, reconfigure udev or disable AutoAddDevices. (II) Loader magic: 0x81f0e80 (II) Module ABI versions: X.Org ANSI C Emulation: 0.4 X.Org Video Driver: 6.0 X.Org XInput driver : 7.0 X.Org Server Extension : 2.0 (++) using VT number 7 (--) PCI:*(0:1:0:0) 1002:9442:174b:e104 ATI Technologies Inc RV770 [Radeon HD 4850] rev 0, Mem @ 0xc0000000/268435456, 0xfe7e0000/65536, I/O @ 0x0000a000/256, BIOS @ 0x????????/131072 (--) PCI: (0:2:0:0) 1002:954f:1462:1618 ATI Technologies Inc RV710 [Radeon HD 4350] rev 0, Mem @ 0xd0000000/268435456, 0xfe8e0000/65536, I/O @ 0x0000b000/256, BIOS @ 0x????????/131072 (WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory) (II) "extmod" will be loaded by default. (II) "dbe" will be loaded by default. (II) "glx" will be loaded. This was enabled by default and also specified in the config file. (II) "record" will be loaded by default. (II) "dri" will be loaded by default. (II) "dri2" will be loaded by default. 
(II) LoadModule: "glx" (II) Loading /usr/lib/xorg/extra-modules/modules/extensions/libglx.so (II) Module glx: vendor="FireGL - ATI Technologies Inc." compiled for 7.5.0, module version = 1.0.0 (II) Loading extension GLX (II) LoadModule: "extmod" (II) Loading /usr/lib/xorg/modules/extensions/libextmod.so (II) Module extmod: vendor="X.Org Foundation" compiled for 1.7.6, module version = 1.0.0 Module class: X.Org Server Extension ABI class: X.Org Server Extension, version 2.0 (II) Loading extension MIT-SCREEN-SAVER (II) Loading extension XFree86-VidModeExtension (II) Loading extension XFree86-DGA (II) Loading extension DPMS (II) Loading extension XVideo (II) Loading extension XVideo-MotionCompensation (II) Loading extension X-Resource (II) LoadModule: "dbe" (II) Loading /usr/lib/xorg/modules/extensions/libdbe.so (II) Module dbe: vendor="X.Org Foundation" compiled for 1.7.6, module version = 1.0.0 Module class: X.Org Server Extension ABI class: X.Org Server Extension, version 2.0 (II) Loading extension DOUBLE-BUFFER (II) LoadModule: "record" (II) Loading /usr/lib/xorg/modules/extensions/librecord.so (II) Module record: vendor="X.Org Foundation" compiled for 1.7.6, module version = 1.13.0 Module class: X.Org Server Extension ABI class: X.Org Server Extension, version 2.0 (II) Loading extension RECORD (II) LoadModule: "dri" (II) Loading /usr/lib/xorg/modules/extensions/libdri.so (II) Module dri: vendor="X.Org Foundation" compiled for 1.7.6, module version = 1.0.0 ABI class: X.Org Server Extension, version 2.0 (II) Loading extension XFree86-DRI (II) LoadModule: "dri2" (II) Loading /usr/lib/xorg/modules/extensions/libdri2.so (II) Module dri2: vendor="X.Org Foundation" compiled for 1.7.6, module version = 1.1.0 ABI class: X.Org Server Extension, version 2.0 (II) Loading extension DRI2 (II) LoadModule: "fglrx" (II) Loading /usr/lib/xorg/extra-modules/modules/drivers/fglrx_drv.so (II) Module fglrx: vendor="FireGL - ATI Technologies Inc." compiled for 1.7.1, module version = 8.72.11 Module class: X.Org Video Driver (II) Loading sub module "fglrxdrm" (II) LoadModule: "fglrxdrm" (II) Loading /usr/lib/xorg/extra-modules/modules/linux/libfglrxdrm.so (II) Module fglrxdrm: vendor="FireGL - ATI Technologies Inc." 
compiled for 1.7.1, module version = 8.72.11 (II) ATI Proprietary Linux Driver Version Identifier:8.72.11 (II) ATI Proprietary Linux Driver Release Identifier: 8.723.1 (II) ATI Proprietary Linux Driver Build Date: Apr 8 2010 21:40:29 (II) Primary Device is: PCI 01@00:00:0 (WW) Falling back to old probe method for fglrx (II) Loading PCS database from /etc/ati/amdpcsdb (--) Assigning device section with no busID to primary device (WW) fglrx: No matching Device section for instance (BusID PCI:0@2:0:0) found (--) Chipset Supported AMD Graphics Processor (0x9442) found (WW) fglrx: No matching Device section for instance (BusID PCI:0@1:0:1) found (WW) fglrx: No matching Device section for instance (BusID PCI:0@2:0:1) found (**) ChipID override: 0x954F (**) Chipset Supported AMD Graphics Processor (0x954F) found (II) AMD Video driver is running on a device belonging to a group targeted for this release (II) AMD Video driver is signed (II) fglrx(0): pEnt->device->identifier=0x9428aa0 (II) pEnt->device->identifier=(nil) (II) fglrx(0): === [atiddxPreInit] === begin (II) Loading sub module "vgahw" (II) LoadModule: "vgahw" (II) Loading /usr/lib/xorg/modules/libvgahw.so (II) Module vgahw: vendor="X.Org Foundation" compiled for 1.7.6, module version = 0.1.0 ABI class: X.Org Video Driver, version 6.0 (II) fglrx(0): Creating default Display subsection in Screen section "Default Screen" for depth/fbbpp 24/32 (**) fglrx(0): Depth 24, (--) framebuffer bpp 32 (II) fglrx(0): Pixel depth = 24 bits stored in 4 bytes (32 bpp pixmaps) (==) fglrx(0): Default visual is TrueColor (==) fglrx(0): RGB weight 888 (II) fglrx(0): Using 8 bits per RGB (==) fglrx(0): Buffer Tiling is ON (II) Loading sub module "fglrxdrm" (II) LoadModule: "fglrxdrm" (II) Reloading /usr/lib/xorg/extra-modules/modules/linux/libfglrxdrm.so ukiDynamicMajor: found major device number 251 ukiDynamicMajor: found major device number 251 ukiOpenByBusid: Searching for BusID PCI:1:0:0 ukiOpenDevice: node name is /dev/ati/card0 ukiOpenDevice: open result is 10, (OK) ukiOpenByBusid: ukiOpenMinor returns 10 ukiOpenByBusid: ukiGetBusid reports PCI:2:0:0 ukiOpenDevice: node name is /dev/ati/card1 ukiOpenDevice: open result is 10, (OK) ukiOpenByBusid: ukiOpenMinor returns 10 ukiOpenByBusid: ukiGetBusid reports PCI:1:0:0 ukiDynamicMajor: found major device number 251 ukiDynamicMajor: found major device number 251 ukiOpenByBusid: Searching for BusID PCI:2:0:0 ukiOpenDevice: node name is /dev/ati/card0 ukiOpenDevice: open result is 11, (OK) ukiOpenByBusid: ukiOpenMinor returns 11 ukiOpenByBusid: ukiGetBusid reports PCI:2:0:0 (--) fglrx(0): Chipset: "ATI Radeon HD 4800 Series" (Chipset = 0x9442) (--) fglrx(0): (PciSubVendor = 0x174b, PciSubDevice = 0xe104) (==) fglrx(0): board vendor info: third party graphics adapter - NOT original ATI (--) fglrx(0): Linear framebuffer (phys) at 0xc0000000 (--) fglrx(0): MMIO registers at 0xfe7e0000 (--) fglrx(0): I/O port at 0x0000a000 (==) fglrx(0): ROM-BIOS at 0x000c0000 (II) fglrx(0): AC Adapter is used (II) fglrx(0): Primary V_BIOS segment is: 0xc000 (II) Loading sub module "vbe" (II) LoadModule: "vbe" (II) Loading /usr/lib/xorg/modules/libvbe.so (II) Module vbe: vendor="X.Org Foundation" compiled for 1.7.6, module version = 1.1.0 ABI class: X.Org Video Driver, version 6.0 (II) fglrx(0): VESA BIOS detected (II) fglrx(0): VESA VBE Version 3.0 (II) fglrx(0): VESA VBE Total Mem: 16384 kB (II) fglrx(0): VESA VBE OEM: ATI ATOMBIOS (II) fglrx(0): VESA VBE OEM Software Rev: 11.13 (II) fglrx(0): VESA VBE OEM Vendor: (C) 1988-2005, 
ATI Technologies Inc. (II) fglrx(0): VESA VBE OEM Product: RV770 (II) fglrx(0): VESA VBE OEM Product Rev: 01.00 (II) fglrx(0): ATI Video BIOS revision 9 or later detected (--) fglrx(0): Video RAM: 524288 kByte, Type: GDDR3 (II) fglrx(0): PCIE card detected (--) fglrx(0): Using per-process page tables (PPPT) as GART. (WW) fglrx(0): board is an unknown third party board, chipset is supported (--) fglrx(0): Chipset: "ATI Radeon HD 4300/4500 Series" (Chipset = 0x954f) (--) fglrx(0): (PciSubVendor = 0x1462, PciSubDevice = 0x1618) (==) fglrx(0): board vendor info: third party graphics adapter - NOT original ATI (--) fglrx(0): Linear framebuffer (phys) at 0xd0000000 (--) fglrx(0): MMIO registers at 0xfe8e0000 (--) fglrx(0): I/O port at 0x0000b000 (==) fglrx(0): ROM-BIOS at 0x000c0000 (II) fglrx(0): AC Adapter is used (II) fglrx(0): Invalid ATI BIOS from int10, the adapter is not VGA-enabled (II) fglrx(0): ATI Video BIOS revision 9 or later detected (--) fglrx(0): Video RAM: 524288 kByte, Type: DDR2 (II) fglrx(0): PCIE card detected (--) fglrx(0): Using per-process page tables (PPPT) as GART. (WW) fglrx(0): board is an unknown third party board, chipset is supported (II) fglrx(0): Using adapter: 1:0.0. (II) fglrx(0): [FB] MC range(MCFBBase = 0xf00000000, MCFBSize = 0x20000000) (II) fglrx(0): Interrupt handler installed at IRQ 31. (II) fglrx(0): Using adapter: 2:0.0. (II) fglrx(0): [FB] MC range(MCFBBase = 0xf00000000, MCFBSize = 0x20000000) (II) fglrx(0): RandR 1.2 support is enabled! (II) fglrx(0): RandR 1.2 rotation support is enabled! (==) fglrx(0): Center Mode is disabled (II) Loading sub module "fb" (II) LoadModule: "fb" (II) Loading /usr/lib/xorg/modules/libfb.so (II) Module fb: vendor="X.Org Foundation" compiled for 1.7.6, module version = 1.0.0 ABI class: X.Org ANSI C Emulation, version 0.4 (II) Loading sub module "ddc" (II) LoadModule: "ddc" (II) Module "ddc" already built-in (II) fglrx(0): Finished Initialize PPLIB! 
(II) Loading sub module "ddc" (II) LoadModule: "ddc" (II) Module "ddc" already built-in (II) fglrx(0): Connected Display0: DFP on external TMDS [tmds2] (II) fglrx(0): Display0 EDID data --------------------------- (II) fglrx(0): Manufacturer: DEL Model: a038 Serial#: 810829397 (II) fglrx(0): Year: 2008 Week: 51 (II) fglrx(0): EDID Version: 1.3 (II) fglrx(0): Digital Display Input (II) fglrx(0): Max Image Size [cm]: horiz.: 53 vert.: 30 (II) fglrx(0): Gamma: 2.20 (II) fglrx(0): DPMS capabilities: StandBy Suspend Off (II) fglrx(0): Supported color encodings: RGB 4:4:4 YCrCb 4:4:4 (II) fglrx(0): Default color space is primary color space (II) fglrx(0): First detailed timing is preferred mode (II) fglrx(0): redX: 0.640 redY: 0.330 greenX: 0.300 greenY: 0.600 (II) fglrx(0): blueX: 0.150 blueY: 0.060 whiteX: 0.312 whiteY: 0.329 (II) fglrx(0): Supported established timings: (II) fglrx(0): 720x400@70Hz (II) fglrx(0): 640x480@60Hz (II) fglrx(0): 640x480@75Hz (II) fglrx(0): 800x600@60Hz (II) fglrx(0): 800x600@75Hz (II) fglrx(0): 1024x768@60Hz (II) fglrx(0): 1024x768@75Hz (II) fglrx(0): 1280x1024@75Hz (II) fglrx(0): Manufacturer's mask: 0 (II) fglrx(0): Supported standard timings: (II) fglrx(0): #0: hsize: 1152 vsize 864 refresh: 75 vid: 20337 (II) fglrx(0): #1: hsize: 1280 vsize 1024 refresh: 60 vid: 32897 (II) fglrx(0): #2: hsize: 1920 vsize 1080 refresh: 60 vid: 49361 (II) fglrx(0): Supported detailed timing: (II) fglrx(0): clock: 148.5 MHz Image Size: 531 x 298 mm (II) fglrx(0): h_active: 1920 h_sync: 2008 h_sync_end 2052 h_blank_end 2200 h_border: 0 (II) fglrx(0): v_active: 1080 v_sync: 1084 v_sync_end 1089 v_blanking: 1125 v_border: 0 (II) fglrx(0): Serial No: Y183D8CF0TFU (II) fglrx(0): Monitor name: DELL S2409W (II) fglrx(0): Ranges: V min: 50 V max: 76 Hz, H min: 30 H max: 83 kHz, PixClock max 170 MHz (II) fglrx(0): EDID (in hex): (II) fglrx(0): 00ffffffffffff0010ac38a055465430 (II) fglrx(0): 3312010380351e78eeee91a3544c9926 (II) fglrx(0): 0f5054a54b00714f8180d1c001010101 (II) fglrx(0): 010101010101023a801871382d40582c (II) fglrx(0): 4500132a2100001e000000ff00593138 (II) fglrx(0): 3344384346305446550a000000fc0044 (II) fglrx(0): 454c4c205332343039570a20000000fd (II) fglrx(0): 00324c1e5311000a2020202020200059 (II) fglrx(0): End of Display0 EDID data -------------------- (II) fglrx(0): Output DFP2 has no monitor section (II) fglrx(0): Output DFP_EXTTMDS has no monitor section (II) fglrx(0): Output CRT1 has no monitor section (II) fglrx(0): Output CRT2 has no monitor section (II) fglrx(0): Output DFP2 disconnected (II) fglrx(0): Output DFP_EXTTMDS connected (II) fglrx(0): Output CRT1 disconnected (II) fglrx(0): Output CRT2 disconnected (II) fglrx(0): Using exact sizes for initial modes (II) fglrx(0): Output DFP_EXTTMDS using initial mode 1920x1080 (II) fglrx(0): DPI set to (96, 96) (II) fglrx(0): Adapter ATI Radeon HD 4800 Series has 2 configurable heads and 1 displays connected. 
(==) fglrx(0): QBS disabled (==) fglrx(0): PseudoColor visuals disabled (II) Loading sub module "ramdac" (II) LoadModule: "ramdac" (II) Module "ramdac" already built-in (==) fglrx(0): NoAccel = NO (==) fglrx(0): NoDRI = NO (==) fglrx(0): Capabilities: 0x00000000 (==) fglrx(0): CapabilitiesEx: 0x00000000 (==) fglrx(0): OpenGL ClientDriverName: "fglrx_dri.so" (==) fglrx(0): UseFastTLS=0 (==) fglrx(0): BlockSignalsOnLock=1 (--) Depth 24 pixmap format is 32 bpp (II) Loading extension ATIFGLRXDRI (II) fglrx(0): doing swlDriScreenInit (II) fglrx(0): swlDriScreenInit for fglrx driver ukiDynamicMajor: found major device number 251 ukiDynamicMajor: found major device number 251 ukiDynamicMajor: found major device number 251 ukiOpenByBusid: Searching for BusID PCI:1:0:0 ukiOpenDevice: node name is /dev/ati/card0 ukiOpenDevice: open result is 17, (OK) ukiOpenByBusid: ukiOpenMinor returns 17 ukiOpenByBusid: ukiGetBusid reports PCI:2:0:0 ukiOpenDevice: node name is /dev/ati/card1 ukiOpenDevice: open result is 17, (OK) ukiOpenByBusid: ukiOpenMinor returns 17 ukiOpenByBusid: ukiGetBusid reports PCI:1:0:0 (II) fglrx(0): [uki] DRM interface version 1.0 (II) fglrx(0): [uki] created "fglrx" driver at busid "PCI:1:0:0" (II) fglrx(0): [uki] added 8192 byte SAREA at 0x2000 (II) fglrx(0): [uki] mapped SAREA 0x2000 to 0xb6996000 (II) fglrx(0): [uki] framebuffer handle = 0x3000 (II) fglrx(0): [uki] added 1 reserved context for kernel (II) fglrx(0): swlDriScreenInit done (II) fglrx(0): Kernel Module Version Information: (II) fglrx(0): Name: fglrx (II) fglrx(0): Version: 8.72.11 (II) fglrx(0): Date: Apr 8 2010 (II) fglrx(0): Desc: ATI FireGL DRM kernel module (II) fglrx(0): Kernel Module version matches driver. (II) fglrx(0): Kernel Module Build Time Information: (II) fglrx(0): Build-Kernel UTS_RELEASE: 2.6.32-22-generic-pae (II) fglrx(0): Build-Kernel MODVERSIONS: yes (II) fglrx(0): Build-Kernel __SMP__: yes (II) fglrx(0): Build-Kernel PAGE_SIZE: 0x1000 (II) fglrx(0): [uki] register handle = 0x00004000 (II) fglrx(0): DRI initialization successfull! (II) fglrx(0): FBADPhys: 0xf00000000 FBMappedSize: 0x01068000 (II) fglrx(0): FBMM initialized for area (0,0)-(1920,2240) (II) fglrx(0): FBMM auto alloc for area (0,0)-(1920,1920) (front color buffer - assumption) (II) fglrx(0): Largest offscreen area available: 1920 x 320 (==) fglrx(0): Backing store disabled (II) Loading extension FGLRXEXTENSION (==) fglrx(0): DPMS enabled (II) fglrx(0): Initialized in-driver Xinerama extension (**) fglrx(0): Textured Video is enabled. 
(II) LoadModule: "glesx" (II) Loading /usr/lib/xorg/extra-modules/modules/glesx.so (II) Module glesx: vendor="X.Org Foundation" compiled for 1.7.1, module version = 1.0.0 (II) Loading extension GLESX (II) Loading sub module "xaa" (II) LoadModule: "xaa" (II) Loading /usr/lib/xorg/modules/libxaa.so (II) Module xaa: vendor="X.Org Foundation" compiled for 1.7.6, module version = 1.2.1 ABI class: X.Org Video Driver, version 6.0 (II) fglrx(0): GLESX enableFlags = 94 (II) fglrx(0): Using XFree86 Acceleration Architecture (XAA) Screen to screen bit blits Solid filled rectangles Solid Horizontal and Vertical Lines Driver provided ScreenToScreenBitBlt replacement Driver provided FillSolidRects replacement (II) fglrx(0): GLESX is enabled (II) LoadModule: "amdxmm" (II) Loading /usr/lib/xorg/extra-modules/modules/amdxmm.so (II) Module amdxmm: vendor="X.Org Foundation" compiled for 1.7.1, module version = 1.0.0 (II) Loading extension AMDXVOPL (II) fglrx(0): UVD2 feature is available (II) fglrx(0): Enable composite support successfully (II) fglrx(0): X context handle = 0x1 (II) fglrx(0): [DRI] installation complete (==) fglrx(0): Silken mouse enabled (==) fglrx(0): Using HW cursor of display infrastructure! (II) fglrx(0): Disabling in-server RandR and enabling in-driver RandR 1.2. (--) RandR disabled (II) Found 2 VGA devices: arbiter wrapping enabled (II) Initializing built-in extension Generic Event Extension (II) Initializing built-in extension SHAPE (II) Initializing built-in extension MIT-SHM (II) Initializing built-in extension XInputExtension (II) Initializing built-in extension XTEST (II) Initializing built-in extension BIG-REQUESTS (II) Initializing built-in extension SYNC (II) Initializing built-in extension XKEYBOARD (II) Initializing built-in extension XC-MISC (II) Initializing built-in extension SECURITY (II) Initializing built-in extension XINERAMA (II) Initializing built-in extension XFIXES (II) Initializing built-in extension RENDER (II) Initializing built-in extension RANDR (II) Initializing built-in extension COMPOSITE (II) Initializing built-in extension DAMAGE ukiDynamicMajor: found major device number 251 ukiDynamicMajor: found major device number 251 ukiOpenByBusid: Searching for BusID PCI:1:0:0 ukiOpenDevice: node name is /dev/ati/card0 ukiOpenDevice: open result is 18, (OK) ukiOpenByBusid: ukiOpenMinor returns 18 ukiOpenByBusid: ukiGetBusid reports PCI:2:0:0 ukiOpenDevice: node name is /dev/ati/card1 ukiOpenDevice: open result is 18, (OK) ukiOpenByBusid: ukiOpenMinor returns 18 ukiOpenByBusid: ukiGetBusid reports PCI:1:0:0 (II) AIGLX: Loaded and initialized /usr/lib/dri/fglrx_dri.so (II) GLX: Initialized DRI GL provider for screen 0 (II) fglrx(0): Enable the clock gating! 
(II) fglrx(0): Setting screen physical size to 507 x 285 (II) XKB: reuse xkmfile /var/lib/xkb/server-B20D7FC79C7F597315E3E501AEF10E0D866E8E92.xkm (II) config/udev: Adding input device Power Button (/dev/input/event1) (**) Power Button: Applying InputClass "evdev keyboard catchall" (II) LoadModule: "evdev" (II) Loading /usr/lib/xorg/modules/input/evdev_drv.so (II) Module evdev: vendor="X.Org Foundation" compiled for 1.7.6, module version = 2.3.2 Module class: X.Org XInput Driver ABI class: X.Org XInput driver, version 7.0 (**) Power Button: always reports core events (**) Power Button: Device: "/dev/input/event1" (II) Power Button: Found keys (II) Power Button: Configuring as keyboard (II) XINPUT: Adding extended input device "Power Button" (type: KEYBOARD) (**) Option "xkb_rules" "evdev" (**) Option "xkb_model" "pc105" (**) Option "xkb_layout" "us" (II) config/udev: Adding input device Power Button (/dev/input/event0) (**) Power Button: Applying InputClass "evdev keyboard catchall" (**) Power Button: always reports core events (**) Power Button: Device: "/dev/input/event0" (II) Power Button: Found keys (II) Power Button: Configuring as keyboard (II) XINPUT: Adding extended input device "Power Button" (type: KEYBOARD) (**) Option "xkb_rules" "evdev" (**) Option "xkb_model" "pc105" (**) Option "xkb_layout" "us" (II) config/udev: Adding input device Logitech USB-PS/2 Optical Mouse (/dev/input/event3) (**) Logitech USB-PS/2 Optical Mouse: Applying InputClass "evdev pointer catchall" (**) Logitech USB-PS/2 Optical Mouse: always reports core events (**) Logitech USB-PS/2 Optical Mouse: Device: "/dev/input/event3" (II) Logitech USB-PS/2 Optical Mouse: Found 12 mouse buttons (II) Logitech USB-PS/2 Optical Mouse: Found scroll wheel(s) (II) Logitech USB-PS/2 Optical Mouse: Found relative axes (II) Logitech USB-PS/2 Optical Mouse: Found x and y relative axes (II) Logitech USB-PS/2 Optical Mouse: Configuring as mouse (**) Logitech USB-PS/2 Optical Mouse: YAxisMapping: buttons 4 and 5 (**) Logitech USB-PS/2 Optical Mouse: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200 (II) XINPUT: Adding extended input device "Logitech USB-PS/2 Optical Mouse" (type: MOUSE) (II) Logitech USB-PS/2 Optical Mouse: initialized for relative axes. 
(II) config/udev: Adding input device Logitech USB-PS/2 Optical Mouse (/dev/input/mouse1) (II) No input driver/identifier specified (ignoring) (II) config/udev: Adding input device Logitech USB Multimedia Keyboard (/dev/input/event4) (**) Logitech USB Multimedia Keyboard: Applying InputClass "evdev keyboard catchall" (**) Logitech USB Multimedia Keyboard: always reports core events (**) Logitech USB Multimedia Keyboard: Device: "/dev/input/event4" (II) Logitech USB Multimedia Keyboard: Found keys (II) Logitech USB Multimedia Keyboard: Configuring as keyboard (II) XINPUT: Adding extended input device "Logitech USB Multimedia Keyboard" (type: KEYBOARD) (**) Option "xkb_rules" "evdev" (**) Option "xkb_model" "pc105" (**) Option "xkb_layout" "us" (II) config/udev: Adding input device Logitech USB Multimedia Keyboard (/dev/input/event5) (**) Logitech USB Multimedia Keyboard: Applying InputClass "evdev keyboard catchall" (**) Logitech USB Multimedia Keyboard: always reports core events (**) Logitech USB Multimedia Keyboard: Device: "/dev/input/event5" (II) Logitech USB Multimedia Keyboard: Found keys (II) Logitech USB Multimedia Keyboard: Configuring as keyboard (II) XINPUT: Adding extended input device "Logitech USB Multimedia Keyboard" (type: KEYBOARD) (**) Option "xkb_rules" "evdev" (**) Option "xkb_model" "pc105" (**) Option "xkb_layout" "us" (II) config/udev: Adding input device KEYBOARD (/dev/input/event6) (**) KEYBOARD: Applying InputClass "evdev keyboard catchall" (**) KEYBOARD: always reports core events (**) KEYBOARD: Device: "/dev/input/event6" (II) KEYBOARD: Found keys (II) KEYBOARD: Configuring as keyboard (II) XINPUT: Adding extended input device "KEYBOARD" (type: KEYBOARD) (**) Option "xkb_rules" "evdev" (**) Option "xkb_model" "pc105" (**) Option "xkb_layout" "us" (II) config/udev: Adding input device KEYBOARD (/dev/input/event7) (**) KEYBOARD: Applying InputClass "evdev keyboard catchall" (**) KEYBOARD: always reports core events (**) KEYBOARD: Device: "/dev/input/event7" (II) KEYBOARD: Found 14 mouse buttons (II) KEYBOARD: Found scroll wheel(s) (II) KEYBOARD: Found relative axes (II) KEYBOARD: Found keys (II) KEYBOARD: Configuring as mouse (II) KEYBOARD: Configuring as keyboard (**) KEYBOARD: YAxisMapping: buttons 4 and 5 (**) KEYBOARD: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200 (II) XINPUT: Adding extended input device "KEYBOARD" (type: KEYBOARD) (**) Option "xkb_rules" "evdev" (**) Option "xkb_model" "pc105" (**) Option "xkb_layout" "us" (EE) KEYBOARD: failed to initialize for relative axes. 
(II) config/udev: Adding input device KEYBOARD (/dev/input/mouse2) (II) No input driver/identifier specified (ignoring) (II) config/udev: Adding input device Macintosh mouse button emulation (/dev/input/event2) (**) Macintosh mouse button emulation: Applying InputClass "evdev pointer catchall" (**) Macintosh mouse button emulation: always reports core events (**) Macintosh mouse button emulation: Device: "/dev/input/event2" (II) Macintosh mouse button emulation: Found 3 mouse buttons (II) Macintosh mouse button emulation: Found relative axes (II) Macintosh mouse button emulation: Found x and y relative axes (II) Macintosh mouse button emulation: Configuring as mouse (**) Macintosh mouse button emulation: YAxisMapping: buttons 4 and 5 (**) Macintosh mouse button emulation: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200 (II) XINPUT: Adding extended input device "Macintosh mouse button emulation" (type: MOUSE) (II) Macintosh mouse button emulation: initialized for relative axes. (II) config/udev: Adding input device Macintosh mouse button emulation (/dev/input/mouse0) (II) No input driver/identifier specified (ignoring) (II) fglrx(0): Restoring Recent Mode via PCS is not supported in RANDR 1.2 capable environments
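
    One detail that stands out in this log, with a sketch below: fglrx repeatedly warns "No matching Device section for instance (BusID PCI:0@2:0:0) found", meaning xorg.conf only describes the primary adapter, so the Radeon HD 4350 and the two monitors attached to it are never configured. With the proprietary Catalyst driver of this era, the usual way to generate Device/Screen sections for every installed adapter was aticonfig; the commands below are a starting point rather than a guaranteed fix.

        # Rewrite /etc/X11/xorg.conf with one Device/Screen section per adapter
        sudo aticonfig --initial -f --adapter=all
        # Restart X (log out and back in, or reboot), then re-check
        # System > Preferences > Monitors.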

    Read the article

  • How can I get the Terminal raster font to display alt codes in a text editor?

    - by grg-n-sox
    I am working on a project that includes making some ASCII art, except it isn't true ASCII art since I am using a far amount of Windows Alt codes to make it. Anyways, I wanted to make sure that as I am working on it, that it looks exactly how it will in a windows command prompt terminal session. So since command prompt defaults to the Terminal raster font, I figured I would use that. But I quickly noticed that when I use the Terminal typeface in a text editor, it will not render ASCII codes, either at all (as is the case most of the time) or incorrectly. Now, I understand if a font just doesn't support non-ASCII characters, but what I don't get is how the characters do show up correctly in command prompt when they don't in a text editor. I checked the output of the 'chcp' and it was set to 437 by default, which is what I need. Well, either that or 850 but preferably 437 since they got rid of some of the graphics in 437 and replaced them with other Latin characters. Command prompt terminal settings show I am using the Terminal raster font with a 8x12 glyph size. So I try using size 12 in the text editor but no good, even after switching the text encoding to either MS-DOS OEM-US (supposedly an alternative name for CP437) or UTF-8. I just don't get how I am not getting the characters to show up. Also, if it helps, the art I am making is basically modified screen shots from a game I play called Dwarf Fortress that uses characters from the Terminal/Curses typeset, or at least that is how it is reported in the forums by those who make graphics sets to replace the default character set. However, the game doesn't actually use the system's Terminal font. The game's data files includes a bitmap image that is a grid of all the characters the game uses. So it uses this bitmap to render graphics instead of the actual font file. And I basically want to get a text editor to make it so if I type up some ASCII art to look like a screenshot from Dwarf Fortress, that it will actually look like Dwarf Fortress other than the lack of color. Any help?

    Read the article

  • Nvidia GTS 360M in Asus laptop shuts off when gaming on external display

    - by mr odus
    I have had this problem on my Asus G51j (Intel i5, Nvidia GTS 360M graphics card). I just updated the drivers to the latest version from Nvidia's site. With most games I play, if I hook the laptop up to an external monitor (this happens whether it's HDMI or VGA), the laptop hard shuts down after about 20 minutes, give or take. This tends to happen in more graphically intense games like the Call of Duty titles and BioShock. I'm running Windows 7 with the latest Nvidia drivers. Games work fine on the laptop's own screen, and movies and general computing work fine on the external displays. The laptop always sits on a cooling pad, and the most recent time it was in front of my AC unit; I ran HeatFan (or whatever the heat-tracking software was), and my temperatures were normal right through a shutdown. This has been happening for the life of the laptop. I don't play very many games, and even fewer on an external monitor, so the issue doesn't come up much. Is it possible I have a faulty graphics card? Is there anything else I can try?

    Read the article

  • Computer restarts without warning; code bcc116

    - by Robert C.
    Processor: Intel i5 4430, 4 cores at 3 GHz
    Motherboard: MSI H87-G41
    Graphics card: Nvidia GTX 760
    Power supply: EPS-750 CM
    RAM: 8 GB

    I bought a newly assembled gaming PC which worked fine for a few days. Then it started rebooting without warning. After it restarts, Windows 7 gives me a BCCode 116 error. Apparently it's something to do with my video card, either overheating or the wrong drivers. I've installed the latest driver from Nvidia for my graphics card. Since the machine is brand new it can't be dust; I'm running it with the case open to see if the problem persists. I'm also running Prime95 now to see if it tells me anything else. Core Temp reports that my CPU reaches up to 95 °C with Prime95's Blend stress test, and it just peaked at 100 °C. It doesn't reach these temperatures at all while idle or gaming. I'm going to let Prime95 run overnight and see what happens. Until then, does anyone know what I should do next?

    Read the article

  • calling concurrently Graphics.Draw and new Bitmap from memory in thread take long time

    - by Abdul jalil
    Example1 public partial class Form1 : Form { public Form1() { InitializeComponent(); pro = new Thread(new ThreadStart(Producer)); con = new Thread(new ThreadStart(Consumer)); } private AutoResetEvent m_DataAvailableEvent = new AutoResetEvent(false); Queue<Bitmap> queue = new Queue<Bitmap>(); Thread pro; Thread con ; public void Producer() { MemoryStream[] ms = new MemoryStream[3]; for (int y = 0; y < 3; y++) { StreamReader reader = new StreamReader("image"+(y+1)+".JPG"); BinaryReader breader = new BinaryReader(reader.BaseStream); byte[] buffer=new byte[reader.BaseStream.Length]; breader.Read(buffer,0,buffer.Length); ms[y] = new MemoryStream(buffer); } while (true) { for (int x = 0; x < 3; x++) { Bitmap bmp = new Bitmap(ms[x]); queue.Enqueue(bmp); m_DataAvailableEvent.Set(); Thread.Sleep(6); } } } public void Consumer() { Graphics g= pictureBox1.CreateGraphics(); while (true) { m_DataAvailableEvent.WaitOne(); Bitmap bmp = queue.Dequeue(); if (bmp != null) { // Bitmap bmp = new Bitmap(ms); g.DrawImage(bmp,new Point(0,0)); bmp.Dispose(); } } } private void pictureBox1_Click(object sender, EventArgs e) { con.Start(); pro.Start(); } } when Creating bitmap and Drawing to picture box are in seperate thread then Bitmap bmp = new Bitmap(ms[x]) take 45.591 millisecond and g.DrawImage(bmp,new Point(0,0)) take 41.430 milisecond when i make bitmap from memoryStream and draw it to picture box in one thread then Bitmap bmp = new Bitmap(ms[x]) take 29.619 and g.DrawImage(bmp,new Point(0,0)) take 35.540 the code is for Example 2 is why it take more time to draw and bitmap take time in seperate thread and how to reduce the time when processing in seperate thread. i am using ANTS performance profiler 4.3 public Form1() { InitializeComponent(); pro = new Thread(new ThreadStart(Producer)); con = new Thread(new ThreadStart(Consumer)); } private AutoResetEvent m_DataAvailableEvent = new AutoResetEvent(false); Queue<MemoryStream> queue = new Queue<MemoryStream>(); Thread pro; Thread con ; public void Producer() { MemoryStream[] ms = new MemoryStream[3]; for (int y = 0; y < 3; y++) { StreamReader reader = new StreamReader("image"+(y+1)+".JPG"); BinaryReader breader = new BinaryReader(reader.BaseStream); byte[] buffer=new byte[reader.BaseStream.Length]; breader.Read(buffer,0,buffer.Length); ms[y] = new MemoryStream(buffer); } while (true) { for (int x = 0; x < 3; x++) { // Bitmap bmp = new Bitmap(ms[x]); queue.Enqueue(ms[x]); m_DataAvailableEvent.Set(); Thread.Sleep(6); } } } public void Consumer() { Graphics g= pictureBox1.CreateGraphics(); while (true) { m_DataAvailableEvent.WaitOne(); //Bitmap bmp = queue.Dequeue(); MemoryStream ms= queue.Dequeue(); if (ms != null) { Bitmap bmp = new Bitmap(ms); g.DrawImage(bmp,new Point(0,0)); bmp.Dispose(); } } } private void pictureBox1_Click(object sender, EventArgs e) { con.Start(); pro.Start(); }

    Read the article

  • Reducing video mode switching during Linux boot

    - by Zack
    When I boot up my desktop computer, which only has Linux on it, the video mode and/or console font gets switched four times: When GRUB starts, it switches from 80x25 text to a graphical mode so it can draw a pretty background behind its menu; GRUB then goes back to 80x25 text after I pick something from the menu; When the KMS driver for my video card loads, it switches to a much higher-resolution text mode (I don't know if this is a hardware text mode or not); Finally X starts and it goes graphics and stays that way. I think this last switch does not change the resolution of the video mode, only the graphicalness. I'd like to get rid of as many of these mode switches as possible. Ideally, when GRUB takes over from the BIOS it would go directly to the same high-resolution text mode that the KMS driver selects, and the display would stay in that mode till X starts and brings up graphics. I am under the impression that this is possible by mucking with the kernel command line and/or the GRUB console module load parameters, but I don't know the details. GRUB 1.98+20100706, kernel 2.6.32.15 using Nouveau video drivers. Distro is Debian unstable. Please no answers that involve recompiling anything or cobbling together bleeding-edge kernel/driver combinations, I don't care enough about this to go to that much trouble. EDIT: Tobu suggests setting GRUB_GFXMODE to the full pixel resolution of the monitor, and GRUB_GFXPAYLOAD_LINUX=keep to avoid the mode switch after the menu goes away. This does part of what I want, but winds up being worse overall. There's no mode switch after the menu, but there's still a painfully-slow screen repaint (I should probably just give up on GRUB's gfxmode, it's waaaay too slow at 1920x1200). More seriously, there's now a double mode switch when nouveaufb loads, along with fun-looking error messages in dmesg [ 5.923798] [drm] nouveau 0000:02:00.0: allocated 1920x1200 fb: 0x40250000, bo ffff8801ba5f4600 [ 5.923802] fb: conflicting fb hw usage nouveaufb vs EFI VGA - removing generic driver [ 5.923821] [drm] nouveau 0000:02:00.0: PFIFO_INTR 0x00000010 - Ch 1 ("PFIFO_INTR" message repeats 400+ times) [ 5.925609] Console: switching to colour dummy device 80x25 [ 5.925802] Console: switching to colour frame buffer device 240x75
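
    For reference, the two settings discussed in the edit live in /etc/default/grub; a sketch of the configuration that was tried (values are examples, not a recommendation given the slow 1920x1200 repaint noted above):

        # /etc/default/grub
        GRUB_GFXMODE=1920x1200x32      # mode GRUB uses for its own menu
        GRUB_GFXPAYLOAD_LINUX=keep     # keep that mode when handing off to the kernel
        # Regenerate the configuration afterwards:
        #   sudo update-grub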

    Read the article

  • How I can fix the "Display Driver has stopped responding and has recovered"?

    - by Vitor Rangel
    I'm using a GeForce GTX 580 with Windows 7 64-bit. The driver version is 301.42. The problem appears after a few minutes, when I'm playing specific games. It doesn't happen in all games, and I have no idea why these particular titles fail. Games that don't work: Battlefield 3, Civilization V, Sniper Elite V2. Games that work: Mass Effect 3, Crysis 2, Team Fortress 2, Left 4 Dead 2, Skyrim, L.A. Noire. As you can see, it's not a case of "the most demanding games stop working". I've tried updating the graphics card driver and the motherboard BIOS, and I even reformatted my computer (it needed it) and installed the newest version of every driver. The problem has been happening since I bought the graphics card six months ago. After 10 to 20 minutes, the pixels on the monitor become strange, with random colors and artifacts, as if the card were broken. Then everything goes black and the message "Display Driver has stopped responding and has recovered" appears. After that, I need to close the game and start it again. I am not overclocking, and my temperature never goes higher than 70 °C.

    Read the article

  • How to verify system using right GPU, after system reset [duplicate]

    - by Antoros
    OS: Windows 8
    CPU: Intel Core i7 3635QM
    GPU 1: Intel HD Graphics 4000
    GPU 2: AMD Radeon HD 8870M

    Problem: I'm not sure whether CCC (Catalyst Control Center) is using the AMD card instead of the Intel one. I have run into several issues since updating to 8.1 and I don't know what to do.

    What happened: I installed the 8.1 update on day one. After a minute of use I got a BBSOD and Windows never loaded again. System Restore wouldn't recognize the 8.0 restore points, so I did a system reset to Windows 8, since the laptop was only three weeks old. The reset went badly: it restored to factory but kept the registry almost intact, so I had to reinstall nearly everything, because the factory drivers were running against the updated registry, which caused several problems; CCC broke too.

    What I've already done: Installing new drivers on top of the old ones didn't work, so I used the AMD uninstaller first. I uninstalled and reinstalled Intel's HD Graphics driver. I tried to install the Mobility driver package, but AMD's installer told me it wasn't compatible (even though that's the only driver they provide on their page). I tried Auto-Detect, but it couldn't install the driver because the card was disabled — because it had no drivers (see the problem there?). I had to use a workaround with Samsung Update: the driver didn't appear as a download, so I searched for it and downloaded it manually. Now the graphics card appears in Device Manager and in Catalyst, but only as "8800 series" (not the exact model), and I can't inspect the card with dxdiag, GPU-Z, or HWMonitor. When right-clicking on CCC, only the Intel card appears. Launching a game set to "high performance" seems to speed it up a little, but I can't be sure.

    How can I verify it is working properly? HWMonitor won't show the AMD card even when set to high performance, and the latest GPU-Z won't run because of a problem with the Intel adapter (legacy versions won't either). What can I do now? I don't even know whether I fixed my problem. I also want to use Adobe Premiere with the AMD card, but that option (running it on the AMD card rather than the Intel one) is locked.

    Edit: it now seems to work, but I still can't change the GPU setting for Adobe Premiere and the other programs that need it.

    Read the article

  • Viewing Postscript (or PDF) on OS X: Aliasing issues

    - by mankoff
    I am generating PostScript graphics and am trying to find a balance between no anti-aliasing and too much anti-aliasing. If I use the raw Ghostscript viewer gs on the PostScript, it looks good: the text appears anti-aliased, but the image remains nice and blocky. Unfortunately, gs has no real user interface and loses all of the nice things that Preview.app has. I could install gv, but the dependency bloat is huge (it requires all of GNOME), and even that isn't a great viewer compared to Preview.app or Skim.app. Here is an image viewed with gs: [screenshot omitted]. From a user-interaction and Mac-ish perspective, Preview.app (or Skim.app) is a much nicer program to use. They have the option to turn aliasing on or off, but neither option looks very good. With anti-aliasing on, the image is blurry. With it off, the graphic matches what is seen from gs, but there are two issues. Minor issue: the font is ugly, uglier than with gs. Major issue: every PDF is un-aliased, making it hard to read regular PDFs full of text. So, in summary: Is there a way to manually generate the PDF from the PS that overcomes these issues? Is there a way to find a middle ground of aliasing/no aliasing with Preview.app? Is there another app that displays with quality like gs but has a decent UI like Skim.app or Preview.app? Is there a way to have Preview.app turn off anti-aliasing for only one file (containing graphics) but leave it enabled in general, so that text PDFs are still readable?

    Read the article

  • Windows 7 x64 support for Intel GMA 3650 (or GMA 3600)

    - by Loom
    I recently purchased an Intel D2700MUD motherboard and I cannot find Windows 7 x64 drivers for the integrated graphics (Intel GMA 3650, aka PowerVR SGX545). The accompanying CD contains the Win7 x32 version only; when I run it I get an error: "This computer does not meet the minimum requirements for installing the software." I tried the online Intel Driver Update Utility for graphics, using Chrome, Firefox, and Internet Explorer, without success: first a UAC prompt appears, then an endlessly spinning progress bar with the text "Analyzing computer...". The UAC prompt reads: Program file name: System Requirements Lab, Verified publisher: Husdawg, LLC. I downloaded this utility (intel_srldetect_4.5.5.0) and started it from my hard disk, and got an error: "A network error occurred while attempting to read from the file: C:\Users\Loom\Downloads\SystemRequirementsLab_intel_4.5.5.0.msi". The standard VGA driver works for this video card, but without hardware acceleration: "Hardware acceleration is either disabled or not supported by your video card driver, which could slow game performance. Make sure you have the latest video card driver installed and that hardware acceleration is turned on." Where can I get an appropriate driver?

    Read the article

  • Should I switch my graphics mode in the BIOS to avoid using Bumblebee?

    - by Fawkes5
    I have just purchased an Acer 3830TG from the TimelineX series. To my surprise, I found out that there is no first-party support for Nvidia Optimus on Linux. Bumblebee works great, but the battery life with the discrete graphics card always running is not so great. I don't use Linux for games, so I don't really need the graphics card on; I have Windows for that. In my BIOS I have the ability to change my graphics mode from switchable to integrated. If I do this and reinstall Ubuntu, what will happen? Will my Nvidia card just turn off? Will everything work properly, as if I'm not running an Optimus laptop? Is this recommended as opposed to dealing with Bumblebee? What is the best thing I could do?

    Read the article

  • Why am I stuck at 640x480 on an Optimus hybrid-graphics system?

    - by exilada
    I have an onboard Intel HD 3000 graphics card and an Nvidia GeForce 520MX card with Optimus technology. I tried to install the Nvidia driver but the installation failed. Now I can't use anything: I have a single 640x480 resolution, every output shows as disconnected, and I can't connect an external display. $ xrandr Screen 0: minimum 320 x 200, current 640 x 480, maximum 8192 x 8192 LVDS1 connected 640x480+0+0 (normal left inverted right x axis y axis) 344mm x 194mm 640x480 59.9* VGA1 disconnected (normal left inverted right x axis y axis) HDMI1 disconnected (normal left inverted right x axis y axis) DP1 disconnected (normal left inverted right x axis y axis) $ lspci | grep VGA 00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09) $ glxinfo | grep vendor server glx vendor string: SGI client glx vendor string: Mesa Project and SGI OpenGL vendor string: Tungsten Graphics, Inc Something looks wrong there, I guess. I have tried a few solutions, but they didn't work; nvidia-xconfig can't even write an xorg.conf file after these attempts. By the way, the system sometimes reports errors about Xorg. Sorry for my English, and thanks for your help.
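
    A recovery path that is often suggested in this situation — only a sketch, since the exact package names depend on the Ubuntu release — is to purge the proprietary Nvidia packages so Mesa's GL libraries come back, restore the Intel X stack, and reboot; Bumblebee can then be set up later if the discrete GPU is needed.

        # Remove the proprietary driver and any xorg.conf it generated
        sudo apt-get purge 'nvidia*'
        sudo rm -f /etc/X11/xorg.conf
        # Reinstall the Mesa/Intel pieces the Nvidia installer may have displaced
        sudo apt-get install --reinstall xserver-xorg-core xserver-xorg-video-intel libgl1-mesa-glx libgl1-mesa-dri
        sudo reboot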

    Read the article

  • How To Draw More Precise Lines using Core Graphics and CALayer

    - by user308444
    Hello, I am having a hard time making this UI element look the way I want (see screenshot). Notice how, in the image on the right, the line width and darkness look inconsistent compared to the image on the left (which happens to be a screen grab from Safari), where the border width is more consistent. How does Apple make its lines so perfect? I'm using a CALayer and the Core Graphics API to draw the image on the right. Is it possible to draw such precise lines with the standard APIs?

    Read the article

  • Qt Graphics Scene mouse event propagation

    - by Olorin
    Hello, I'm learning Qt and I'm doing the following to add some widgets to a graphics scene: void MainWindow::addWidgets(QList<QWidget *> &list, int code) { if(code == CODE_INFO) { QWidget *layoutWidget = new QWidget(); QVBoxLayout *layout = new QVBoxLayout(); foreach(QWidget *w, list) { layout->addWidget(w); this->connect(((ProductInfo*)w), SIGNAL(productClicked()), this, SLOT(getProductDetails())); } layoutWidget->setLayout(layout); this->scene->addWidget(layoutWidget); } } My ProductInfo class processes the mouse release event and emits a signal: void ProductInfo::mouseReleaseEvent(QMouseEvent *e) { QWidget::mouseReleaseEvent(e); emit productClicked(); } The problem is that after adding the widgets to the scene they no longer receive the mouse release event and don't emit the productClicked signal, but if I add them to the main window (not to the scene) they work as expected. What am I doing wrong?

    Read the article
