Search Results

Search found 6703 results on 269 pages for 'amd graphics'.

  • Cannot get ATI Drivers installed

    - by bittoast67
    I am trying to install the Catalyst driver. The best result I can get is a strange resolution problem, with Firefox acting all wonky. The worst I have gotten is low-graphics mode, in which case I just reinstall Ubuntu. I have an HP Pavilion dv7 laptop with a Radeon HD 3200. I plan to try again with a fresh install of Ubuntu 12.04.3, as I have heard it is the most compatible.

    This is what I have done. I tried the easy way first: installing the fglrx package (not fglrx-updates) through Synaptic. If memory serves, that boots me into low-graphics mode. So, fresh install of Ubuntu, and I tried again. I have gone through everything on this site a couple of times (http://wiki.cchtml.com/index.php/Ubuntu_Precise_Installation_Guide), following every instruction to a T. That gets me something, such as a lower fan speed and a much cooler computer, but I also lose most of my resolution, and Displays says it is the best resolution I can get. Firefox is also very screwy. With this method I can see AMD Catalyst Control Center in my Dash (two of them, really: one administrator and one not), but when I try to open it, it says no AMD driver is detected. So again, Ubuntu reinstall. I have tried the GUI method with the legacy driver from AMD's site. It runs through smoothly, but at the very end, after I exit the installer, it gives me an error. I have also tried various other methods using the terminal, as well as various other drivers (the one from AMD's site and the one suggested in the link above for my graphics card), all to no avail. When I try the method from the link in number 2 above, I get the super-low resolution and the screwy Firefox, and typing fglrxinfo gives me a BadRequest error. I have yet to run fglrxinfo and get anything like what I am supposed to.

    UPDATE: I am now reinstalling Ubuntu 12.04. On the previously failed driver attempt I tried the purge commands from the link mentioned above (thank you very much!), but to no avail: typing fglrxinfo still gives the BadRequest error. I will update again after a try with a true fresh install. Thanks again!

    UPDATE: Alright everyone, still no go. I have done everything word for word in the provided tutorial. I have rebooted my computer, again to a messed-up resolution, and this is what I get when typing fglrxinfo:

      $ fglrxinfo
      X Error of failed request: BadRequest (invalid request code or no such operation)
      Major opcode of failed request: 153 (GLX)
      Minor opcode of failed request: 19 (X_GLXQueryServerString)
      Serial number of failed request: 12
      Current serial number in output stream: 12

    I would like to add that when installing the file fglrx_8.970-0ubuntu1_amd64.deb I got this:

      Building initial module for 3.8.0-29-generic
      Error! Bad return status for module build on kernel: 3.8.0-29-generic (x86_64)
      Consult /var/lib/dkms/fglrx/8.970/build/make.log for more information.
      update-initramfs: deferring update (trigger activated)
      Processing triggers for ureadahead ...
      Processing triggers for bamfdaemon ...
      Rebuilding /usr/share/applications/bamf.index...
      Processing triggers for initramfs-tools ...
      update-initramfs: Generating /boot/initrd.img-3.8.0-29-generic
      Processing triggers for libc-bin ...
      ldconfig deferred processing now taking place

    Any ideas? Anyone? I can't for the life of me figure out what I am doing wrong.
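
    A commonly suggested recovery path, sketched below, is to purge every Catalyst package and its leftover X configuration and fall back to the open-source radeon driver before retrying; the package names are the usual 12.04-era ones and are an assumption, not something confirmed in the post.

      # Hedged sketch: remove the failed fglrx install and restore the open driver stack
      sudo apt-get purge 'fglrx*'           # drop every Catalyst package, including fglrx-amdcccle
      sudo rm -f /etc/X11/xorg.conf         # let X auto-detect instead of using the fglrx-written config
      sudo apt-get install --reinstall xserver-xorg-video-radeon libgl1-mesa-glx libgl1-mesa-dri
      sudo reboot                           # X should come back up on the open radeon driver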

    Read the article

  • Low Profile AMD Cooling

    - by J. T.
    I have a few setups where I can mount two motherboards on top of each other. They are running AMD Athlon 64 X2 4200 dual-core CPUs with very low-profile CPU coolers. These coolers are loud and annoying. Does anyone know of a low-profile, quiet CPU cooling solution?

    Read the article

  • Slow internet problem on Kubuntu 9.10 amd64

    - by Abdelkrim-NET
    Hi everyone, I've recently installed Kubuntu 9.10 amd64 on an HP 6830s. The installation went smoothly, with only one problem: when I try to use the internet, it is incredibly slow. I couldn't install Firefox or anything else that requires the internet. I have tried most of the solutions I found through Google, but none of them solves the problem. If anyone has an idea, I would appreciate the help.
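
    A minimal first-pass diagnostic, assuming a NIC the kernel can see; the interface name and test addresses below are placeholders, not details from the post.

      # Hedged diagnostic sketch: identify the NIC, check the link, and separate DNS delay from bandwidth
      lspci | grep -i -e ethernet -e network   # which network chips does the kernel see?
      sudo ethtool eth0 | grep -i speed        # negotiated link speed (eth0 is a placeholder)
      time nslookup example.com                # a multi-second delay here points at DNS, not throughput
      ping -c 4 8.8.8.8                        # raw latency and packet loss to a well-known address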

    Read the article

  • AMD Opteron BSOD - A clock interrupt was not received on a secondary processor within the allocation time interval

    - by laurens
    On one of our servers we got a BSOD that is new to me, with the error message: "A clock interrupt was not received on a secondary processor within the allocation time interval". The server's specs: HP xw9400, 2x dual-core AMD Opteron 2224 SE @ 3.20 GHz (4 CPUs), 16 GB HP ECC RAM, Windows 2008 R2 Enterprise 64-bit, Nvidia Quadro 4xxx graphics (?). Does this sound familiar to anyone? Thanks in advance.

    Read the article

  • Hyper-V Blue Screens with Nvidia GeForce 8400 GS Graphics Card

    - by Mahmoud Saleh
    I am using Windows Server 2008 R2 Enterprise x64. After installing the Hyper-V role and restarting the machine, I get a blue screen error and an immediate reboot. I have Googled the issue and tracked it down to the graphics card, so I uninstalled it, and then Windows loads fine. However, after installing the graphics driver again, the Blue Screen returns. The graphics card is an Nvidia GeForce 8400 GS. Does anyone know how I can resolve this issue?

    Read the article

  • Third monitor with AMD cards WITHOUT Eyefinity

    - by Resorath
    I have two AMD Radeon HD 6870 video cards in CrossFireX configuration and I would like to add a third monitor. I understand to use "Eyefinity" you need to use an active mini displayport to DVI adapter. I am not interested in the benefits of "Eyefinity", I just want a third monitor with Windows extended desktop. Is it possible to use either the HDMI head on the first card or the DVI heads from the second card to get a third monitor running without "Eyefinity" and an active adapter?

    Read the article

  • How do I keep from running out of memory on graphics for an Android app?

    - by user279112
    I've been working on an Android app in Eclipse, and so far my program hasn't really grown very large. However, I've already run into an Out of Memory error. I've been using graphics comprised solely of bitmaps and PNGs in this program, and recently, when I tried to add a little more functionality (mainly a few more bitmaps and an extra sprite), it started crashing in the graphics thread's constructor - specifically, the sprite's constructor. When I tracked the problem down, it turned out to be an Out of Memory error seemingly caused by adding too many picture files to the program and creating Drawables out of them. This is a problem, because I really don't have that many picture resources in the program - maybe 20 or so - and I haven't even started to include sound yet. These images aren't all that fancy. My questions are: 1) Are Android apps really that limited in how much memory they can use, or is the error probably caused by something other than the 20-30 resource pictures? 2) If the memory available to an Android app really can't handle 20-30 picture resources loaded into Drawables at the same time, then how are you supposed to make decent graphics and sound? Thanks.

    Read the article

  • AMD Fusion GPU passthrough to KVM or Xen

    - by BigChief
    Has anyone successfully gotten passthrough working with the GPU portion of AMD's Fusion APUs (the E-350 is my target) on top of a Linux hypervisor? That is, I want to dedicate the GPU to one VM only, excluding all other VMs as well as the host. I know PCI passthrough can work with patches / kernel rebuilds for Xen and KVM. However, since the GPU is on the same chip, I don't know if the host OS will see it as PCI. I know there are a number of tangential issues here, such as: poor Fusion drivers in Linux at the moment; unsuccessful patching efforts seem common; VT-d / IOMMU is required and (from my reading) is supported on the APU, but the motherboard may not offer it; and KVM doesn't appear to support primary graphics cards, only secondary graphics cards (described here). Still, I'd like to hear from anyone who has messed with this, even failed attempts. Fedora + KVM is my preferred virtualization platform, but I'm willing to change that if it makes a difference. EDIT: The goal is to do this for a Windows 7 guest (I know it's asking a lot). Regardless, just assume this is HVM, not PV.
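
    Before any hypervisor work, it is worth confirming that the host actually exposes the APU's GPU as an ordinary PCI function and that the IOMMU is active; a hedged sketch, using the usual kernel parameter and grep patterns rather than anything taken from the post:

      # Does the host see the Fusion GPU as a PCI device at all?
      lspci -nn | grep -i vga                  # note the bus address and [vendor:device] ID
      # Is an AMD IOMMU detected and enabled? (usually requires amd_iommu=on on the kernel command line)
      dmesg | grep -i -e iommu -e amd-vi
      cat /proc/cmdline                        # confirm amd_iommu=on (and optionally iommu=pt) is present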

    Read the article

  • Can't load IA 32-bit .dll on an AMD 64-bit platform

    - by user101425
    I have a Windows 2003 64-bit terminal server from which we run a Java application. The application had always worked until 2 days ago, and no new updates have been installed on the server in that time frame. I have tried reinstalling 64-bit Java but still get the following error:

      Unexpected exception: java.lang.reflect.InvocationTargetException
      java.lang.reflect.InvocationTargetException
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
          at java.lang.reflect.Method.invoke(Unknown Source)
          at com.sun.javaws.Launcher.executeApplication(Unknown Source)
          at com.sun.javaws.Launcher.executeMainClass(Unknown Source)
          at com.sun.javaws.Launcher.doLaunchApp(Unknown Source)
          at com.sun.javaws.Launcher.run(Unknown Source)
          at java.lang.Thread.run(Unknown Source)
      Caused by: java.lang.UnsatisfiedLinkError: C:\Documents and Settings\administrator\Application Data\Sun\Java\Deployment\cache\6.0\19\625835d3-5826d302-n\swt-win32-3116.dll: Can't load IA 32-bit .dll on a AMD 64-bit platform
          at java.lang.ClassLoader$NativeLibrary.load(Native Method)
          at java.lang.ClassLoader.loadLibrary0(Unknown Source)
          at java.lang.ClassLoader.loadLibrary(Unknown Source)
          at java.lang.Runtime.loadLibrary0(Unknown Source)
          at java.lang.System.loadLibrary(Unknown Source)
          at org.eclipse.swt.internal.Library.loadLibrary(Library.java:100)
          at org.eclipse.swt.internal.win32.OS.<clinit>(OS.java:18)
          at org.eclipse.swt.graphics.Device.init(Device.java:563)
          at org.eclipse.swt.widgets.Display.init(Display.java:1784)
          at org.eclipse.swt.graphics.Device.<init>(Device.java:99)
          at org.eclipse.swt.widgets.Display.<init>(Display.java:363)
          at org.eclipse.swt.widgets.Display.<init>(Display.java:359)
          at com.ko.StartKO.main(StartKO.java:57)
          ... 9 more

    Read the article

  • Stable Ubuntu distribution for a Broadcom BCM4313 wireless driver and an Nvidia GT 630M graphics adapter

    - by Vivek Pradhan
    I have been trying to shift completely to Ubuntu or another Linux distro for almost 2 years now. I have tried all the Ubuntu releases from 10.04 through 11.10, but there have always been bugs with the display drivers or the wireless card not being recognised, and the additional drivers suggested by the Ubuntu community have not always done the trick. I tried hard to fix bugs, checked a lot of forums and Launchpad, and got some of them fixed, but I could not really get a neat and complete Ubuntu machine set up until now. I really like the open source community and the Linux platform, and my PC at home runs Ubuntu 11.04 perfectly, but there have always been glitches with Ubuntu on my laptop that have forced me to stick with Windows. I am currently on an HP dv6 laptop that has a Broadcom BCM4313 wireless chip and an Nvidia GT 630M (Optimus) GPU. I tried to do some research on the support for these in Linux but could not get anywhere, so I would really appreciate it if you could suggest a Linux distro that fully supports this hardware or has stable bug fixes for these kinds of issues. I tried Precise Pangolin (LTS) through a live CD, but I still see a problem with wifi networks, which is a little frustrating. Please help me find the perfect match for my laptop :P I would gladly provide any other information necessary.
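
    On 12.04-era releases the usual route is the Broadcom STA package for the BCM4313 and Bumblebee for the Optimus GPU; the package and PPA names below are the commonly used ones and should be treated as an assumption rather than something verified for this exact laptop.

      # Broadcom BCM4313: proprietary STA driver from the restricted repository
      sudo apt-get install bcmwl-kernel-source
      # Nvidia Optimus (GT 630M): Bumblebee renders on the Nvidia GPU on demand
      sudo add-apt-repository ppa:bumblebee/stable
      sudo apt-get update
      sudo apt-get install bumblebee bumblebee-nvidia
      optirun glxinfo | grep -i renderer       # should report the Nvidia GPU if the setup worked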

    Read the article

  • Mixing XNA and Silverlight gives weird graphics

    - by Mech0z
    I'm making a small 3D game as a Silverlight + XNA app, but when I draw the sprites the graphics become all weird. All my primitive types are rendered correctly, but my 3D models look wrong. My Draw method looks like this when Silverlight is set to draw:

      private void OnDraw(object sender, GameTimerEventArgs e)
      {
          // Render the Silverlight controls using the UIElementRenderer
          elementRenderer.Render();

          // Clear the screen to a solid color
          SharedGraphicsDeviceManager.Current.GraphicsDevice.Clear(Color.CornflowerBlue);

          switch (gameState)
          {
              case GameState.ChooseStarter:
                  TextBlockStatus.Text = "Find Starting Player";
                  break;
              case GameState.PlaceBrick:
                  TextBlockPlayer.Text = (playerTurn == PlayerTurn.PlayerOne) ? "Player One" : "Player Two";
                  TextBlockState.Text = "Place Brick";
                  foreach (IGraphicObject obj in _3dObjects)
                  {
                      obj.Draw(cameraPosition, e);
                  }
                  break;
              case GameState.GiveBrick:
                  TextBlockState.Text = "Give Brick";
                  break;
          }

          spriteBatch.Begin();
          // Using the texture from the UIElementRenderer,
          // draw the Silverlight controls to the screen
          spriteBatch.Draw(elementRenderer.Texture, cameraProjection, Color.White);
          spriteBatch.End();
      }

    This gives me the wrong output. If I comment out the spriteBatch lines I get the correct output, except that the Silverlight text is of course not shown. I am not entirely sure what to look for, except that zero vector I am giving to the spriteBatch; but if that is the source of the problem, I have no idea what I am supposed to set it to, especially since it is a 2D vector.

    Read the article

  • Keeping game model and graphics/animation separate but in sync

    - by AJM
    Suppose I'm building a chess game where I want to have animations. Pieces glide to their new squares when moved. Pieces perform attack animations when capturing other pieces. I'm not sure how to effectively separate the data and logic needed for these animations and the actual game model (in the MVC sense). The pieces themselves should ideally not have to worry about their pixel coordinates or current animation frame. At the same time, many changes to the model are effectively driven by animations. A moved piece changes its position after (before?) its sprite is done gliding. A piece is removed from the board after the capturing piece is finished its attack animation. How would you suggest I manage the game model, the graphics and animations, and their relationships? For example, where would the animations "live"? How would animations be created and managed in response to player moves? How would animations drive updates to the game model, or how would the game model drive animations?

    Read the article

  • Designing generic render/graphics component in C++?

    - by s73v3r
    I'm trying to learn more about component entity systems, so I decided to write a Tetris clone. I'm using the "style" of component-entity system where the Entity is just a bag of Components, the Components are just data, a Node is a set of Components needed to accomplish something, and a System is a set of methods that operates on a Node. All of my components inherit from a basic IComponent interface. I'm trying to figure out how to design the Render/Graphics/Drawable components. Originally I was going to use SFML, and everything was going to be good. However, as this is an experimental system, I got the idea of being able to change out the render library at will. I thought that since the rendering would be fairly componentized, this should be doable. However, I'm having problems figuring out how to design a common interface for the different types of render components. Should I be using C++ template types? It seems that having the RenderComponent somehow return its own mesh/sprite/whatever to the RenderSystem would be the simplest, but that would be difficult to generalize. However, letting the RenderComponent just hold data about what it would render would make it hard to reuse this component for different renderable objects (background, falling piece, field of already-fallen blocks, etc.). I realize this is fairly over-engineered for a regular Tetris clone, but I'm trying to learn about component entity systems and making interchangeable components. It's just that rendering seems to be the hardest part to split out for me.

    Read the article

  • NVIDIA Graphics - resolution problems with new 12.04 LTS installation

    - by Daveisuser56810
    I've been trying to install Ubuntu 12.04 LTS on my desktop for most of the day. The desktop uses an Nvidia GeForce 9800 (GT, I think) graphics card, and I am unable to set the correct resolution (1680x1050) for the display. The first problem I had was the "black screen" during install, which I overcame by using the "nomodeset" switch in the install options (once I'd found out how to do that). The second problem, of course, was the black screen after the first reboot. Once again this was overcome with "nomodeset", this time by editing the GRUB entry. This gave me a resolution of 1280x768, which the Displays GUI allowed me to change to 1280x720 (it appears to fit the screen). I then tried to install the Nvidia drivers: 1) using Additional Drivers, and 2) manually, by downloading the driver and installing it as root. As soon as the Nvidia drivers are installed, the resolution becomes restricted to 640x480 (max). At this resolution the Ubuntu GUI is not usable, as most windows are larger than the display. Removing the Nvidia driver and removing the xorg.conf file does not lift this restriction. I have tried most things that I have found and that were vaguely intelligible, but nothing appears to get me closer to a resolution of 1680x1050. UPDATE: I reinstalled Ubuntu 12.04 and used "nomodeset" in GRUB to restore the resolution to 1280x720, which is at least usable. I will live with this for now.
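
    A common way out of the 640x480 lock-in is to fully remove every trace of the driver (both the .run installer and the packaged one) along with the generated xorg.conf, then reinstall the packaged driver in one pass; a sketch, with package names as they were in the 12.04 repositories (an assumption, not something stated in the post):

      # If the driver came from NVIDIA's .run installer, remove it with its own uninstaller first
      sudo nvidia-uninstall                    # only exists if the manual installer was used
      # Clear the packaged driver and the generated X configuration
      sudo apt-get purge nvidia-current nvidia-settings
      sudo rm -f /etc/X11/xorg.conf
      # Reinstall the packaged driver and let it write a fresh configuration
      sudo apt-get install nvidia-current
      sudo nvidia-xconfig
      sudo reboot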

    Read the article

  • Restart and/or graphics problem in Ubuntu 12.04

    - by kara
    I have been using 12.04 for a couple of months now, with very few problems. The other day I restarted my computer, and though I think it rebooted, the screen stayed black. I could not even get a picture from a live CD. Finally I was able to get it to load, but the resolution has been completely off: the computer thinks I have a laptop screen, when I actually have a ViewSonic VP2330wb, and it detects only two resolutions. I still have a problem with rebooting as well. If the screen locks after I leave it for a while, I can't get a picture back, and when I then force a shutdown it takes 3 tries for me to get a GRUB screen. Then I have to boot in recovery mode, and finally in normal mode, but the display is still always off. This is my video card:

      description: VGA compatible controller
      product: 2nd Generation Core Processor Family Integrated Graphics Controller
      vendor: Intel Corporation
      physical id: 2
      bus info: pci@0000:00:02.0
      version: 09
      width: 64 bits
      clock: 33MHz
      capabilities: msi pm vga_controller bus_master cap_list
      configuration: latency=0
      resources: memory:fe000000-fe3fffff memory:d0000000-dfffffff ioport:f000(size=64)

    I am a new Ubuntu user and am at my wits' end. Any help would be greatly appreciated.
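
    When a monitor is mis-detected, a custom mode can usually be added by hand with cvt and xrandr; the sketch below assumes the ViewSonic's native mode is 1920x1080 at 60 Hz and that the output is named VGA1 (both are placeholders to adjust from the output of xrandr -q).

      xrandr -q                                # list detected outputs and the modes X currently offers
      cvt 1920 1080 60                         # print a modeline for the assumed native resolution
      # Paste the modeline printed by cvt after --newmode (mode name first, then the timings):
      xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
      xrandr --addmode VGA1 "1920x1080_60.00"  # attach the mode to the mis-detected output
      xrandr --output VGA1 --mode "1920x1080_60.00"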

    Read the article

  • Hybrid Graphics on Ubuntu 12.04 switching to discrete

    - by cfstras
    I have a Sony Vaio VPCCB-27FX with hybrid graphics. Using vgaswitcheroo lets me switch my discrete card off to save power. But when I want to switch to the discrete card for performance, my system freezes. I have already tried logging out and killing X with service lightdm stop, but it still freezes as soon as I echo DIS > switch. Typing blindly, echo IGD > switch returns me to my console, where it reads [ 179.555171] i915: switched off, but it seems the discrete card never gets switched on. Running echo DDIS > switch gives me the following:

      [540....] [drm:atop_op_jump] *ERROR* atombios stuck in loop for more than 5secs aborting
      [540....] [drm:atom_execute_table_locked] *ERROR* atombios stuck executing CEE2 (len 62, WS 0, PS 0) @ 0xCEFE
      [540....] [drm:atom_execute_table_locked] *ERROR* atombios stuck executing BBF6 (len 1036, WS 4, PS 0) @ 0xBCF3
      [540....] [drm:atom_execute_table_locked] *ERROR* atombios stuck executing BB8C (len 76, WS 0, PS 0) @ 0xBB94
      [541....] [drm:r600_RING_TEST] *ERROR* radeon: ring test failed (scratch(0x8504)=0xFFFFFFFF)
      [541....] [drm:evergreen_resume] *ERROR* evergreen startup failed on resume

    After that, the atombios part repeats a few times. Also, the terminal locks up again and SysRq+REISUB is my only rescue. Does anybody have an idea how I can switch to my discrete card without the system locking up?

      # uname -srvmpio
      Linux 3.2.0-24-generic #39-Ubuntu SMP Mon May 21 16:52:17 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
      # lsb_release -r
      Description: Ubuntu 12.04 LTS
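
    The vgaswitcheroo interface distinguishes immediate switches (DIS/IGD) from deferred ones (DDIS/DIGD), and it also lets you power the inactive card on without switching; a sketch of the more forgiving sequence, assuming debugfs is mounted at the usual location and a root shell:

      cd /sys/kernel/debug/vgaswitcheroo
      cat switch                     # shows which GPU is active (+) and the power state of each
      echo ON > switch               # power the inactive (discrete) card back on without switching yet
      echo DDIS > switch             # request a *deferred* switch to the discrete card...
      service lightdm restart        # ...which only takes effect when X restarts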

    Read the article

  • Ubuntu 14.04 install fails with Via S3 UniChrome Pro graphics

    - by WizardNo.7
    I am trying to install Ubuntu 14.04 on a Fujitsu Siemens Amilo Pro laptop (it's quite old, yes: about a 30 GB hard drive and, I think, 192 MB of RAM) which currently has Windows XP installed (which I'd like to keep for the time being). I have downloaded the 32-bit desktop ISO and used unetbootin to create a live USB for this laptop. When I boot from USB, I arrive at the unetbootin grey-and-blue menu and pick either "Try Ubuntu without installing" or "Install Ubuntu". The menu goes away and an Ubuntu load screen appears, showing UBUNTU and four dots which progressively change between white and orange. At about the second colour-changing cycle, a flickering white underscore appears next to the fourth dot. There is some leftover text from the kernel boot still visible, but no graphical desktop ever appears. After this I have to hard reboot or shut down.

      $ lshw -c video
      *-display UNCLAIMED
         description: VGA compatible controller
         product: CN700/P4M800 Pro/P4M800 CE/VN800 Graphics [S3 UniChrome Pro]
         Vendor: VIA Technologies, Inc.
         physical id: 0
         bus info: pci@0000:01:00.0
         version: 01
         width: 32 bits
         clock: 66MHz
         capabilities: pm agp agp-2.0 vga_controller bus_master cap_list
         configuration: latency=64 mingnt=2
         resources: memory:f0000000-f3ffffff memory:d1000000-d1ffffff
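
    The UNCLAIMED line suggests no driver has bound to the VIA chip; two things worth trying from the live session, sketched below, are booting with nomodeset and making sure the openchrome X driver is installed. The package name is the standard Ubuntu one and is offered as an assumption, not a confirmed fix.

      # 1) At the unetbootin menu, press Tab on "Try Ubuntu without installing"
      #    and append the following to the kernel line before booting:
      #       nomodeset
      # 2) Once a desktop (or console) is reachable, confirm the VIA X driver is present:
      sudo apt-get install xserver-xorg-video-openchrome
      grep -i -e chrome -e via /var/log/Xorg.0.log   # did X pick a driver for the chip?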

    Read the article

  • Rendering 8-bit graphics

    - by Matjaz Muhic
    I have a strong programming background, just not in game development. I only made some Pong and Snake clones in high school, and I did some OpenGL in college. I want to make my own game engine - nothing fancy, just a simple 2D game engine. But because I'm kind of old school and feeling retro, I want the graphics to look like old 8-bit games (Mega Man, Contra, Super Mario, ...). So how were the old games made back then? I want the simplest approach. Were they also using assets (images) like newer engines do now? How do you achieve this kind of rendering using OpenGL? Keep in mind: simplest solution. I want to know how it was done back then and how I can replicate that. It doesn't even have to be OpenGL; I can draw on a window canvas. I basically want to make it from scratch.

    Read the article

  • fglrx-legacy-driver not seeing Radeon HD 4650 AGP

    - by Rocket Hazmat
    I am running Debian Squeeze on an old Dell Dimension 8300 box with an AGP Radeon HD 4650 card. I use this machine to mine bitcoins, and today I noticed that the machine had rebooted! My precious uptime! Anyway, my miner wouldn't start, so I figured I might as well update my graphics driver; maybe that would fix the issue. I went to amd.com and downloaded the newest driver (12.6 legacy), but after installing it, aticonfig gave an error:

      aticonfig: No supported adapters detected

    I uninstalled the driver and figured I'd try to install it from apt. AMD has dropped support for the HD 4000 series in fglrx, forcing me to use fglrx-legacy-driver (currently only in experimental). In order to install this, I had to update libc6 (and some other important packages, like gcc) to their wheezy versions. I finally got fglrx-legacy-driver installed, but I still got:

      aticonfig: No supported adapters detected

    Why isn't the driver finding my video card? I have a hunch it has something to do with the fact that it's an AGP card. Here is the output of lspci -v (why does it say Kernel driver in use: fglrx_pci?):

      01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI RV730 Pro AGP [Radeon HD 4600 Series] (prog-if 00 [VGA controller])
          Subsystem: Advanced Micro Devices [AMD] nee ATI Device 0028
          Flags: bus master, 66MHz, medium devsel, latency 64, IRQ 16
          Memory at e0000000 (32-bit, prefetchable) [size=256M]
          I/O ports at de00 [size=256]
          Memory at fe9f0000 (32-bit, non-prefetchable) [size=64K]
          Expansion ROM at fea00000 [disabled] [size=128K]
          Capabilities: [50] Power Management version 3
          Capabilities: [58] AGP version 3.0
          Kernel driver in use: fglrx_pci
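
    Since lspci already reports fglrx_pci as the bound kernel driver while aticonfig sees nothing, a short check of which modules actually loaded, and what they logged, usually narrows things down; a purely diagnostic sketch:

      lsmod | grep -e fglrx -e radeon                  # which kernel module, if any, is actually loaded?
      dmesg | grep -i -e fglrx -e agp                  # driver-side errors, e.g. the AGP bridge not being claimed
      modinfo fglrx | head                             # confirms which fglrx build the kernel would load
      ls /usr/lib/xorg/modules/drivers/ | grep fglrx   # is the X-side fglrx_drv.so installed at all?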

    Read the article

  • Fix for poor HD playback on 11.04 and upwards

    - by mark kirby
    Hi guys, I've seen loads of posts on this site about poor 720p/1080p playback in recent Ubuntu versions. I had this problem and fixed it, so I thought I'd share the fix with everyone:

    1. Install mplayer.
    2. Install the SMPlayer frontend (in the Software Center).
    3. Open SMPlayer.
    4. Go to Options > Preferences > General.
    5. If you have an Nvidia card, set "Output driver" to "vdpau". (For ATI or AMD, choose "xv (0 - ATI Radeon AVIVO video)"; I don't know whether that works, as my card is Nvidia, but it should.)
    6. Go to Performance on the left-hand side and set both the local and streaming cache to 99999 (this may also fix DVD playback if you set that cache as well).
    7. Check the box for "Allow hard frame drop" and set "Loop filter" to "Skip only on HD".
    8. Set the "Threads for decoding" option to the number of cores your CPU has (if you have more than one CPU, add up all the cores for the best performance).
    9. Enjoy your HD movies again on Ubuntu.

    I have a pretty average machine; here is my spec: 2x Pentium 4 HT 3 GHz, stock Dell PSU and motherboard, GeForce 310 with HDMI, 24-inch full HD TV as a monitor. So anyone with a dual-core CPU should have no problem getting this to work. Hope this helps someone out.
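
    The same settings can be tested straight from the command line before touching the GUI; a sketch assuming an Nvidia card with VDPAU-capable drivers and an H.264 file (the filename is a placeholder, and on ATI/AMD you would swap xv in for the -vo value):

      # VDPAU output plus VDPAU decoding for H.264/MPEG-2, a large cache, and frame dropping allowed
      mplayer -vo vdpau -vc ffh264vdpau,ffmpeg12vdpau, -cache 99999 -framedrop movie-1080p.mkv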

    Read the article

  • My CPUs are powered down periodically

    - by mgiammarco
    I am posting here because I am using Ubuntu, but this is probably a hardware problem. Since I bought my new setup with an AMD Athlon(tm) II X4 635 processor, an Asus M4A89TD Pro/USB3 motherboard and ECC RAM, I have had stuttering in videos. I was using Ubuntu 11.10; now I am on Ubuntu 12.10. Looking at the syslog, I have found that periodically (I only notice it with videos, but it happens all the time) this happens:

      Mar 6 23:36:42 virtual1 kernel: [28564.375548] smpboot: CPU 1 is now offline
      Mar 6 23:36:42 virtual1 kernel: [28564.380751] smpboot: CPU 2 is now offline
      Mar 6 23:36:42 virtual1 kernel: [28564.394947] smpboot: CPU 3 is now offline
      Mar 6 23:36:48 virtual1 kernel: [28569.917021] smpboot: Booting Node 0 Processor 1 APIC 0x1
      Mar 6 23:36:48 virtual1 kernel: [28569.928015] LVT offset 0 assigned for vector 0xf9
      Mar 6 23:36:48 virtual1 kernel: [28569.928372] [Firmware Bug]: cpu 1, try to use APIC500 (LVT offset 0) for vector 0x400, but the register is already in use for vector 0xf9 on another cpu
      Mar 6 23:36:48 virtual1 kernel: [28569.928378] perf: IBS APIC setup failed on cpu #1
      Mar 6 23:36:48 virtual1 kernel: [28569.931305] process: Switch to broadcast mode on CPU1
      Mar 6 23:36:48 virtual1 kernel: [28569.934255] smpboot: Booting Node 0 Processor 2 APIC 0x2
      Mar 6 23:36:48 virtual1 kernel: [28569.945554] [Firmware Bug]: cpu 2, try to use APIC500 (LVT offset 0) for vector 0x400, but the register is already in use for vector 0xf9 on another cpu
      Mar 6 23:36:48 virtual1 kernel: [28569.945558] perf: IBS APIC setup failed on cpu #2
      Mar 6 23:36:48 virtual1 kernel: [28569.948124] process: Switch to broadcast mode on CPU2
      Mar 6 23:36:48 virtual1 kernel: [28569.949644] smpboot: Booting Node 0 Processor 3 APIC 0x3
      Mar 6 23:36:48 virtual1 kernel: [28569.960838] [Firmware Bug]: cpu 3, try to use APIC500 (LVT offset 0) for vector 0x400, but the register is already in use for vector 0xf9 on another cpu
      Mar 6 23:36:48 virtual1 kernel: [28569.960840] perf: IBS APIC setup failed on cpu #3
      Mar 6 23:36:48 virtual1 kernel: [28569.962953] process: Switch to broadcast mode on CPU3

    I have: updated the BIOS; tried all (really) the BIOS options; changed the RAM; changed the PSU and the CPU cooler; and tried the 3.8.1 kernel. What can I do now? Please help me! Thanks, Mario
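
    Those "CPU n is now offline" lines are the same messages the kernel prints during suspend or a scripted CPU hot-unplug, so checking what is toggling the online flags is a reasonable first step; a purely diagnostic sketch:

      grep . /sys/devices/system/cpu/cpu*/online                # 1 = online, 0 = offlined
      dmesg | grep -i -e offline -e "Booting Node"              # when do the offline/online cycles happen?
      grep -r offline /etc/pm/ /usr/lib/pm-utils/ 2>/dev/null   # any power-management hook touching CPUs?
      ls /var/log/pm-suspend.log 2>/dev/null && tail /var/log/pm-suspend.log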

    Read the article

  • How can I switch between the HDMI and DVI outputs of my graphics card?

    - by Owen Melbourne
    I've got an Nvidia GTX 560 Ti card with my 2 monitors currently hooked up to its 2 DVI ports. It also has a mini-HDMI port, into which I've plugged an HDMI cable (with a mini adapter) and led it to my TV across the room. I'm hoping to be able to toggle between the HDMI output and the DVI outputs, but I'm not sure how I'd go about this. Could somebody please point me in the right direction? I'm not really worried about having all 3 on at the same time, so that isn't a problem, but if that's possible then I'll do that.

    Read the article

  • Can I get dual monitor support (2xDVI) from my (DVI+HDMI) graphics card?

    - by nray
    It seems that pure 2x DVI dual-head video cards are becoming rare. Most cards feature something like 1x DVI plus 1x HDMI plus 1x VGA or some other interface; the idea seems to be that you can just use an HDMI-to-DVI adapter. One result is that cards are seldom marked "2x DVI" anymore, but does this mean that they support simultaneous output on all interfaces? Are all cards dual-head these days? Take Asus's Nvidia cards, for example: they routinely have 1x DVI plus 1x HDMI instead of 2x DVI. So my question is: are these equivalent to a dual-head DVI card, or is there some detail required for dual-monitor support? I use dual-monitor stretched desktops for digital signage projects.

    Read the article
