Search Results

Search found 17287 results on 692 pages for 'card game'.


  • Diagnosing PCI issues

    - by dtsazza
    I'm upgrading a PC for a friend and have run into a problem after replacing the motherboard. I've been assembling custom PCs for the best part of a decade, so I'm comfortable with the basics at the very least. The motherboard, CPU and graphics card were all upgraded at once. Since then the machine POSTs, but neither the PCI wireless card nor the PCI-E graphics card seems to be recognised by the system: no trace of them anywhere in the BIOS, in the POST output, or in Windows. I booted into Linux and ran lspci, which also showed no sign of them. What is the best way to go about diagnosing this? Is it likely or even feasible that the motherboard's PCI bus is simply defective and the board needs to be RMAed? Are there any other common gotchas that might cause these symptoms? For reference, the components in question are:
    CPU: Celeron E1400
    Motherboard: Gigabyte GA-G31M-ES2L
    Graphics card: TBC (a low-end card from a couple of years ago; it worked flawlessly before the motherboard change)
    PCI WNIC: Edimax 7128G
    Thanks in advance for any help.


  • Why does my new GTX 660M's clock drop drastically after a few seconds of running?

    - by trVoldemort
    I bought a Lenovo Y580 laptop a few days ago; this model is equipped with a GTX 660M graphics card. However, game performance has been unbelievably poor out of the box, so I suspected something was wrong with the graphics card. I downloaded GPU-Z and ran a simple test, and was shocked to find the GTX 660M running at a core clock of 135 MHz (it should be at least 835 MHz). Even the integrated Intel HD Graphics 4000 runs at 650 MHz. Further examination showed that for the first few seconds the GTX 660M actually ran at 835 MHz, but the core temperature quickly reached 90+°C and the clock (apparently) dropped automatically to 135 MHz. This is very strange. Does anyone have any idea what's going on here?
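
    If the NVIDIA driver on this machine ships nvidia-smi, one quick way to watch the clock fall as the temperature climbs is the sketch below; this is an assumption on my part, since on GeForce mobile parts some of these query fields can report N/A:

        # Poll the graphics clock and GPU temperature once per second
        nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv -l 1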


  • Regex: Getting content from a URL

    - by farazshuja
    I want to get "the-game" using a regex from URLs like:
    http ://www.somesite.com.domain.webdev.domain.com/en/the-game/another-one/another-one/another-one/
    http ://www.somesite.com.domain.webdev.domain.com/en/the-game/another-one/another-one/
    http ://www.somesite.com.domain.webdev.domain.com/en/the-game/another-one/
    (I put a space after "http" because the forum won't let me post more links.)
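
    A minimal sketch in Python, assuming the target is always the first path segment after /en/ (the urls list is just the examples above with the spaces removed):

        import re

        urls = [
            "http://www.somesite.com.domain.webdev.domain.com/en/the-game/another-one/another-one/another-one/",
            "http://www.somesite.com.domain.webdev.domain.com/en/the-game/another-one/",
        ]

        # Capture the first path segment that follows "/en/"
        pattern = re.compile(r"/en/([^/]+)/")

        for url in urls:
            match = pattern.search(url)
            if match:
                print(match.group(1))  # prints "the-game"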


  • Remove part of the URL with .htaccess

    - by Gabriel Bianconi
    Hello. I've changed some settings on my website, and now I need to redirect from:
    www.plugb.com/home/game/a
    www.plugb.com/home/something/else
    www.plugb.com/home/game/b
    ...
    to:
    www.plugb.com/game/a
    www.plugb.com/something/else
    www.plugb.com/game/b
    ...
    I don't know how to do this with .htaccess. By the way, I'm using CodeIgniter. Thanks in advance.
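
    A minimal sketch, assuming mod_rewrite is enabled and this goes in the document root's .htaccess ahead of CodeIgniter's own rewrite rules:

        RewriteEngine On
        # Permanently redirect /home/<anything> to /<anything>
        RewriteRule ^home/(.*)$ /$1 [R=301,L]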


  • Incorrect logic flow? A function that gets coordinates for a Sudoku game

    - by igor
    This function of mine keeps failing an autograder, and I am trying to figure out whether there is a problem with its logic flow. Any thoughts? Basically: if the row is wrong, "Invalid row" should be printed, clearInput() called, and false returned. When the column (y) is wrong, "Invalid column" should be printed, clearInput() called, and false returned. When both are wrong, only "Invalid row" is to be printed (still calling clearInput() and returning false). When both row and column are correct, no error is printed and true is returned. My function gets through most of the test cases but fails towards the end, and I'm a little lost as to why. Below is a cleaned-up version of the function: the noError flag in the original was dead code (it was only set to false on a path that returned immediately), and the original never checked whether the extraction from cin succeeded at all, which is one plausible cause of the late test-case failures.

        bool getCoords(int &x, int &y)
        {
            char row;
            int col = 0;
            cin >> row >> col;
            bool readOK = !cin.fail();      // a failed read leaves col unusable

            row = toupper(row);
            if (row < 'A' || row > 'I') {   // a bad row wins even if the column is also bad
                cout << "Invalid row" << endl;
                clearInput();
                return false;
            }
            if (!readOK || col < 1 || col > 9) {
                cout << "Invalid column" << endl;
                clearInput();
                return false;
            }
            x = row - 'A';                  // convert to 0-based coordinates
            y = col - 1;
            return true;
        }


  • How to enable OpenGL 2.0 and WebGL on a GMA 3150 on Ubuntu?

    - by mahmoudelbadry
    Hi, I have a Dell Mini 1012 with an Intel N450 processor and a GMA 3150 integrated graphics card, running Ubuntu 10.10. According to Intel's website, the graphics card supports OpenGL 2.0: http://software.intel.com/en-us/arti...ed-graphics/#9 But when I type glxinfo in a terminal, the OpenGL version string reports the following: OpenGL version string: 1.4 Mesa 7.9-devel I installed the latest drivers, but it didn't help. So how can I enable OpenGL 2.0 on this card? Thanks.
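
    As one diagnostic step (a suggestion, not from the original post), it can help to see which renderer the X stack is actually using, since a fallback to software rendering would also cap the reported version:

        # Show the vendor, renderer and version strings reported by GLX
        glxinfo | egrep "OpenGL (vendor|renderer|version)"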


  • How to determine the root cause of a system lockup on Ubuntu 8.04 LTS?

    - by jdt141
    I'm currently working on a project that involves setting up a PC/104 stack running Ubuntu 8.04 LTS. We need to use a PC/104 stack because it's an embedded application, and we're required to use a DeviceNet peripheral card to communicate with other devices. (DeviceNet is just a protocol on top of CAN.) The following hardware is on the stack:
    Kontron MOPSPM104 with a 1 GHz Intel Celeron processor
    ConnectTech FlashDrive/104 4GB Industrial Temp (-40 to +85 °C)
    Woodhead (Molex) PC104DVNIO DeviceNet card
    A run-of-the-mill PC/104 power supply
    The Kontron board offers two serial ports, one VGA out, and two USB ports. The DeviceNet card is an ISA card, so (per the user's guide for the Kontron board) I have manually set the IRQs in the BIOS appropriately and turned off ACPI, both in the BIOS and via the corresponding kernel flag in GRUB. I've installed Ubuntu 8.04 Desktop, 32-bit. The problem is that, from time to time, the entire 104 stack locks up. This only seems to happen in two cases, in both of which we're running GNOME: when our custom application uses the DeviceNet card, or (more frequently) when we're running Firefox, either browsing for information or streaming video from an IP camera as a test. The reason I ask is that I cannot determine the root cause of this lockup. The IRQs appear to be correctly configured, both in the BIOS and as the kernel sees them, and nothing is logged to dmesg. If you could help me determine the root cause of this lockup, I would greatly appreciate it. Thanks.
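
    Since the Kontron board has two serial ports, one way to catch whatever the kernel says at the moment of the hang is to log the console over serial; a lockup that never makes it into dmesg will often still emit an oops on a serial console. A minimal sketch for GRUB legacy on Ubuntu 8.04 (the exact kernel line is whatever your menu.lst already contains):

        # /boot/grub/menu.lst -- append serial console parameters to the existing kernel line:
        #   console=tty0 console=ttyS0,115200n8
        # Then attach a null-modem cable to COM1 and capture the output from a second machine:
        screen /dev/ttyS0 115200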


  • Windows XP does not list WPA wireless networks

    - by Tomalak
    What can be the reason that Windows XP does not show WPA-encrypted wireless networks? The laptop I'm having problems with is an older model (Toshiba Satellite Pro 6100) with a fresh install of Windows XP SP3. The wireless network card in it is an Agere product that lists as "Toshiba Wireless LAN Mini PCI Card". The networks showed up perfectly until I first tried to connect to one (it was set to WPA2). The connection failed (the card supports WPA only); then something must have happened, and Windows now hides these networks. A manually configured WPA setup via Windows' own wizard works; I'm using it right now. The network just won't show up in the list of available networks on its own. I suspect that XP has incorrectly set a flag somewhere saying that this network card does not support WPA. Is there such a flag, and if so, how can I change it back?


  • Is it possible to dedicate the physical screen of a VMware Server machine to a guest VM graphically?

    - by matnagel
    I have VMware Server 2.x running on Ubuntu Server (8.04), so the graphics card and the screen of the physical box are unused (I log in remotely, and the host OS has only the CLI console installed). I wonder whether it is possible to assign this graphics card to a virtual machine directly and use it for that guest's GUI. Or could it work if I added a second graphics card to the machine?


  • Is it safe to have NVidia graphics always on on a Linux laptop, or do I risk overheating?

    - by codeape
    I'm getting a Lenovo T520 with two graphics cards:
    Integrated Intel HD 3000
    Discrete NVIDIA NVS 4200M
    In the BIOS, I can adjust which card(s) to use:
    Integrated only
    Discrete only
    Both (NVIDIA Optimus)
    Since Optimus is not well supported under Linux, I wonder if it is OK to set up the system to use the NVIDIA card all the time. I have read somewhere that a laptop risks overheating if it uses a discrete graphics card all the time. Is this true? Does anyone have any experience to share?


  • Building My First Computer And, Surprise, It Isn't Working

    - by BobbShots
    I've had many years of experience working on and around computers, but this was my first foray into building one completely from scratch. So far that foray has been a disaster. My rig is completely assembled, and on its maiden power-up (plus many power cycles since) I noticed three things:
    1. There were a few beeps from the BIOS POST the first time it powered up, but I wasn't paying full attention to the sequence. Every time after that there have been zero POST beeps, even with all hardware removed except the CPU and motherboard.
    2. No video is sent to the monitor. I run an HDMI cable from my video card to the monitor.
    3. The video card is LOUD. My card is a Sapphire Radeon HD 5870, which is known not only for being a powerhouse but also for being pretty quiet. A few times during my power cycles it ran much quieter, but most of the time it is just super loud.
    Can anyone provide help with any of these issues? My motherboard, CPU, and video card are:
    MB: ASUS P6X58D Premium LGA 1366 Intel X58 SATA 6Gb/s USB 3.0 ATX Intel Motherboard
    CPU: i7 920
    Video Card: Sapphire Radeon HD 5870


  • Does a hidden UIViewController consume any resources (iPhone)?

    - by MrDatabase
    My simple iPhone game has two basic "screens":
    home screen (a UIViewController subclass)
    game screen (a UIWindow with an EAGLLayer where all the OpenGL drawing happens)
    Currently, when the user taps "Play" on the home screen, the UIViewController is simply hidden and the game screen is revealed. When the game is over, the home screen's UIViewController is unhidden. Does the hidden UIViewController consume any resources while it's hidden?


  • I'm using a compatible active DisplayPort to DVI adapter with EyeFinity, why does my monitor still flicker?

    - by Christopher Galpin
    I specifically chose an active DisplayPort-to-DVI adapter for use with EyeFinity, straight from my graphics card vendor's list of confirmed compatible adapters. Yet the screen fails horribly: it blinks on and off constantly, sometimes the graphics go screwy, and the appropriate resolutions aren't available. Sometimes the resolution is available, but only with interlaced refresh rates, and the image bounces up and down. I have to switch the resolution back and forth, again and again, to get it to work correctly, and then it fails again and the process must be repeated the moment the monitor is turned off or I reboot. It's maddening. What is wrong? Is my graphics card supplying insufficient voltage? (Firmware tweaks allegedly help some people, but my card's firmware isn't modifiable.) Could the adapter be defective? Is it not "active" enough for my card, so that I need an expensive powered adapter? Or is this endemic to DisplayPort in general?


  • Windows 7: from GeForce 8800 to three monitors?

    - by lance
    I've got a GeForce 8800 that I'm quite happy with; it drives my two 23" widescreen displays well. Now I've got a 19" standard display that I want to put between the two widescreens. My second PCIe x16 slot is unused (as is the PCI slot below it), and I want to add a card to my Windows 7 x64 system. This 19" display won't be used for gaming, so I don't need anything fancy. Here are two cards I was considering, but I'm wondering if they're bad choices for some reason. If they're both fine choices, which is better, and why? Again, I only need this card to drive the 19" standard display at 1280x1024 in Windows 7 x64; it won't play games.
    NVIDIA: Galaxy 95TFE8HUFEXX GeForce 9500 GT Video Card - 512MB DDR2, PCI Express 2.0
    ATI: ASUS EAH4350 SILENT/DI/51 Radeon HD 4350 Video Card - 512MB DDR2, PCI Express 2.0


  • How can I get multiple video cards to work on Linux?

    - by user17943
    I installed Fedora 12. I have two ATI cards that I used on Windows to run four monitors, and a recurring problem has been getting them both detected in Linux: only my secondary card is picked up. When I manage the displays, it detects the two monitors connected to that card. What are the specific steps I should take to get the second card detected? Supposedly there is a tool called system-config-xfree, but I don't have it and yum can't find it. I also heard it has something to do with editing an xorg.conf file, or something to that effect. I have absolutely no idea how to find the "bus ID" of my cards, look up the horizontal refresh rates, and so on. I would probably have no problem following the documentation and editing the file if I knew a good way to find these values. Someone also suggested installing Linux twice, saving the xorg.conf it generates each time (with a different card each time), and then merging the two by hand. That is like killing a fly with a hammer, though; when I do this again in the future, it would be nice not to take twice as long. Thanks.
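
    For the bus ID part at least, a minimal sketch (the exact device names will differ on your machine):

        # List VGA adapters with their PCI addresses:
        lspci | grep -i vga
        # e.g. "01:00.0 VGA compatible controller: ATI ..."
        # In xorg.conf that address becomes:  BusID "PCI:1:0:0"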


  • Why is the pavucontrol level indicator jumping while nothing plays?

    - by EnterTheLiquidToasterFamily
    The level indicator in the screenshot jumps around even when nothing is playing. The indicator also reasonably represents sound levels when music is playing. I don't have any media servers running or noisy browser tabs open, and no mic is connected. When I turn the volume to maximum in software and on the amp, there is no noise from the speakers at all; played music is loud and not distorted. Hardware: Realtek ALC889 over an optical audio connector to a generic amp. Software: Debian Wheezy with the latest backports kernel 3.14 (the same thing happens on the stock Wheezy 3.2 kernel), Wheezy PulseAudio, an Xfce session, and a custom asound.conf that enables PulseAudio to push sound over the optical port:

        # /etc/asound.conf
        pcm.a52 {
            @args [CARD]
            @args.CARD { type string }
            type rate
            slave {
                pcm {
                    type a52
                    bitrate 448
                    channels 6
                    card $CARD
                }
                rate 48000   # required somehow, otherwise nothing happens in PulseAudio
            }
        }


  • Django model help

    - by dotty
    Does anyone have any clue why this doesn't work as expected? If I use the Python shell and do team.game_set or team.games, it returns an error: AttributeError: 'Team' object has no attribute 'game'. If I create a Game object and call game.home_team, it returns the correct Team object. Here's my model:

        class Team(models.Model):
            name = models.CharField(blank=True, max_length=100)

        class Game(models.Model):
            home_team = models.ForeignKey(Team, related_name="home_team")
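
    For context, a minimal sketch of the likely cause: related_name replaces the default game_set reverse accessor, so with related_name="home_team" the reverse lookup is team.home_team.all(). If the goal is team.games, the fix would be:

        class Game(models.Model):
            # related_name names the reverse accessor on Team,
            # so Team instances gain a .games manager:
            home_team = models.ForeignKey(Team, related_name="games")

        # usage in the shell:
        #   team.games.all()   # all Game rows whose home_team is this team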


  • How to import a package in Eclipse?

    - by Roman
    In one of my directories I have all the .java files belonging to one package ("game"). Now I want to create one .java file which does not belong to this package and which imports the "game" package. If I create a new file and write import game; then Eclipse complains that it does not know what the "game" package means. Can somebody please help me solve this problem?
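
    For reference, a minimal sketch: Java cannot import a package by its bare name; you import the public types inside it (the GameBoard class below is hypothetical), and every file in the package must itself declare package game; at the top.

        // Main.java, outside the "game" package
        import game.*;            // import all public types from the package
        // import game.GameBoard; // or import one hypothetical class explicitly

        public class Main {
            public static void main(String[] args) {
                // GameBoard board = new GameBoard();  // use a type from the package
            }
        }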


  • Failover Internet connection?

    - by ez_brian0
    Hi. In my Linux server I have three network cards: eth0 is connected to ISP1, eth1 is connected to the LAN, and eth3 is connected to ISP2. What I want is to automatically use eth3 as the Internet connection if the eth0 connection fails. How can this be done? Another problem is that my firewall refers to eth0; since the server is doing NAT for the clients, this would break if eth3 is taken into use. How can that be solved?
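
    A minimal sketch of one common approach, assuming iptables MASQUERADE-based NAT; the gateway addresses are placeholders to replace with your ISPs' gateways, and the script would run periodically from cron:

        #!/bin/sh
        GW1=192.0.2.1      # hypothetical ISP1 gateway reached via eth0
        GW2=198.51.100.1   # hypothetical ISP2 gateway reached via eth3

        # If ISP1's gateway stops answering pings sent out of eth0,
        # move the default route and the NAT rule over to eth3.
        if ! ping -c 3 -W 2 -I eth0 "$GW1" > /dev/null 2>&1; then
            ip route replace default via "$GW2" dev eth3
            iptables -t nat -D POSTROUTING -o eth0 -j MASQUERADE 2>/dev/null
            iptables -t nat -A POSTROUTING -o eth3 -j MASQUERADE
        fi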


  • Automatically minimize chat tabs

    - by Xiang Ubao
    I have a very urgent matter and ask for your help. Because we must call fu.init in hideFlashCallback to pause our game, the game pauses automatically when the user opens chat tabs over it on the page. Is there a method in the FB API, or something we can call at the beginning of loading the game, that automatically minimizes the chat tabs, so as to solve our problem? Hope you can reply and help us solve this. Thanks a million.


  • Run 3 monitors on two different video cards?

    - by hullot
    Can I run three monitors on two different video cards? I have one ATI and one NVIDIA card. The ATI has two HDMI connections, and they both work. Both cards are picked up in Windows, one as the ATI and the other as the NVIDIA, though the latter is listed as a VGA controller even though the card only has two DVI ports. One DVI cable goes into that NVIDIA card. Of the three monitors, only the two on the ATI's HDMI outputs are picked up, not the third one connected to the NVIDIA via DVI. How can I run three monitors, then? I assumed I couldn't install both drivers, so I'm unsure what to do. Is this possible? I just want the NVIDIA card to drive the third screen; no gaming on it, nothing. The ATI is picked up as the primary card, so no hurdle there. EDIT: Hm, I just installed the NVIDIA drivers and it picked up the third screen, no problem. Hope there aren't any major conflicts. I will post this as an answer and mark it correct when I'm able; I can't yet as a new user.


  • Writing to an XML file with XmlLite?

    - by Chris
    I have an XML file which holds a set of "game" nodes (which contain details about saved gameplay, as you'd save your game in any console game). All of these are contained within a "games" root node. I'm implementing save functionality for this XML file and want to be able to append or overwrite a "game" node and its child nodes within the "games" root node. How can this be accomplished with xmllite.dll?
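
    For what it's worth, XmlLite is forward-only, so there is no in-place editing of a node: the usual pattern is to read the existing file with IXmlReader and write a new file with IXmlWriter, substituting or appending the "game" node as you go. A minimal sketch of the writing side (error handling omitted; the file name and attribute are hypothetical; link against xmllite.lib and shlwapi.lib):

        #include <windows.h>
        #include <xmllite.h>
        #include <shlwapi.h>   // SHCreateStreamOnFileEx

        int wmain() {
            IStream* stream = nullptr;
            IXmlWriter* writer = nullptr;

            SHCreateStreamOnFileEx(L"saves.xml", STGM_CREATE | STGM_WRITE,
                                   FILE_ATTRIBUTE_NORMAL, TRUE, nullptr, &stream);
            CreateXmlWriter(__uuidof(IXmlWriter), (void**)&writer, nullptr);
            writer->SetOutput(stream);

            writer->WriteStartDocument(XmlStandalone_Omit);
            writer->WriteStartElement(nullptr, L"games", nullptr);      // root node
            writer->WriteStartElement(nullptr, L"game", nullptr);       // one saved game
            writer->WriteAttributeString(nullptr, L"name", nullptr, L"slot1");
            writer->WriteEndElement();                                  // </game>
            writer->WriteEndElement();                                  // </games>
            writer->WriteEndDocument();
            writer->Flush();

            writer->Release();
            stream->Release();
            return 0;
        }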


  • Why is the countdown not starting from the top after restarting the game?

    - by user536213
    I have created an iPhone game. When I pause the game using the pause button, I can quit by clicking the quit button. Now, when I start the game again, the countdown I created using this code: [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(updateTimerFunc) userInfo:nil repeats:YES]; misbehaves. The timer counts from 100 seconds down to zero, but now it counts down in steps of two: 98, 96, 94. If I quit the game and start it again, the step becomes four: 96, 92, and it keeps increasing. What is this issue? Kindly help.
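
    A minimal sketch of the likely fix, assuming the timer is kept in a property (countdownTimer, startCountdown, and quitGame are hypothetical names): each restart schedules a new repeating timer without stopping the old one, so two timers both fire every second (then three, and so on), which matches the 2-, then 4-per-tick drop described. Invalidate the previous timer before scheduling a new one:

        // Hypothetical property: @property (nonatomic, retain) NSTimer *countdownTimer;

        - (void)startCountdown {
            [self.countdownTimer invalidate];   // stop any timer left over from the last game
            self.countdownTimer =
                [NSTimer scheduledTimerWithTimeInterval:1.0
                                                 target:self
                                               selector:@selector(updateTimerFunc)
                                               userInfo:nil
                                                repeats:YES];
        }

        - (void)quitGame {
            [self.countdownTimer invalidate];   // also stop it when the player quits
            self.countdownTimer = nil;
        }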

