Search Results

Search found 1584 results on 64 pages for 'punch cards'.


  • Multiple network cards, controlling where my traffic goes

    - by thefinn93
    This is an Ubuntu 12.04 server install. I have multiple network cards, call them eth0 and eth1. eth0 is connected to the internet, and all of my traffic goes through it until eth1 gets plugged in. Then the machine tries to send everything through eth1, which for various and sundry reasons does not go out to the Internet. The only traffic it doesn't send through eth1 is traffic on eth0's subnet. It also will not accept inbound connections on eth0 from outside eth0's subnet. I'd like all outbound traffic to go out eth0, but I'd like incoming connections to either card from any subnet to work.
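
    A common shape for this (a minimal sketch, not from the original post; the addresses are hypothetical, assuming eth0 on 192.168.1.0/24 via gateway 192.168.1.1 and eth1 at 10.0.0.2 on 10.0.0.0/24 via 10.0.0.1) is source-based policy routing, so replies leave by the interface they arrived on while the default route stays on eth0:

        # Keep the machine's default route on eth0 (hypothetical gateway).
        ip route replace default via 192.168.1.1 dev eth0
        # Give eth1 its own routing table (the "100 eth1tab" name/number pair is arbitrary).
        echo "100 eth1tab" >> /etc/iproute2/rt_tables
        ip route add 10.0.0.0/24 dev eth1 src 10.0.0.2 table eth1tab
        ip route add default via 10.0.0.1 table eth1tab
        # Traffic sourced from eth1's address consults eth1tab; everything else uses the main table.
        ip rule add from 10.0.0.2 table eth1tab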

    Read the article

  • Dual 9500GT video cards, one works

    - by bossDub
    I have two video cards, both PCIe 512MB Nvidia 9500GTs. One is an EVGA and the other is another brand. If I put them both in, the one in the top slot will work (either card). The one in the lower slot says "Code 12: not enough resources to use this device". I've tried disabling FireWire and a few other things in an attempt to free up more "resources". Intel DP45SG motherboard. 4GB RAM. 64-bit Vista. Q8200 CPU. I do not have the SLI connector.

    Read the article

  • Dual NVidia graphics cards in Ubuntu / xorg.conf mania

    - by John Zwinck
    I have two NVidia graphics cards:
      - Quadro NVS 295 (PCI Express, dual DisplayPort outputs)
      - GeForce FX 5200 (PCI, DVI and VGA outputs)
    I have three identical monitors, two on DisplayPort and one on DVI. I'm on Ubuntu Hardy (and cannot currently dist-upgrade for separate reasons). I use the "nvidia" driver. What's new is the GeForce card and the third monitor. I currently have the dual DisplayPort monitors working fine. Here are the display-related parts of my xorg.conf:

        Section "ServerLayout"
            Identifier "Default Layout"
            Screen "PCI-Express Screen" 0 0
            # adding this makes X fail to start: Screen "PCI Screen" 0
            Inputdevice "Generic Keyboard"
            Inputdevice "Configured Mouse"
        EndSection

        Section "Module"
            Load "glx" # not sure why/if this is needed
        EndSection

        Section "Monitor"
            Identifier "DELL 2408WFP"
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "NVIDIA Quadro NVS 295"
            Driver "nvidia"
            Option "RenderAccel" "true"
            Screen 0
            BusID "PCI:2:0:0"
        EndSection

        Section "Device"
            Identifier "NVIDIA GeForce FX 5200"
            Driver "nvidia"
            Option "RenderAccel" "true"
            Screen 1
            BusID "PCI:6:4:0"
        EndSection

        Section "Screen"
            Identifier "PCI-Express Screen"
            Device "NVIDIA Quadro NVS 295"
            Monitor "DELL 2408WFP"
            Defaultdepth 24
            Option "TwinView" "True"
            Option "UseEdidFreqs" "True"
            Option "MetaModes" "1920x1200 +0+1200, 1920x1200 +0+0"
        EndSection

        Section "Screen"
            Identifier "PCI Screen"
            Device "NVIDIA GeForce FX 5200"
            Monitor "DELL 2408WFP"
            Defaultdepth 24
            Option "TwinView" "True"
            Option "UseEdidFreqs" "True"
            Option "MetaModes" "1920x1200 +0+0"
        EndSection

    I use nvidia-settings to configure my monitors, and it does not show the second GPU. lspci, though, shows:

        02:00.0 VGA compatible controller: nVidia Corporation Unknown device 06fd
        06:04.0 VGA compatible controller: nVidia Corporation NV34 [GeForce FX 5200]

    which is where I got the BusID settings for the two devices (when I just had one device, I didn't have any BusID listed... and adding the BusID hasn't broken anything). What am I missing? How can I make nvidia-settings show my second GPU so I can then configure its monitor?
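
    For reference, and as an untested sketch rather than a fix from the original post: in ServerLayout a second Screen entry normally carries a screen number and a position relative to the first, e.g.

        Section "ServerLayout"
            Identifier "Default Layout"
            Screen 0 "PCI-Express Screen" 0 0
            Screen 1 "PCI Screen" RightOf "PCI-Express Screen"
            Inputdevice "Generic Keyboard"
            Inputdevice "Configured Mouse"
        EndSection

    The line Screen "PCI Screen" 0 quoted above has a bare trailing 0 where the server expects either an x y pair or a relative position, which could by itself explain X failing to start.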

    Read the article

  • Configuring three monitors with two Radeon X1600/X1650 graphics cards under Ubuntu

    - by cpm
    I have three SyncMaster 932a monitors I want to use with two Radeon X1600/X1650 cards under Linux. I am running X.org X Server 1.6.0, as provided by Ubuntu's Wubi installer. After turning off mirroring, I ended up with this xorg.conf:

        Section "Monitor"
            Identifier "Configured Monitor"
        EndSection

        Section "Screen"
            Identifier "Default Screen"
            Monitor "Configured Monitor"
            Device "Configured Video Device"
            SubSection "Display"
                Virtual 2560 1024
            EndSubSection
        EndSection

        Section "Device"
            Identifier "Configured Video Device"
        EndSection

    The left monitor had a menu bar and a task bar, the center monitor was just desktop, and windows would maximize to the current monitor. The third monitor and second graphics card weren't being used at all. Then I changed my configuration to manually specify each card with its PCI bus:

        Section "ServerLayout"
            Identifier "TheLayout"
            Screen 0 "Radeon Screen 1"
            Screen 1 "Radeon Screen 2" RightOf "Radeon Screen 1"
        EndSection

        Section "Screen"
            Identifier "Radeon Screen 1"
            Monitor "Configured Monitor"
            Device "Radeon the First"
            SubSection "Display"
                Virtual 2560 1024
            EndSubSection
        EndSection

        Section "Screen"
            Identifier "Radeon Screen 2"
            Monitor "Configured Monitor"
            Device "Radeon the Second"
        EndSection

        Section "Device"
            Identifier "Radeon the First"
            Driver "radeon"
            BusID "PCI:1:0:0"
        EndSection

        Section "Device"
            Identifier "Radeon the Second"
            Driver "radeon"
            BusID "PCI:2:0:0"
        EndSection

        Section "Monitor"
            Identifier "Configured Monitor"
        EndSection

    Now both the left and right monitors have task bars and menu bars, windows cannot be dragged from the first two monitors to the third monitor, and maximizing on the left or center monitor fills both monitors. I also tried adding Option "Xinerama" "true" to the ServerLayout section, but then X11 wasn't able to start up. I want to:
      - Allow moving windows across all three monitors.
      - Have maximizing fill only the current monitor.
      - Have menu/task bars on either only the left monitor or all three monitors.
    How can I make this possible?
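
    When Option "Xinerama" kills the server at startup, /var/log/Xorg.0.log normally records the reason; a generic diagnostic, not from the original post:

        # X tags fatal problems with (EE) and warnings with (WW) in its log.
        grep -e '(EE)' -e '(WW)' /var/log/Xorg.0.log

    With this driver and server generation, Xinerama is also the usual mechanism for dragging windows across screens that live on different cards, so that log output is the first thing to chase.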

    Read the article

  • Multiple Video Cards - Stuttering

    - by jstawski
    I have two video cards:
      - XFX PVT84JUDD3 GeForce 8600GT XXX 256MB 128-bit GDDR3 PCI Express x16 SLI Supported Video Card
      - EVGA 256-P1-N399-LX GeForce 6200 256MB 64-bit GDDR2 PCI Video Card
    both running the same set of drivers on Windows 7 64-bit. When I work with 2 monitors connected to the 8600GT card, everything works smoothly. When I connect the third one to the 6200, Windows works well and then all of a sudden the screens turn black for up to 5 minutes. Then it goes back, and at some random interval it goes black again. I can still see the pointer, and I can hit CTRL+ALT+DEL and see the menu to log off, bring up the task manager, etc. I've tried moving the 6200 to another PCI slot and the error persists. I've tried connecting 2 monitors only, one to each card: same problem. I tried swapping them, and mixed and matched the monitors to see if it was a problem with a monitor; my conclusion was that it is not the monitor. The problem also occurred with Vista 64 as well. What could be generating this problem? Can it be the fact that they are different interfaces? Maybe the motherboard? Should I change something in the BIOS? What do you guys think?

    Read the article

  • Draggable cards (touch enumeration) issue

    - by glitch
    I'm trying to let a player tap, drag and release a card from a fanned stack on the screen to a 4x4 field on the board. My cards are instantiated from a custom class that inherits from the UIImageView class. I started with the Touches sample app, and I modified the event handlers for touches to iterate over my player's card hand instead of the 3 squares the sample app allows you to move on screen. Everything works until, that is, I move the card I'm dragging near another card. I'm really drawing a blank here for the logic to get the cards to behave properly. Here's my code:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSUInteger numTaps = [[touches anyObject] tapCount];
            if (numTaps == 1) {  // was "numTaps = 1", an assignment; "==" is the intended comparison
                for (UITouch *touch in touches) {
                    [self dispatchFirstTouchAtPoint:[touch locationInView:self.boardCardView] forEvent:nil];
                }
            }
        }

        - (void)dispatchFirstTouchAtPoint:(CGPoint)touchPoint forEvent:(UIEvent *)event {
            // Hit-test every card in the hand; note this animates *all* cards under the point.
            for (int i = 0; i < 5; i++) {
                UIImageView *touchedCard = boardBuffer[i];
                if (CGRectContainsPoint([touchedCard frame], touchPoint)) {
                    [self animateFirstTouchAtPoint:touchPoint forView:touchedCard];
                }
            }
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            NSUInteger touchCount = 0;
            for (UITouch *touch in touches) {
                [self dispatchTouchEvent:[touch view] toPosition:[touch locationInView:self.boardCardView]];
                touchCount++;
            }
        }

    My questions are: How do I get the touch logic to disallow other cards from being picked up by a dragging finger? Is there any way I can enumerate only the objects that are directly below a player's finger and explicitly disable other objects from responding? Thanks!

    Read the article

  • Operating systems on SD cards

    - by HisDudeness
    I've been getting some wild ideas over the last few days, like putting some operating systems onto SD cards rather than on my hard drive. I'll go into the details now and explain what led me to consider this probably abominable decision. I am on a laptop (which means I have a native SD card reader) currently running a cross-distro setup, with a bunch of Linux systems (placed in dedicated ext4 logical partitions inside a huge extended one) managed by a single GRUB. As of today, my laptop hasn't seen any Windows system even through binoculars. I was thinking about placing the whole OS part of my setup onto a Secure Digital card, to save all of my 500 GB hard drive for documents, music, videos and so on, and to be able to just remove the SD and boot my system on another computer too, as well as having the possibility of booting other systems on mine by just plugging in another SD, without having to keep it constantly in my PC. Also, in the remote case that in the near future I want to boot Windows 8 on it: I've read that it causes major boot incompatibility issues with other systems by requiring a digital signature in order for them to start. By having it on a removable drive, I could just get rid of it when I don't need it and swap its card for the Linux one, leaving no obstacles to their boot.

    Now, my questions are:

    First: I know that, unlike traditional rotating disk drives, integrated-circuit ones have a limited lifespan in terms of cluster rewriting. Is that an obstacle to this kind of usage? I mean, some ultrabooks are using SSDs now; is it the same issue, or are there differences between Solid State Drives and Secure Digital cards in that sense? Does having them store system files that sit in fixed positions (defeating any even-usage of the clusters), constantly being re-read and updated, just wear them out quickly?

    Second: are all motherboards and BIOSes able to boot from SD cards just as they are from USB pen drives (provided the card reader is USB-connected)? Or can bootloaders like GRUB not work when installed on SDs? If they can't, would installing GRUB to the MBR and pointing its boot entry at the SD be a solution? Will that work? Are there any other problems with installing OSs on a Secure Digital?
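
    On the GRUB question, a minimal sketch (not from the original post; it assumes GRUB 2 and the card showing up as /dev/mmcblk0 with its first partition mounted at /mnt/sd, both hypothetical device names):

        sudo mount /dev/mmcblk0p1 /mnt/sd
        # Put GRUB's files on the SD card and its boot code in the card's own MBR.
        # (Older GRUB 2 releases spell the option --root-directory=/mnt/sd instead.)
        sudo grub-install --boot-directory=/mnt/sd/boot /dev/mmcblk0

    Whether the BIOS will then boot it depends on whether it exposes the card reader as a USB mass-storage device at boot time, which is exactly the open question above.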

    Read the article

  • SATA drive problems with two SIL RAID cards

    - by Jon Topper
    I've just put a second SiI 3114 SATARaid card in my home server so that I could add another pair of SATA drives and increase my storage space. Annoyingly, it doesn't seem to work:

        [   32.816030] ata5: lost interrupt (Status 0x0)
        [   32.816072] ata5.00: exception Emask 0x0 SAct 0x0 SErr 0x0 action 0x6 frozen
        [   32.816091] ata5.00: cmd c8/00:08:00:00:00/00:00:00:00:00/e0 tag 0 dma 4096 in
        [   32.816094]          res 40/00:00:00:00:00/00:00:00:00:00/00 Emask 0x4 (timeout)
        [   32.816101] ata5.00: status: { DRDY }
        [   32.816117] ata5: hard resetting link
        [   33.136082] ata5: SATA link down (SStatus 0 SControl 0)
        [   36.060940] irq 18: nobody cared (try booting with the "irqpoll" option)
        [   36.060949] Pid: 0, comm: swapper Not tainted 2.6.31-20-generic #58-Ubuntu
        [   36.060954] Call Trace:
        [   36.060977]  [] ? printk+0x18/0x1c
        [   36.060997]  [] __report_bad_irq+0x27/0x90
        [   36.061005]  [] note_interrupt+0x150/0x190
        [   36.061011]  [] handle_fasteoi_irq+0xac/0xd0
        [   36.061023]  [] handle_irq+0x18/0x30
        [   36.061029]  [] do_IRQ+0x47/0xc0
        [   36.061042]  [] ? irq_exit+0x50/0x70
        [   36.061058]  [] ? smp_apic_timer_interrupt+0x57/0x90
        [   36.061065]  [] common_interrupt+0x30/0x40
        [   36.061075]  [] ? native_safe_halt+0x5/0x10
        [   36.061082]  [] default_idle+0x46/0xd0
        [   36.061088]  [] cpu_idle+0x8c/0xd0
        [   36.061103]  [] rest_init+0x55/0x60
        [   36.061111]  [] start_kernel+0x2e6/0x2ec
        [   36.061117]  [] ? unknown_bootoption+0x0/0x19e
        [   36.061133]  [] i386_start_kernel+0x7c/0x83
        [   36.061137] handlers:
        [   36.061139] [] (sil_interrupt+0x0/0xb0)
        [   36.061151] Disabling IRQ #18
        [   38.136014] ata5: hard resetting link
        [   38.456022] ata5: SATA link down (SStatus 0 SControl 0)
        [   43.456013] ata5: hard resetting link
        [   43.776022] ata5: SATA link down (SStatus 0 SControl 0)
        [   43.776035] ata5.00: disabled
        [   43.776055] ata5.00: device reported invalid CHS sector 0
        [   43.776074] sd 4:0:0:0: [sde] Result: hostbyte=DID_OK driverbyte=DRIVER_SENSE
        [   43.776082] sd 4:0:0:0: [sde] Sense Key : Aborted Command [current] [descriptor]
        [   43.776092] Descriptor sense data with sense descriptors (in hex):
        [   43.776097]         72 0b 00 00 00 00 00 0c 00 0a 80 00 00 00 00 00
        [   43.776112]         00 00 00 00
        [   43.776118] sd 4:0:0:0: [sde] Add. Sense: No additional sense information
        [   43.776127] end_request: I/O error, dev sde, sector 0
        [   43.776136] Buffer I/O error on device sde, logical block 0
        [   43.776170] ata5: EH complete
        [   43.776187] ata5.00: detaching (SCSI 4:0:0:0)

        root@core:~# cat /proc/interrupts
                   CPU0
          0:         47   IO-APIC-edge      timer
          1:          8   IO-APIC-edge      i8042
          6:          3   IO-APIC-edge      floppy
          7:          0   IO-APIC-edge      parport0
          8:          0   IO-APIC-edge      rtc0
          9:          0   IO-APIC-fasteoi   acpi
         14:      53069   IO-APIC-edge      pata_sis
         15:      53004   IO-APIC-edge      pata_sis
         17:     112265   IO-APIC-fasteoi   sata_sil
         18:     200002   IO-APIC-fasteoi   sata_sil, SiS SI7012
         19:     111140   IO-APIC-fasteoi   eth0
         20:          0   IO-APIC-fasteoi   ohci_hcd:usb2
         21:          0   IO-APIC-fasteoi   ohci_hcd:usb3
         23:          0   IO-APIC-fasteoi   ehci_hcd:usb1
        NMI:          0   Non-maskable interrupts
        LOC:    6650492   Local timer interrupts
        SPU:          0   Spurious interrupts
        CNT:          0   Performance counter interrupts
        PND:          0   Performance pending work
        RES:          0   Rescheduling interrupts
        CAL:          0   Function call interrupts
        TLB:          0   TLB shootdowns
        TRM:          0   Thermal event interrupts
        THR:          0   Threshold APIC interrupts
        MCE:          0   Machine check exceptions
        MCP:        160   Machine check polls
        ERR:          0
        MIS:          0

        root@core:~# lspci | grep Raid
        00:09.0 RAID bus controller: Silicon Image, Inc. SiI 3114 [SATALink/SATARaid] Serial ATA Controller (rev 02)
        00:0a.0 RAID bus controller: Silicon Image, Inc. SiI 3114 [SATALink/SATARaid] Serial ATA Controller (rev 02)

        root@core:~# lsb_release -a
        No LSB modules are available.
        Distributor ID: Ubuntu
        Description:    Ubuntu 9.10
        Release:        9.10
        Codename:       karmic

        root@core:~# uname -a
        Linux core.topper.me.uk 2.6.31-20-generic #58-Ubuntu SMP Fri Mar 12 05:23:09 UTC 2010 i686 GNU/Linux

    I've tried a combination of different kernel options (irqpoll, noapic, noacpi, pci=noapic), all to no avail. Does anyone have any bright ideas about how I can go about making this work? Swapping PCI cards around isn't an option as there are only two slots in this motherboard (an ASRock K7S41GX). The BIOS doesn't look to have too much in the way of configuration options regarding IRQ usage. Plan B is to ditch this server completely and buy a new QNAP for these drives to go in, but I was hoping to avoid doing this right now.

    Read the article

  • Windows 7 interfering with smart cards

    - by Dennis
    I have an application that uses the PC/SC API to communicate with smart cards. On Windows 7, I get strange results: the data returned from the cards is inconsistent and invalid with certain commands. If I disable the Smart Card Plug and Play service in group policy then everything works fine. Has anyone experienced anything similar? Is there any way to get the smart card plug and play service to play nice? It would be nice to not have to disable it...
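
    For anyone reproducing the workaround mentioned above (this is not from the original question, and the exact paths are from memory, so verify before relying on them): the group-policy switch lives under Computer Configuration > Administrative Templates > Windows Components > Smart Card, "Turn off Smart Card Plug and Play service", and the same setting can be pushed through the registry:

        :: Policy switch for Smart Card Plug and Play (key and value name assumed from memory).
        reg add HKLM\SOFTWARE\Policies\Microsoft\Windows\ScPnP /v EnableScPnP /t REG_DWORD /d 0 /f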

    Read the article

  • OpenGL/GLSL checking if shader compiled fine on intel cards

    - by clamp
    Hello, I am using this code to check whether my GLSL shader compiled fine:

        glGetObjectParameterivARB(obj, GL_OBJECT_INFO_LOG_LENGTH_ARB, &infologLength);
        if (infologLength > 1) {
            int charsWritten = 0;
            char *const infoLog = new char[infologLength];
            glGetInfoLogARB(obj, infologLength, &charsWritten, infoLog);
            tError(infoLog, false);
            delete[] infoLog;  // new[] must be paired with delete[], not plain delete
        }

    The returned string is empty on NVIDIA and ATI cards, but on Intel cards it contains the string "no errors." Now what is the best way to find out if there are really no errors? Should I just check for this string? Or is there a convention for what glGetInfoLogARB should return?
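
    A common answer here (a sketch, not from the original post) is to test the compile status flag itself and treat the info log purely as diagnostics, since drivers are free to word the log however they like, including Intel's "no errors.":

        GLint compiled = GL_FALSE;
        /* Ask for the boolean compile result rather than parsing the log text. */
        glGetObjectParameterivARB(obj, GL_OBJECT_COMPILE_STATUS_ARB, &compiled);
        if (compiled != GL_TRUE) {
            /* Compilation really failed; now fetch and display the info log. */
        }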

    Read the article

  • CPU Wars Is a Trump-Style Card Game Driven by Chip Stats

    - by Jason Fitzpatrick
    If you’re looking for the geekiest card game around, you’d be hard pressed to beat CPU Wars, a top-trumps card game built around CPU specs. From the game’s designers: CPU Wars is a trump card game built by geeks for geeks. For Volume 1.0 we chose 30 CPUs that we believe had the greatest impact on the desktop history. The game is ideally played by 2 or 3 people. The deck is split between the players and then each player takes a turn and picks a category that they think has the best value. We have chosen the most important specs that could be numerically represented, such as maximum speed achieved and maximum number of transistors. It’s lots of fun, it has a bit of strategy and can be played during a break or over a coffee. If you’re interested, you can pick up a copy for £7.99 (roughly $12.50 USD). Hit up the link below for more information.

    Read the article

  • I need advice on how to best handle an e-commerce situation

    - by Mohamad
    I recently moved to Brazil and started a small subscription-based service company. The payment gateway market is under-developed in Brazil, and implementing a local solution is too expensive for me. My requirements are a payment gateway that will automatically process monthly recurring billing, and that will allow me to manually charge my customers when needed. It would also have to deal with storage and security. I'm leaning towards manually processing payments myself as a restaurant would do, for example, using small swipe machines. Unfortunately, this would require me to store credit card information, and I would rather not, but I feel it's my only option. Can anyone give me advice on how to tackle this problem? Do I have other options? If I decide to store credit card information, what should I keep in mind and how should I go about it? I have moderate skills in programming, and through tenacity I can get most things done. I'm afraid that this might be out of my league, however.

    Read the article

  • Handling & processing credit card payments

    - by Bob Jansen
    I'm working on a program that charges customers on a pay-as-you-go, per-month model. This means that instead of the customers paying their invoices at the start of the month, they will have to pay at the end of the month. In order to secure the payments, I want my customers' credit card information stored so that they can be charged automatically at the end of the month. I do not have the resources, time, or risk tolerance to handle and store my customers' credit card information on my servers, and am looking for a third-party solution. I'm a tad overwhelmed by all the different options and services that are out there and was wondering if anyone with experience has any recommendations and tips. I'm having difficulty finding services that allow me to store my customers' credit card information and charge them automatically. Most of them seem to offer an invoice-styled approach.

    Read the article

  • Video and LAN Cards are not recognized on Intel DG41RQ MB

    - by artyom
    Hello. My 32-bit Debian Etch, with the Etch-n-Half kernel 2.6.24 and the Etch-n-Half video driver xserver-xorg-video-intel 2.2.1-1~etchnhalf2, does not recognize my on-board LAN and video. I can work only with a VESA display and an additional PCI LAN card (which I had wanted to use for a crossover cable). Output of lspci:

        00:00.0 Host bridge: Intel Corporation 4 Series Chipset DRAM Controller (rev 03)
        00:01.0 PCI bridge: Intel Corporation 4 Series Chipset PCI Express Root Port (rev 03)
        00:02.0 VGA compatible controller: Intel Corporation 4 Series Chipset Integrated Graphics Controller (rev 03)
        00:1b.0 Audio device: Intel Corporation 82801G (ICH7 Family) High Definition Audio Controller (rev 01)
        00:1c.0 PCI bridge: Intel Corporation 82801G (ICH7 Family) PCI Express Port 1 (rev 01)
        00:1c.1 PCI bridge: Intel Corporation 82801G (ICH7 Family) PCI Express Port 2 (rev 01)
        00:1d.0 USB Controller: Intel Corporation 82801G (ICH7 Family) USB UHCI Controller #1 (rev 01)
        00:1d.1 USB Controller: Intel Corporation 82801G (ICH7 Family) USB UHCI Controller #2 (rev 01)
        00:1d.2 USB Controller: Intel Corporation 82801G (ICH7 Family) USB UHCI Controller #3 (rev 01)
        00:1d.3 USB Controller: Intel Corporation 82801G (ICH7 Family) USB UHCI Controller #4 (rev 01)
        00:1d.7 USB Controller: Intel Corporation 82801G (ICH7 Family) USB2 EHCI Controller (rev 01)
        00:1e.0 PCI bridge: Intel Corporation 82801 PCI Bridge (rev e1)
        00:1f.0 ISA bridge: Intel Corporation 82801GB/GR (ICH7 Family) LPC Interface Bridge (rev 01)
        00:1f.1 IDE interface: Intel Corporation 82801G (ICH7 Family) IDE Controller (rev 01)
        00:1f.2 IDE interface: Intel Corporation 82801GB/GR/GH (ICH7 Family) SATA IDE Controller (rev 01)
        00:1f.3 SMBus: Intel Corporation 82801G (ICH7 Family) SMBus Controller (rev 01)
        04:01.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL-8139/8139C/8139C+ (rev 10)

    According to the MB spec:
        Video: Intel G41 Express Chipset Graphics and Memory Controller Hub Graphics (GMCH)
        LAN: Realtek 8111D Gigabit Ethernet Controller

    When I run dpkg-reconfigure xserver-xorg, it suggests the i810 driver, but X does not start; it does not start with the intel driver either, only vesa. The on-board LAN I can't even find in lspci... the Realtek entry shown is the external 100MB card. Where do I start from?
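
    Two quick checks (not from the original post, and assuming the Etch pciutils is recent enough to know these options) that narrow down whether this is a driver problem or the hardware not being seen at all:

        # Print numeric vendor:device IDs; a NIC with no driver bound still shows up here.
        lspci -nn | grep -i ethernet
        # Refresh the PCI ID database so newer chips stop showing as "Unknown device".
        update-pciids

    If the on-board Realtek is absent even from lspci -nn, the problem sits below the driver level (BIOS or hardware) rather than in Debian.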

    Read the article

  • Mixing different generations of ATi video cards

    - by middaparka
    I'm heading in the direction of a three-monitor setup and hence suspect I need to go the dual video card route. At the moment, I've a Radeon X1900GT driving two 19" monitors, which I'd like to keep and augment with something like a Radeon 5570, which would drive a 22" display. As such, I'm just wondering how Catalyst will cope with this configuration and whether there are any other issues that I'm likely to encounter. In terms of the obvious, my motherboard has two PCI-E slots and my power supply has two video card power connectors, so I'm hopefully covered from that side of things. Thanks in advance.

    Read the article

  • Copying SD Cards with "LaCie d2 Network"

    - by rjstelling
    The LaCie d2 Network has a feature whereby you can attach a USB drive, press the blue button at the front, and it will copy the drive contents (no computer required). (See this review for more info.) USB 2.0 and eSATA ports are also provided, but these are not designed for extending the d2 Network's storage. Rather, they allow you to connect portable drives for uploading their data to the d2 Network directly. The process is quite slick, too: just plug in a drive and press the big blue button on the front of the unit to trigger an immediate upload. This copies over everything on the external device and seems ideal for camera use. Is it possible to use a microSD or SD card adapter (like the Kingston MobileLite 9-in-1) and copy the contents of the card? I'm assuming the card reader just "looks like" a normal USB flash drive to the computer or, in this case, the LaCie d2 Network. Is this assumption correct? Do you know any reason why this won't work?

    Read the article

  • vSphere 5.5: Binding VMs to HBA cards on the host

    - by red888
    I am setting up a lab and wanted to be sure the following makes sense / is possible:
      - One server running vSphere with two fiber HBAs
      - 2 Windows 2012 Hyper-V VMs, each bound to an HBA
    I'm using vSphere because it supports nested virtualization, but I'm really setting up this lab to test out Hyper-V and live migrations. Will I easily be able to bind each VM to a physical HBA on the host, or are there any caveats I should know about?

    Read the article

  • 3 Monitors 2 different graphic cards

    - by Michael
    I am looking to add a third monitor to my workstation. I currently have 2 monitors connected to a GeForce GT 120. I want to find the cheapest solution for adding another monitor. I have an old ATI X300 lying around and want to know whether I could use it to power the third monitor, and whether it would conflict with my other graphics card (I am not a gamer). What I want to do is have the ATI X300 power 1 monitor via DVI and the GT 120 power 2 monitors via VGA and DVI. Thanks.

    Edit: I'm running Windows 7 64-bit.

    Read the article

  • How to print 4 index cards on a single A4 sheet in Word 2003

    - by Anna
    I have an index card designed in Word. It's fairly complicated, with graphics, borders and a background. The page layout has been set to landscape, with the size set to 4x6. How can I print this, 4 cards per A4 landscape sheet? I cannot for the life of me work it out. The printer always seems to print a single card per A4 sheet, wasting 3/4 of the page. Entering "1,1,1,1" in the Pages field results in 4 sheets being printed. What am I doing wrong?

    Read the article

  • Different graphic cards drivers while booting from external media

    - by goran
    I am booting a certain system of mine into Ubuntu 9.10 from an external HDD. I am satisfied with the setup and it works fine; however, I would like to modify it so that I can choose which graphics card driver to load at boot time. Specifically, I would like to choose between:
      - the nvidia proprietary driver
      - the ati proprietary driver
      - a generic driver
    Currently, if I am using the proprietary drivers, I don't boot into X; I delete xorg.conf, start gdm and reconfigure the system using jockey (for hardware drivers). What would be the steps to make this (semi-)automatic and avoid restarting X? See the sketch below.
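
    One semi-automatic approach (a sketch, not from the original post; the script name and per-driver config names are hypothetical) is to keep one xorg.conf per driver and have an early boot script pick the right one by probing the hardware before gdm starts:

        #!/bin/sh
        # /etc/init.d/pick-xorg-conf (hypothetical): choose xorg.conf for this machine's GPU.
        if lspci | grep -qi 'VGA.*nvidia'; then
            cp /etc/X11/xorg.conf.nvidia /etc/X11/xorg.conf
        elif lspci | grep -qi 'VGA.*\(ati\|amd\|radeon\)'; then
            cp /etc/X11/xorg.conf.ati /etc/X11/xorg.conf
        else
            rm -f /etc/X11/xorg.conf  # with no xorg.conf, X autodetects a generic driver
        fi

    Hooked in with update-rc.d so it runs before the display manager, this avoids the manual delete-and-reconfigure cycle.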

    Read the article

  • Storing Cards and PCI Compliance

    - by Nimbuz
    I'm developing a SaaS service and will be managing payments as a merchant for customers, and since we'll be using multiple payment processors depending on the user's location, the amount and other factors, it's important to store card details. I did some research, and from what I understood all you need is a PCI-compliant host (VPS, dedicated or private cloud) and to get it validated and certified through some provider like TrustWave etc. Is that correct, or am I missing something? Also, it would be great if you could suggest a few (not necessarily cheap, but affordable) PCI-compliant hosts. Many thanks.

    Read the article

  • Windows 7 with two network cards doesn't route traffic

    - by Tomek
    I have a simple task to do: I have Win7 with two NICs. I want to connect another computer (OSX) to the Win7 machine through the second NIC to give it internet access. I already changed the registry. The Win7 interface with 192.168.2.1 has no gateway set (no point in doing that); the OSX interface with 192.168.2.2 has its gateway set to 192.168.2.1. I did not add any routes on Win7; everything seems to be already there. The network on the second NIC is detected as an "undefined network" (probably an effect of having no gateway). I can get internet connectivity from OSX only by enabling network connection sharing on the NIC with 192.168.2.1, but that enables NAT, and I'm interested only in pure routing without NAT (it's a setup for some research). The firewall is off. It seems to me that Win7 refuses to forward packets for some reason. Perhaps the "undefined network" and the NLA service are to blame, although I couldn't find any info about that. Below is an ASCII schematic of my setup:

        internet <-- router(192.168.1.1) <-- (192.168.1.1) WIN7 (192.168.2.1) <-- (192.168.2.2) OSX

    Thanks
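
    For reference (not from the original post): the registry change usually meant here is the IPEnableRouter value, and Windows only starts forwarding after a reboot (or a restart of the Routing and Remote Access service). A quick sketch:

        :: Enable IP forwarding between interfaces (takes effect after a reboot).
        reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters /v IPEnableRouter /t REG_DWORD /d 1 /f
        :: Sanity-check that only the internet-facing interface carries a default gateway.
        route print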

    Read the article

  • Can there be multiple monitors with these video cards and setup

    - by z_Zelman
    I have an Acer Aspire TimelineX 4830TG-6808 laptop running Windows 7 Ultimate, with one VGA port and one HDMI port. It has one Intel HD Graphics family card and one NVIDIA GeForce GT 540M. Right now I have one monitor running via the VGA port, plus the laptop's built-in display. When I look at the monitor setup, I see one unused video card available. Can I add another monitor via the HDMI port? If I cannot run three monitors, is there some way I could make a large "single" desktop that extends over two monitors? For the sake of clarity, maybe something like having a virtual machine window that extends over two monitors. Could this maybe come from third-party software?

    Read the article

  • Why are graphics cards upside-down?

    - by gbjbaanb
    This is something that has always bugged me: when I install a card into a desktop (i.e. mini-tower) case, the fan is always facing down. Surely making the card so that the components and fan are on top would help a lot with cooling, allowing those whiny fans to spin a little slower. I know some card manufacturers tried to mitigate this by adding heat pipes and big heatsinks on the back of the card... but they still put the bits on the same way as everyone else! So, does anyone know why they're all upside-down?

    Read the article
