Search Results

Search found 1598 results on 64 pages for 'mini dvi'.

  • Apple video adapters: Mini-DVI to DVI + DVI to video?

    - by Ken
    I have:

      - a MacBook (which has Mini-DVI output)
      - an Apple Mini-DVI to DVI adapter (so I can hook up my MacBook to my DVI LCD)
      - a Mac mini (which has DVI output)

    I see that I can get a DVI-to-video (S-Video and composite) adapter from Apple that will let me hook up my Mac mini to my TV set (which has only component, S-Video, and composite -- nothing digital). So far so good.

    Question: Will that same adapter also let me hook my MacBook to my TV? That is, can I hook MacBook - Mini-DVI to DVI - DVI to video - TV set, and see a picture? I know there are digital/analog/integrated variants of DVI, and it's not at all clear to me what pins these things have and what signals they're sending. One website I found suggested that it would work, and another suggested that it wouldn't even physically connect. So I'm looking for, ideally, someone who's actually tried it, or has these adapters sitting around to try. I know I can buy 2 adapters (Mini-DVI to Video, and DVI to Video) to do this, but at $20 a pop I'll avoid that if at all possible.

    Read the article

  • Any reason to prefer video adapter with two DVI ports versus one DVI/one VGA for DVI/VGA optional dual monitors?

    - by Bryce Thomas
    I am looking to buy a new video card to power two identical monitors. The monitors came with both DVI and VGA cables, so I am able to use either. My current video card has two DVI ports on the back, so I have both monitors connected via DVI at present. I have noticed that many modern video cards have a DVI/VGA/HDMI port trio and that cards with two DVI ports seem somewhat more scarce. Essentially, I have more options available to me for purchasing cards with a DVI/VGA/HDMI trio than with a DVI/DVI duo. My question is, are there any sound reasons to go to the extra effort of finding a card with two DVI ports versus simply running one of my monitors through a DVI and one through a VGA on a DVI/VGA/HDMI card? Quality differences? Any variety of image asymmetry? Configuration difficulties (I dual boot Windows and Ubuntu)? Anything else?

    Read the article

  • Dual-headed graphics card choice: DVI & VGA with 512MB _or_ DVI & DVI with 256MB

    - by TimH
    Which would you choose? Some more detail: I can choose between:

      - a dual-headed card with both heads DVI, but only 256MB of memory
      - a dual-headed card with one VGA and one DVI head, with 512MB of memory

    Both monitors are 1600x1200. I'll be doing mostly business app development on the computer, no gameplay or advanced graphics work. It's running Win7 and is a quad-core i5. I'm thinking of going with the 256MB one, just so both displays are DVI and I don't have to shift between sharp & blurry when I look from one screen to the other. But I'm not sure if the additional RAM would be a huge boon for some reason (Win7 GPU acceleration, for example? But with a quad-core, who cares?).

    Read the article

  • Mini DisplayPort -> DVI -> VGA ?

    - by ibz
    I have a Mini DisplayPort to DVI adapter which I use to connect my MacBook to a screen that has DVI input. I also have a screen with VGA only input which I want to be able to use, so instead of buying yet another expensive Apple adapter (the Mini DisplayPort to VGA), I just got a cheap DVI to VGA, so I can do Mini DisplayPort - DVI - VGA. It doesn't seem to work though. The screen just says "no connection". Does anyone know if this is actually supposed to work (and my DVI - VGA is just broken), or is this simply not supported and I need to get the expensive Apple Mini DisplayPort to VGA? Thanks!

    Read the article

  • DVI splitter not working as expected/confusion between DVI-D and -I

    - by Freakishly
    Hey guys, thanks for looking. I have an ATI FirePro™ V3700 in my desktop machine, and I have been running a dual-monitor setup quite effortlessly, thanks to the two DVI ports on the card. I came upon a third monitor, and wanted to extend my desktop to 3 screens, so I purchased a DVI splitter from Amazon. Now, I can only duplicate the second monitor onto the third, not extend it. I've tried all possible combinations of input to no avail. Here's the setup:

      - The ATI FirePro™ V3700 has two Dual-Link DVI-I outputs.
      - The splitter splits a single Dual-Link DVI-I port into two Dual-Link DVI-I outputs.
      - Two of the monitors are NEC E222W, and the third monitor is a Dell 2001FP. Each monitor has one D-Sub and one Dual-Link DVI-D input.
      - Cables going from the video card to the monitors are two Dual-Link DVI-D to the NECs and one Single-Link DVI-D to the Dell.

    Is the problem likely with the DVI-D/DVI-I mismatch? Or is it with the cable on the Dell that is only a Single-Link? The cables are easily replaceable, the monitors not so much. Thanks for your time, I really appreciate it.

    http://www.amd.com/us/products/workstation/graphics/ati-firepro-3d/v3700/Pages/v3700-specs.aspx
    http://www.amazon.com/Cables-Unlimited-DVI-D-Splitter-PCM-2260/product-reviews/B000H09RFM/ref=dp_top_cm_cr_acr_txt?ie=UTF8&showViewpoints=1
    http://www.newegg.com/Product/Product.aspx?Item=N82E16824002495
    http://accessories.us.dell.com/sna/PopupProductDetail.aspx?cs=19&l=en&c=us&sku=320-1578
    (Apologies for the fudged links, I'm new here and they won't let me post more than two :P)

    Read the article

  • Converting DisplayPort and/or HDMI to DVI-D?

    - by Jeff Atwood
    Newer Radeon video cards come with four ports standard:

      - DVI (x2)
      - HDMI
      - DisplayPort

    If I want to run three 24" monitors, all of which are DVI only, from this video card -- is it possible to convert either the HDMI or DisplayPort to DVI? If so, how? And which one is easier/cheaper to convert? I did a little research and it looks like there isn't a simple "dongle" method. I found this DisplayPort to DVI-D Dual Link Adapter but it's $120; almost cheaper to buy a new monitor that supports HDMI or DisplayPort inputs at that point! There's also an HDMI to DVI-D adapter at Monoprice but I'm not sure it will work, either. AnandTech seems to imply that you do need the DisplayPort-to-DVI:

      "The only catch to this specific port layout is that the card still only has enough TMDS transmitters for two ports. So you can use 2x DVI or 1x DVI + HDMI, but not 2x DVI + HDMI. For 3 DVI-derived ports, you will need an active DisplayPort-to-DVI adapter."

    Read the article

  • Mini DisplayPort to DVI-D Dual Link up to 2560x1600

    - by Steinway Wu
    I have a ThinkPad T530 with Mini DP++ and an NVS 5400M, which are said to support 2560x1600 resolution. The display is a Dell 3007WFP, which also supports 2560x1600. But I have never seen any cables or adapters that can connect them with full support for 2560x1600. Do you have any idea? THANK YOU! In addition, I found that mDP++ means I can use a passive adapter, with no need for an active adapter, which is much more expensive. This is the only one that meets my needs, but it is not sold in the US. And this one seems to be OK, but I am not sure, because it says "2.5 Gbps per channel", which may not be enough.
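
    A rough back-of-the-envelope check of that "2.5 Gbps per channel" figure, assuming roughly 60 Hz refresh, 24-bit colour, 8b/10b TMDS encoding, and about 10% blanking overhead (none of which are stated in the adapter listing):

      awk 'BEGIN {
        clk = 2560*1600*60*1.10                 # approximate pixel clock in Hz, including blanking
        print clk/1e6, "MHz pixel clock"         # ~270 MHz, which is why single-link DVI (165 MHz max) cannot carry it
        print clk/2*10/1e9, "Gbps per channel"   # dual link halves the per-link rate; 10 bits per pixel per channel
      }'

    On those assumptions the per-channel rate comes out around 1.35 Gbps, under the quoted 2.5 Gbps, though whether a given passive adapter actually passes both links through is a separate question.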

    Read the article

  • Which type of DVI cable will connect the 9400GT with this display?

    - by Senthil
    I am going to buy a BenQ G2220HD 22" LCD display, and I already have an EVGA GeForce 9400GT. Both support DVI. But the confusion came when I wanted to buy a DVI cable: I am confused by DVI and DVI-D and DVI-I, and then single link and dual link! Can someone tell me whether this cable can be used to connect my 9400GT to my BenQ display? If not, what kind of DVI cable should I buy - DVI, DVI-D, single link, dual link, DVI-I?? Basically, the GeForce docs say it is dual-link DVI capable, and my BenQ display docs say it is DVI-D. What are my options now? What kind of DVI cable should I get? Please don't give me brands, because most of them won't be available here in India. Just tell me the TYPE of DVI cable.

    Read the article

  • DVI-D Splitter Not Working with GeForce 8400gs

    - by jimdrang
    I have a GeForce 8400 GS with a DVI and a VGA port on the back. I was using dual monitors with one VGA and one DVI cable. I wanted both displays to be digital, so I bought a DVI-D splitter, put one DVI cable into each monitor, connected them to the splitter, and plugged the single merged end into the card's DVI port. It will not recognize the second monitor (I'm not even sure how it determined which one was the first monitor). The tech specs state that it supports "Two dual-link DVI outputs for digital flat panel display resolutions up to 2560x1600" (http://www.nvidia.com/object/geforce_8400_tech_specs.html). Do I need a different converter, or is my only option for dual monitors with this card one VGA and one DVI?

    Read the article

  • Dual DVI and VGA Out

    - by Michael
    I want to add a third monitor with one graphics card. I have been told that a DVI-to-dual-DVI adapter will display separate views. Right now I run DVI and VGA out. If I get a DVI-to-dual-DVI adapter, will my VGA output still work? Thanks

    Read the article

  • Why would video stutter on HDMI but not on DVI?

    - by CorvT
    I've got a system running Ubuntu 12.04 with an i3 2120T CPU/GPU. When I play video through mplayer, I notice that when I'm hooked up to a screen via HDMI there is a small stutter (1-2 frames) every few seconds. I don't see this happening when I connect via DVI on the same screen. Resolution and refresh rate are the same for both HDMI and DVI, so I'm not sure where else the problem could be coming from. I've also tried two different screens and different cables. I see the stutter with either an HDMI-HDMI cable, or a DVI-HDMI cable with DVI from the PC and HDMI into the screen. I don't see the stutter with DVI-DVI cables, or when I use an HDMI-DVI cable with HDMI from the PC and DVI into the screen. I've also tried using an AMD 5XXX series card with the open source radeon driver, and saw the same problem. I then tried an nVidia GeForce 210 card with the closed source driver, and the stutter went away. To me this smells like a driver/mesa/glx issue (since the problem went away with the nvidia card/driver), but I have no idea how to track this down.
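
    A minimal way to double-check the "same resolution and refresh rate" claim from the X side (a sketch, assuming the stock xrandr tool is available; output names vary by driver):

      xrandr --query      # the active mode on each connected output is marked with '*'
      xrandr --verbose    # full modelines, including the exact pixel clock and refresh on the HDMI and DVI outputs

    Even a fractional difference in the reported refresh (59.94 Hz vs 60.00 Hz, say) is enough to drop or repeat a frame periodically when playing a fixed-rate source.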

    Read the article

  • DVI+HDMI out, at startup only HDMI is available

    - by Alasjo
    I have my computer next to my HDTV. The main screen is connected via DVI while the TV is connected via HDMI. If I start the computer without the HDMI plugged in, everything is OK: I see the login screen and sound is output through analog out. But if the HDMI is plugged in before I start the computer, only the TV gets an image (the login screen); the main screen is black or sometimes purple, and it stays black even after login. Sound is also still output through analog out. I'm not sure whether it's a hardware issue, an Ubuntu issue, or a combined hardware/Ubuntu compatibility issue (Sandy Bridge). This is my setup:

      - Ubuntu 11.10 Oneiric Ocelot (64-bit)
      - ASUS P8H67-M LE
      - Intel Core i3-2100

    I don't have any custom video settings, and my main screen is recognized properly when HDMI is not plugged in at startup. Cheers.

    Read the article

  • Monitor displays "No VGA signal" "Check DVI cable" after installing motherboard drivers

    - by user1604220
    I bought a computer with all of the needed parts, got everything sorted out, and installed Windows 7, but my CD-ROM drive doesn't fit my motherboard. So I had to download the drivers manually and install them from a USB disk. My motherboard is an ECS Elitegroup H61H2-M2. I downloaded its drivers, installed the BIOS map or something, and then the computer forced a reboot after it was complete. After the reboot my monitor just stopped working and displayed "No VGA signal, No DVI cable". I think I've just installed the wrong driver, but how can I sort it out? This is the driver I installed. The manual tells me to install the drivers; without them it won't see the Internet connection cable. I'm 100% sure there is nothing wrong with the monitor or its cables; it stopped working exactly after the reboot that followed the installation.

    Read the article

  • Dell PR03X port replicator and DisplayPort to DVI adapter not detecting second monitor

    - by yothenberg
    I have a Dell M4400 connected to a PR03X port replicator/docking station. I use the DVI port to connect it to a first Dell 2208WFP monitor, and I'm trying to use a DisplayPort-to-DVI adapter to connect it to a second Dell 2208WFP monitor. The second monitor, connected via the DisplayPort-to-DVI adapter, immediately goes into sleep mode and the laptop doesn't detect it. What is really weird is that it did detect it the first time I plugged it in, but after I unplugged the monitor and plugged it back in, it stopped working. I swapped the monitors round and it detected them both, but after unplugging the monitor connected via the DisplayPort-to-DVI and plugging it in again, it stopped working. Both monitors work if plugged in directly to the DVI port. Is there some way to force re-detection? Any ideas?

    Read the article

  • Does using DVI instead of VGA increase computer monitor resolution?

    - by Jessica M.
    This is for a computer screen. Both my video card and screen have DVI ports on them, but I'm using a VGA cable. I'm wondering if there would be a difference if I switch from VGA to DVI. My current resolution is 1920x1080 with the VGA cord. Will my resolution increase if I switch to DVI? Are there any advantages to switching from VGA to DVI? And if there is a difference, I've read there are 5 different types of DVI cables; how do I know which one to buy?

    Read the article

  • How to use a DVI/HDMI monitor with Mac Mini's DisplayPort?

    - by dr dork
    If I buy a Mac Mini, how can I use it with my Dell monitor? The Dell does not have a DisplayPort, only DVI and HDMI. I was looking at the Mac Minis and noticed they don't have DVI or VGA ports. Does the Mac Mini come with any kind of an adapter that will allow me to use my existing Dell monitor? Thanks in advance for your help, I'm going to start researching this question right now.

    Read the article

  • Can I connect a Playstation 3's HDMI output to my monitor's DVI-D input?

    - by HankJDoomstorm
    I'm attempting to connect my Playstation 3 to my computer monitor. The monitor has a DVI-D (dual link) input, so before distinguishing between the different DVI varieties, I bought a DVI-I (dual link) to HDMI converter that won't fit into the port on the monitor (not only that, there isn't enough physical space in the back of the monitor to fit that much stuff before it hits the bottom of it). So I grabbed a DVI-D (single link) cable and got a female-to-female DVI-I coupler, and plugged the DVI-D cable into the monitor and the whole mess of converters. The end result was HDMI to DVI-D single link, but my monitor isn't receiving a signal on its digital channel. (For clarity's sake: DVI-D DL input on Monitor, DVI-D SL cable, DVI-I DL female-to-female coupler, DVI-I DL to HDMI converter, HDMI output on PS3) I don't know much about this stuff (obviously), but my educated guess is that the bandwidth of the PS3 is too high for the DVI-D Single Link cable, so nothing's getting through. Will replacing the single link cable with dual link resolve this? If not, is it possible at all? Oh, I should mention I'm aware I won't get audio through the monitor. I have an RCA to 3.5mm converter for that.

    Read the article

  • Ubuntu/Nvidia lists DVI dual cable as single

    - by Joseph Mastey
    I have an NVidia Quadro FX 880M graphics card, from which I am trying to drive 2 monitors: my internal laptop monitor (15.6", 1920x1080; the Nvidia driver says it's running via DisplayPort) and an external 27" monitor (Dell U2711, 2560x1440 native resolution, via DVI). I've hooked the dual-link DVI cable to the dual-link DVI port on my dock (Dell PR03X) and installed the proprietary NVidia driver, but I cannot seem to get the full 2560x1440 out of the larger 27" external monitor. Looking at the NVidia driver settings, the monitor's connection is reported as a single-link DVI cable, rather than a dual-link one, which would explain the reduced resolution. Does anyone have any experience with an issue like this? What can I do to make full use of my new monitor?

    (Possibly) relevant information:

      - There is no DVI port on the laptop itself, but one is provided via the dock.
      - The laptop and dock both provide a DisplayPort jack, but I have been unable to get this working with the monitor on either.
      - I did have the nouveau driver installed when I installed the nvidia proprietary driver, but have since removed it (no change in the monitor situation when I removed it).
      - The 27" reports a max resolution of 1680x1050.

    Thanks, Joe
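
    A couple of quick things to look at (only a sketch, not a confirmed fix; the exact log wording varies between driver versions):

      xrandr --query                                    # is 2560x1440 offered on the external output at all?
      grep -i "tmds\|pixel clock" /var/log/Xorg.0.log   # the proprietary driver usually records single- vs dual-link TMDS and the per-output maximum pixel clock

    If the log shows a single-link limit (a 165 MHz maximum pixel clock) for that output, the dock's DVI path, rather than the monitor, is the more likely bottleneck.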

    Read the article

  • RGB to DVI adapter not working for Projector-to-PC

    - by user897052
    We have a wall-mounted projector (Dell 4320) with VGA in, and the other end of the cable is RGB. We have an RGB to DVI-I adapter (with DIP switches) in order to plug the cable into a PC (the video card has 1 VGA and 1 DVI-I port). We had no problems until recently, when the PC's hard drive crashed. After re-imaging the PC, the computer no longer "sees" the projector (the computer doesn't detect the projector as a second monitor anymore). The PC is an older Dell with an after-market video card and runs Windows XP. I also tried it on a new HP box (Windows 7), but had to add a DVI-I to DVI-D adapter. Any ideas?

    Read the article

  • Installing Ubuntu 12.10 on Mac Mini (Late 2009)

    - by Till Lange
    I am trying to install Ubuntu 12.10 on my Mac Mini (Late 2009). This sounds like an often-asked question, and I have already read lots of stuff about it, but now I am confused. While some people say there are serious problems with installing Ubuntu on a Mac Mini, such as components (sound card, graphics card, …) not working very well, others say that it should work perfectly fine. I also figured out that installing Ubuntu on a Mac isn't as easy as it is on a PC (you can't just download the .iso and use UNetbootin, …). So, is it possible to install Ubuntu 12.10 on a Late 2009 Mac Mini so that everything works well?

    Read the article

  • Windows 7 blank DVI screen

    - by user99
    I've just upgraded to Windows 7 Ultimate 64-bit. I had an issue during installation: after setup rebooted, instead of going to the 'Completing installation' screen I just got a blank screen. I eventually(!) figured out that this meant 'unplug your second monitor to proceed'. When I did this, installation completed in a snap and everything runs fine. However, whenever I plug my second monitor into my PC, it gets no signal, and the primary monitor removes all windows, icons, the taskbar and the cursor, and just shows the desktop wallpaper. I'm running an 8800 GTS 512MB with the latest drivers (197 IIRC). The monitors are identical, and both plug into DVI sockets on the graphics card; the only difference is that I connect one using a straight DVI cable and the other using a VGA cable and a VGA-DVI converter. It's the DVI-cabled one that has the issues (if I plug it in by itself it gets no signal). Everything was working fine before I upgraded to Windows 7 (I used to run XP SP2). Anyone have any ideas?

    Read the article

  • How to set up dual head with the "radeon" driver for R770?

    - by user1709408
    I want to make a dual-head setup without xrandr but with Xinerama. I put a "Screen 1" line into xorg.conf, but the card still shows identical output on DVI-2 and DVI-3. It is important for me to use Xinerama (to glue three monitors together), which is why I decided not to use RandR (RandR is incompatible with Xinerama, as I read somewhere).

    Here is my video card (HD 4850 X2):

      lspci | grep R700
      03:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI R700 [Radeon HD 4850]
      04:00.0 Display controller: Advanced Micro Devices [AMD] nee ATI R700 [Radeon HD 4850]

    Here is how the monitors are connected:

      grep "DVI" /var/log/Xorg.0.log
      [ 1210.002] (II) RADEON(0): Output DVI-0 using monitor section Monitor0
      [ 1210.048] (II) RADEON(0): Output DVI-1 has no monitor section
      [ 1210.079] (II) RADEON(0): EDID for output DVI-0
      [ 1210.080] (II) RADEON(0): Printing probed modes for output DVI-0
      [ 1210.128] (II) RADEON(0): EDID for output DVI-1
      [ 1210.128] (II) RADEON(0): Output DVI-0 connected
      [ 1210.128] (II) RADEON(0): Output DVI-1 disconnected
      [ 1210.128] (II) RADEON(0): Output DVI-0 using initial mode 1920x1200
      [ 1210.160] (II) RADEON(1): Output DVI-2 using monitor section Monitor2
      [ 1210.215] (II) RADEON(1): Output DVI-3 has no monitor section
      [ 1210.246] (II) RADEON(1): EDID for output DVI-2
      [ 1210.247] (II) RADEON(1): Printing probed modes for output DVI-2
      [ 1210.299] (II) RADEON(1): EDID for output DVI-3
      [ 1210.300] (II) RADEON(1): Printing probed modes for output DVI-3
      [ 1210.300] (II) RADEON(1): Output DVI-2 connected
      [ 1210.300] (II) RADEON(1): Output DVI-3 connected
      [ 1210.300] (II) RADEON(1): Output DVI-2 using initial mode 1920x1200
      [ 1210.300] (II) RADEON(1): Output DVI-3 using initial mode 1920x1200

    Here is my /etc/X11/xorg.conf:

      Section "ServerFlags"
          Option "RandR" "0"
          Option "Xinerama" "1"
      EndSection

      Section "ServerLayout"
          Identifier "Three Head Layout"
          Screen "MyPrecious0"
          Screen "MyPrecious2" RightOf "MyPrecious0"
          Screen "MyPrecious3" LeftOf "MyPrecious0"
      EndSection

      Section "Screen"
          Identifier "MyPrecious0"
          Monitor "Monitor0"
          Device "Device300"
      EndSection

      Section "Screen"
          Identifier "MyPrecious2"
          Monitor "Monitor2"
          Device "Device400"
      EndSection

      Section "Screen"
          Identifier "MyPrecious3"
          Monitor "Monitor3"
          Device "Device401"
      EndSection

      Section "Device"
          Identifier "Device300"
          BusID "PCI:3:0:0"
          Screen 0
          Driver "radeon"
      EndSection

      Section "Device"
          Identifier "Device400"
          BusID "PCI:4:0:0"
          Screen 0
          Driver "radeon"
      EndSection

      Section "Device"
          Identifier "Device401"
          BusID "PCI:4:0:0"
          Screen 1
          Driver "radeon"
      EndSection

      Section "Monitor"
          Identifier "Monitor0"
      EndSection

      Section "Monitor"
          Identifier "Monitor2"
      EndSection

      Section "Monitor"
          Identifier "Monitor3"
      EndSection

    Things I have tried:

      - Switching to the vesa driver (didn't work for me).
      - Adding options like Option "ZaphodHeads" "DVI-2" and Option "ZaphodHeads" "DVI-3" to the "Device400" and "Device401" sections (this didn't help, because the "ZaphodHeads" option is for RandR, and RandR is disabled by decision).
      - Merging the "Device400" and "Device401" sections into one section and adding Option "ZaphodHeads" "DVI-2,DVI-3" (see the comment about RandR above). The single-section setup changes the log line "RADEON(1): Output DVI-3 has no monitor section" into "RADEON(1): Output DVI-3 using monitor section Monitor3", but nothing was enough to switch from screen cloning to separate screens.

    This problem (lack of documentation on the radeon driver) is similar to these:

      - Radeon display driver clones monitors while using Xinerama (the moderators' decision to close that problem was wrong)
      - Ubuntu 12.10 multi-monitor setup isn't working

    The problem is solvable, because this hardware worked as three-headed for me earlier with gentoo/xorg-server-1.3. Xorg -configure creates a setup for the first monitor on the first GPU. Please don't advise using fglrx/aticonfig/amdcccle (this goes against my religious beliefs).
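
    For reference, here is the per-output ZaphodHeads variant described above, written out against the same Device sections. This is only a sketch of the configuration under discussion, not a confirmed fix, and whether it can coexist with RandR disabled is exactly the open question:

      Section "Device"
          Identifier  "Device400"
          BusID       "PCI:4:0:0"              # second GPU on the X2 card
          Screen      0
          Driver      "radeon"
          Option      "ZaphodHeads" "DVI-2"    # this Device instance drives DVI-2 only
      EndSection

      Section "Device"
          Identifier  "Device401"
          BusID       "PCI:4:0:0"
          Screen      1
          Driver      "radeon"
          Option      "ZaphodHeads" "DVI-3"    # and this one drives DVI-3
      EndSection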

    Read the article

  • DVI vs VGA on Windows 7

    - by Joe Philllips
    I have a 3-monitor setup (each monitor is exactly the same) and two video cards, each with one splitter (one DVI and one VGA). I have two monitors hooked up using DVI, and the third is connected through the VGA connection. I am running Windows 7. If I resize a window from one DVI monitor to the other, it's not a problem; it does so very smoothly. If I resize a window on the VGA monitor, it is extremely choppy. Why? It's not choppy on other machines with only VGA connections. Has anyone else noticed this?

    Read the article
