Search Results

Search found 10209 results on 409 pages for 'multi monitor'.


  • Can I use Server Monitor with a non-server Mac?

    - by Chris
    I have a computer I use as a simple file and web server. I simply want to be able to monitor load, traffic, memory usage, etc. via Server Monitor. I have also downloaded an app for my iPhone which does the same thing, but it uses the same protocol that Server Monitor does. Is it possible to get Server Monitor to recognize my non-server box as a server so I can monitor this information? For reference, I am running 10.4.11 on this PPC box. Thanks in advance!

    Read the article

  • How to output to S-Video on Windows XP in single-monitor mode (clone display)?

    - by Jephir
    I am using an IBM ThinkPad T30 running Windows XP. It is connected to a TV system via S-Video output. Windows is currently set up in multi-monitor mode with monitor 2 being output to S-Video. However, since there is no preview monitor, I can't interact with anything I drag to monitor 2 as I can't see it. I would rather have the system set up so that there is only one monitor and the output is cloned to the laptop display and S-Video. This allows me to see what I'm doing on the TV system. Is this possible?

    Read the article

  • Can I sleep one of the displays on a dual-monitor setup? [duplicate]

    - by archedpenguin
    This question already has an answer here: "Can I sleep one of the displays on a dual-monitor setup (running Windows 7)?" (4 answers). I want to be able to 'put the display to sleep' on one of my two monitors when it isn't needed, so it doesn't distract me or use unnecessary power. Ideally, the display would be asleep, but the OS would remain in dual-monitor mode, so I could still have a variety of windows open in the sleeping monitor's display space, which would mean I wouldn't have to keep switching between single- and dual-monitor modes. It's the same as "Can I sleep one of the displays on a dual-monitor setup (running Windows 7)?" - I just wasn't sure I could comment on such an old thread. None of the answers there provided a perfect solution and I was wondering if there is now a solution available.

    Read the article

  • How can I automatically switch audio to my speakers when my TV-as-2nd-monitor is not in use?

    - by Michael McGowan
    I have a normal LCD monitor as my primary monitor and an HD LCD television as a 2nd monitor (connected through HDMI). I also have a set of normal speakers for the computer (a Windows 7 machine) that I previously used (before I was using the TV as a 2nd monitor). When I am using the TV as a 2nd monitor, I would like audio to come from it. However, I'm oftentimes using the TV as a TV, in which case I would like the audio from my computer to come from my speakers. Is there any way to accomplish this? It seems that if I have the TV set up as the default audio, then even if I turn the TV off (or, more likely, to the input from my cable box), then the audio still goes through that rather than my speakers. Is there a solution that does not require me to manually change the settings every time I want to switch contexts?

    Read the article

  • What is the process for diagnosing a ViewSonic Monitor? [closed]

    - by Phxvyper
    I have a ViewSonic monitor that has been busted for quite some time. I've been researching it and I still have no clue as to why it doesn't work. It doesn't turn on. Or rather, it will turn on, but it will not display anything. How do I go about diagnosing the monitor? The monitor is a ViewSonic VG2230WM, Model #: VS11422. Fix: I opened up the monitor and replaced all of the burst capacitors. The monitor works beautifully now.

    Read the article

  • Nvidia Drivers on Debian / Lenny (Stable) -> Installation successful -> Monitors gets black

    - by David
    I have successfully installed the proprietary drivers for my Nvidia (GeForce 7300 GT) graphics card on Debian/Lenny. I know it's not the best way to choose for driver installation (see this link: http://wiki.debian.org/NvidiaGraphicsDrivers#non-freedrivers), but the two ways seemed possible for me (nvidia-kernel module compilation). Now the problem is that the monitors go black and the power light starts blinking after I launch the X server. Have a short look at the logs (output truncated from /var/log/Xorg.0.log):

        (II) Setting vga for screen 0.
        (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
        (==) NVIDIA(0): RGB weight 888
        (==) NVIDIA(0): Default visual is TrueColor
        (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
        (**) Jul 28 17:10:11 NVIDIA(0): Enabling RENDER acceleration
        (II) Jul 28 17:10:11 NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
        (II) Jul 28 17:10:11 NVIDIA(0):     enabled.
        (II) Jul 28 17:10:11 NVIDIA(0): NVIDIA GPU GeForce 7300 GT (G73) at PCI:1:0:0 (GPU-0)
        (--) Jul 28 17:10:11 NVIDIA(0): Memory: 262144 kBytes
        (--) Jul 28 17:10:11 NVIDIA(0): VideoBIOS: 05.73.22.25.00
        (II) Jul 28 17:10:11 NVIDIA(0): Detected PCI Express Link width: 16X
        (--) Jul 28 17:10:11 NVIDIA(0): Interlaced video modes are supported on this GPU
        (--) Jul 28 17:10:11 NVIDIA(0): Connected display device(s) on GeForce 7300 GT at PCI:1:0:0:
        (--) Jul 28 17:10:11 NVIDIA(0):     Samsung SyncMaster (CRT-0)
        (--) Jul 28 17:10:11 NVIDIA(0):     Samsung SyncMaster (DFP-0)
        (--) Jul 28 17:10:11 NVIDIA(0): Samsung SyncMaster (CRT-0): 400.0 MHz maximum pixel clock
        (--) Jul 28 17:10:11 NVIDIA(0): Samsung SyncMaster (DFP-0): 165.0 MHz maximum pixel clock
        (--) Jul 28 17:10:11 NVIDIA(0): Samsung SyncMaster (DFP-0): Internal Single Link TMDS
        (II) Jul 28 17:10:11 NVIDIA(0): Assigned Display Device: CRT-0
        (==) Jul 28 17:10:11 NVIDIA(0):
        (==) Jul 28 17:10:11 NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
        (==) Jul 28 17:10:11 NVIDIA(0):     will be used as the requested mode.
        (==) Jul 28 17:10:11 NVIDIA(0):
        (II) Jul 28 17:10:11 NVIDIA(0): Validated modes:
        (II) Jul 28 17:10:11 NVIDIA(0):     "nvidia-auto-select"
        (II) Jul 28 17:10:11 NVIDIA(0): Virtual screen size determined to be 1280 x 1024
        (--) Jul 28 17:10:11 NVIDIA(0): DPI set to (85, 86); computed from "UseEdidDpi" X config
        (--) Jul 28 17:10:11 NVIDIA(0):     option
        (==) Jul 28 17:10:11 NVIDIA(0): Enabling 32-bit ARGB GLX visuals.
        (--) Depth 24 pixmap format is 32 bpp

    Here is the complete /etc/X11/xorg.conf file as generated by nvidia-xconfig:

        # nvidia-xconfig: X configuration file generated by nvidia-xconfig
        # nvidia-xconfig: version 256.35 (buildmeister@builder101) Wed Jun 16 19:25:59 PDT 2010
        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0"
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
        EndSection
        Section "Files"
        EndSection
        Section "Module"
            Load "dbe"
            Load "extmod"
            Load "type1"
            Load "freetype"
            Load "glx"
        EndSection
        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection
        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection
        Section "Monitor"
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "Unknown"
            Hor

    Read the article

  • Triple (3) Monitors under Linux

    - by widgisoft
    I have a 3-monitor setup (each 1680x1050) via an Nvidia NVS440 (2 GPUs, 2 outputs per GPU, totalling 4 outputs); this works fine under Windows XP and 7, but caused considerable headaches under Linux (Ubuntu 9.04). I had previously used an XFX 9600GT and the onboard XFX 9300GS to produce the same result, but the card was noisy and power hungry, and I was hoping that there was some magical switch in the NVS440 that got rid of this annoying problem - turns out the NVS440 is just 2 cards on one physical PCB :-p (I searched the net high and low for people using this card under Linux but found nothing; if anything the card uses less power and is fanless, so I was to benefit from it either way.)

    Anyway, using either setup there were 5 solutions available:

    1. Have 3 separate X instances, all unjoined
    2. Have 3 separate X instances, adjoined by Xinerama
    3. Have 2 separate X instances - one using TwinView, both adjoined by Xinerama
    4. Have 2 separate X instances - one using TwinView, but no Xinerama
    5. Have a single TwinView setup and leave the 3rd screen unplugged :-p

    The 4th option, using 2 separate X instances and TwinView (but no Xinerama), was the best balance in terms of performance and usability, but caused some really annoying issues:

    - You couldn't control (without altering the shortcuts) which screen an application opened onto - and once it was opened you couldn't move it to another screen without opening up a terminal and forcing it to move.
    - Nvidia's overriding or falsifying of Xinerama breaks, and the 2 screens joined by TwinView behave like a single huge screen, causing popups to open in the middle of both screens and maximising of windows to stretch to the width of the first 2 screens.
    - Firefox can only run one instance as the same user, so having multiple Firefox windows requires at least 2 users.

    The second option "feels" like the right option, but OpenGL is basically disabled, and playing any sort of game or even running anything graphical causes a huge performance drop and instability - even trying to run a basic emulator for GBA or Gens just causes the system to fall over. It works just enough to stare at your desktop and do nothing, but as soon as you start doing some work - opening windows, dragging things around, running multiple copies of Firefox - it just really feels slow.

    The last option, only going dual screen, works perfectly and everything performs as required - full GPU acceleration, two logical screen spaces - perfect; just make it work across GPUs like Windows does! :-p

    Anyway, I know RandR was supposed to pick up the slack when it introduced GPU objects of sorts to allow multiple GPUs to be stitched together to create one huge desktop at a much deeper layer than Xinerama. I was wondering if this has now been fixed (I noticed X server 1.7 is out) and whether anyone has got it running successfully?

    Again, my requirements are:

    - One huge desktop to drag any window across
    - Maximising of windows to each screen (as XP does)
    - Running fullscreen apps on the primary screen and disabling the mouse from moving onto the others, or on all 3 stretched

    Finally, as a side note: I am aware of the Matrox triple (and dual) head splitter, but even the price they go for on eBay is more than I can afford atm. My argument: I shouldn't have to buy extra hardware to get something to work on Linux when it's something that's existed in the Windows world for a long time (can you tell I don't get on with X? :-p). If I had the cash I'd have bought the latest version of this box already (the new version finally supports large resolutions, as the displays I have are 1680x1050 each).

    Read the article

  • How to find the process(es) which are hogging the machine

    - by Aaron Digulla
    Scenario: All of a sudden, my computer feels sluggish. The mouse moves but windows take ages to open, etc. uptime says the load is 7.69 and rising. What is the fastest way to find out which process(es) are the cause of the load?

    Now, "top" and similar tools aren't the answer, because they either show CPU or memory usage but not both at the same time. What I need is the single command which I might be able to type as it happens - something that will figure out any of "System is trying to swap 8GB of RAM to disk because process X ...", or "process X seeks all over the disk", or "process X uses 400% CPU". So what I'm looking for is iostat, htop/atop and similar tools rolled into one, with an output like this:

        1235 cp        - Disk thrashing
          87 chrome    - Uses 2 GB of RAM
         137 nfs_bench - Uses 95% of the network bandwidth

    I don't want a tool that gives me some numbers which I can analyze, but a tool that tells me exactly which process causes the current load. Assume that the user in front of the keyboard barely knows how to write "process", but is quickly overwhelmed when it comes to "resident size", "virtual memory" or "process life cycle".

    My argument goes like this: A user notices a problem. There can be thousands of reasons ... well, almost :-) The user wants to know the source of the problem. The current solutions give me lots of numbers, and I need to know what these numbers mean. What I'm looking for is a meta tool. 99% of the data is irrelevant to the problem. So what the tool should do is look for processes which hog some resource and list only those, along with "this process needs a lot of CPU, this produces many IRQs, this process allocates a lot of RAM (and it's still growing)". This will be a relatively short list. It will be much simpler for someone new to this to locate the culprit from this list than from the output of, say, htop, which gives me about 5000 numbers but requires me to fold multi-threaded processes myself (I have 50 lines which say VIRT 2750M but only 16 GB of RAM - the machine ought to swap itself to death, but of course this is a misinterpretation of the data that can happen quickly).
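    As a rough illustration of the kind of "meta tool" described above, here is a minimal Python sketch, assuming the third-party psutil package is installed; the thresholds and output format are arbitrary examples, not an existing tool:

        import time
        import psutil

        CPU_LIMIT = 80.0           # percent of one core (arbitrary threshold)
        RSS_LIMIT = 1 * 1024 ** 3  # 1 GiB of resident memory (arbitrary threshold)

        def report_hogs(sample_seconds=1.0):
            """List only the processes that are currently hogging CPU or RAM."""
            procs = list(psutil.process_iter())
            for p in procs:
                try:
                    p.cpu_percent(None)         # prime the per-process CPU counter
                except psutil.Error:
                    pass
            time.sleep(sample_seconds)          # measure over a short interval
            for p in procs:
                try:
                    name = p.name()
                    cpu = p.cpu_percent(None)
                    rss = p.memory_info().rss
                except psutil.Error:
                    continue                    # process vanished or access denied
                if cpu >= CPU_LIMIT:
                    print("%6d %-20s uses %.0f%% CPU" % (p.pid, name, cpu))
                if rss >= RSS_LIMIT:
                    print("%6d %-20s uses %.0f MB of RAM" % (p.pid, name, rss / 1024 ** 2))

        if __name__ == "__main__":
            report_hogs()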

    Read the article

  • Operating system for visualization app in 6 monitors

    - by Federico
    Hi. I have to plan the development of an application with these major requirements: show different graphical data and animations on 6 monitors, in fullscreen mode. The hardware to be used is a PC with 3 NVIDIA GeForce 9800 GX2 cards. I have some expertise working with OpenGL, but never with more than one monitor. I have some (limited) freedom to choose an operating system for the application. My options are: Windows XP, Windows Vista, Windows 7, Ubuntu 8.04/10.04. I would like to know, if you have some expertise or knowledge in the multi-monitor application development field, what is the recommended operating system for this kind of application? And do I need any software other than the operating system and the NVIDIA drivers to be able to use the 6 monitors in fullscreen, showing different things on each one of them? Any comment/answer will be really appreciated. Thanks in advance! Federico

    Read the article

  • Trouble in ActiveX multi-thread invoke javascript callback routine

    - by code0tt
    Hi everyone. I'm getting some trouble with ActiveX programming in ATL. I'm trying to make an ActiveX control which can async-download files from an HTTP server to a local folder and, after the download, invoke a JavaScript callback function. My solution: run a thread M to monitor download thread D; when D finishes the job, M terminates itself and invokes the IDispatch interface to call the JavaScript function.

    **************** HERE IS MY CODE: ****************

        /* javascript code */
        function download() {
            var xfm = new ActiveXObject("XFileMngr.FileManager.1");
            xfm.download('http://somedomain/somefile', 'localdev:\\folder\localfile', function(msg) { alert(msg); });
        }

        /* C++ code */
        // main routine
        STDMETHODIMP CFileManager::download(BSTR url, BSTR local, VARIANT scriptCallback)
        {
            CString csURL(url);
            CString csLocal(local);
            CAsyncDownload download;
            download.Download(this, csURL, csLocal, scriptCallback);
            return S_OK;
        }

        // parts of CAsyncDownload.h
        typedef struct tagThreadData {
            CAsyncDownload* pThis;
        } THREAD_DATA, *LPTHREAD_DATA;

        class CAsyncDownload : public IBindStatusCallback
        {
        private:
            LPUNKNOWN pcaller;
            CString csRemoteFile;
            CString csLocalFile;
            CComPtr<IDispatch> spCallback;
        public:
            void onDone(HRESULT hr);
            HRESULT Download(LPUNKNOWN caller, CString& csRemote, CString& csLocal, VARIANT callback);
            static DWORD __stdcall ThreadProc(void* param);
        };

        // parts of CAsyncDownload.cpp
        void CAsyncDownload::onDone(HRESULT hr)
        {
            if (spCallback) {
                TRACE(TEXT("invoke callback function\n"));
                CComVariant vParams[1];
                vParams[0] = "callback is working!";
                DISPPARAMS params = { vParams, NULL, 1, 0 };
                HRESULT hr = spCallback->Invoke(0, IID_NULL, LOCALE_USER_DEFAULT, DISPATCH_METHOD, &params, NULL, NULL, NULL);
                if (FAILED(hr)) {
                    CString csBuffer;
                    csBuffer.Format(TEXT("invoke failed, result value: %d \n"), hr);
                    TRACE(csBuffer);
                } else {
                    TRACE(TEXT("invoke was successful\n"));
                }
            }
        }

        HRESULT CAsyncDownload::Download(LPUNKNOWN caller, CString& csRemote, CString& csLocal, VARIANT callback)
        {
            CoInitializeEx(NULL, COINIT_MULTITHREADED);
            csRemoteFile = csRemote;
            csLocalFile = csLocal;
            pcaller = caller;
            switch (callback.vt) {
                case VT_DISPATCH:
                case VT_VARIANT: {
                    spCallback = callback.pdispVal;
                } break;
                default: {
                    spCallback = NULL;
                }
            }
            LPTHREAD_DATA pData = new THREAD_DATA;
            pData->pThis = this;
            // create monitor thread M
            HANDLE hThread = CreateThread(NULL, 0, ThreadProc, (void*)(pData), 0, NULL);
            if (!hThread) {
                delete pData;
                return HRESULT_FROM_WIN32(GetLastError());
            }
            WaitForSingleObject(hThread, INFINITE);
            CloseHandle(hThread);
            CoUninitialize();
            return S_OK;
        }

        DWORD __stdcall CAsyncDownload::ThreadProc(void* param)
        {
            LPTHREAD_DATA pData = (LPTHREAD_DATA)param;
            // here, we will create http download thread D
            // when download job is finish, call onDone method;
            pData->pThis->onDone(S_OK);
            delete pData;
            return 0;
        }

    **************** CODE FINISH ****************

    OK, the above is part of my source code. If I call the onDone method in the sub-thread, I get an OLE error (-2147418113 (8000FFFF), Catastrophic failure). Did I miss something? Please help me figure it out.

    Read the article

  • ATI Radeon HD with Catalyst driver stuck mirroring screens

    - by Mike Axiak
    In 11.10 I replaced my aging Nvidia card with a new Radeon HD 6970 card. The single card has two DVI output ports which I've connected to two monitors. I installed Catalyst version 11.9 and I cannot get multiple monitors set up the way I want. I tried:

        $ sudo amdcccle

    and setting the mode to single desktop, multiple monitors, and whenever I do that Unity crashes and I get back to the login screen. Nothing shows up in the Xorg.*.log files for me to post here. There's only one card so I don't think xinerama would be any help here. Anyone have any ideas?

    EDIT: Here's my xorg.conf file:

        Section "ServerLayout"
            Identifier "aticonfig Layout"
            Screen 0 "aticonfig-Screen[0]-0" 0 0
        EndSection
        Section "Module"
        EndSection
        Section "Monitor"
            Identifier "aticonfig-Monitor[0]-0"
            Option "VendorName" "ATI Proprietary Driver"
            Option "ModelName" "Generic Autodetecting Monitor"
            Option "DPMS" "true"
        EndSection
        Section "Monitor"
            Identifier "0-DFP3"
            Option "VendorName" "ATI Proprietary Driver"
            Option "ModelName" "Generic Autodetecting Monitor"
            Option "DPMS" "true"
            Option "PreferredMode" "1280x1024"
            Option "TargetRefresh" "60"
            Option "Position" "0 0"
            Option "Rotate" "normal"
            Option "Disable" "false"
        EndSection
        Section "Monitor"
            Identifier "0-CRT1"
            Option "VendorName" "ATI Proprietary Driver"
            Option "ModelName" "Generic Autodetecting Monitor"
            Option "DPMS" "true"
            Option "PreferredMode" "1280x1024"
            Option "TargetRefresh" "75"
            Option "Position" "0 0"
            Option "Rotate" "normal"
            Option "Disable" "false"
        EndSection
        Section "Device"
            Identifier "aticonfig-Device[0]-0"
            Driver "fglrx"
            Option "Monitor-DFP3" "0-DFP3"
            Option "Monitor-CRT1" "0-CRT1"
            BusID "PCI:5:0:0"
        EndSection
        Section "Device"
            Identifier "amdcccle-Device[5]-1"
            Driver "fglrx"
            Option "Monitor-DFP3" "0-DFP3"
            BusID "PCI:5:0:0"
            Screen 1
        EndSection
        Section "Screen"
            Identifier "aticonfig-Screen[0]-0"
            Device "aticonfig-Device[0]-0"
            DefaultDepth 24
            SubSection "Display"
            EndSubSection
        EndSection
        Section "Screen"
            Identifier "amdcccle-Screen[5]-1"
            Device "amdcccle-Device[5]-1"
            DefaultDepth 24
            SubSection "Display"
                Viewport 0 0
                Depth 24
            EndSubSection
        EndSection

    Read the article

  • How to handle multi-processing of libraries which already spawn sub-processes?

    - by exhuma
    I am having some trouble coming up with a good solution to limit sub-processes in a script which uses a multi-processed library, when the script itself is also multi-processed. Both the library and the script are modifiable by us. I believe the question is more about design than actual code, but for what it's worth, it's written in Python.

    The goal of the library is to hide implementation details of various internet routers. For that reason, the library has a "Proxy" factory method which takes the IP of a router as a parameter. The factory then probes the device using a set of possible proxies. Usually, there is one proxy which immediately knows that it is able to send commands to this device. All others usually take some time to return (given a timeout). One thought was to simply query the device for an identifier and then select the proper proxy using that, but in order to do so, you would already need to know how to query the device. Abstracting this knowledge is one of the main purposes of the library, so that becomes a bit of a "circular requirement"/deadlock: to connect to a device, you need to know what proxy to use, and to know what proxy to create, you need to connect to a device. So probing the device is - as we can see - the best solution so far, apart from keeping a lookup table somewhere.

    The library currently kills all remaining processes once a valid proxy has been found. And yes, there is always only one good proxy per device. Currently there are about 12 proxies. So if one creates a proxy instance using the factory, 12 sub-processes are spawned. So far, this has been really useful and worked very well. But recently someone else wanted to use this library to "broadcast" a command to all devices. So he took the library and wrote his own multi-processed script. This obviously spawned 12 * n processes, where n is the number of IPs to which he broadcasted. This has given us two problems:

    1. The host on which the command was executed slowed down to a near halt.
    2. Aborting the script with CTRL+C ground the system to a total halt. Not even the hardware console responded anymore! This may be due to some Python strangeness which still needs to be investigated. Maybe related to http://bugs.python.org/issue8296

    The big underlying question is how to design a library which does multi-processing, so that other applications which use this library and want to be multi-processed themselves do not run into system limitations. My first thought was to require a pool to be passed to the library, and execute all tasks in that pool. In that way, the person using the library has control over the usage of system resources. But my gut tells me that there must be a better solution.

    Disclaimer: My experience with multiprocessing is fairly limited. I have implemented a few straightforward solutions which did not require access control to resources, so I have not yet any practical experience with semaphores or mutexes.

    p.s.: In the future, we may have enough information to do this without the probing. But the database which would contain the proper information is not yet operational. Also, the design of multiprocessing a multiprocessed library intrigues me :)
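    A minimal sketch of that "pass in a pool" idea, using Python's multiprocessing; the proxy classes and their responds() probe are hypothetical placeholders, not the real library API:

        from multiprocessing import Pool

        CANDIDATE_PROXIES = []      # would hold the ~12 proxy classes (placeholder)

        def _probe(args):
            """Return the proxy class if it can talk to the device, else None."""
            proxy_cls, ip = args
            return proxy_cls if proxy_cls(ip).responds() else None   # responds() is hypothetical

        def make_proxy(ip, pool):
            """Probe candidates in a pool supplied by the caller, so the calling
            script stays in control of the total number of processes."""
            jobs = [(cls, ip) for cls in CANDIDATE_PROXIES]
            for winner in pool.imap_unordered(_probe, jobs):
                if winner is not None:
                    return winner(ip)
            raise RuntimeError("no proxy could talk to %s" % ip)

        # A broadcasting script then decides how many probes run at once:
        #
        #     with Pool(processes=8) as pool:
        #         proxies = [make_proxy(ip, pool) for ip in device_ips]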

    Read the article

  • Maximizing after moving RDC window between different size monitors

    - by msorens
    My Win7 system has two monitors of different sizes. When I open a Remote Desktop Connection on one monitor set to use full screen, both the RDC window and the remote system's desktop fills the monitor. If I then move the window onto my second monitor (1-Restore Down button to make it movable; 2-Drag window to other monitor; 3-Maximize button to fill monitor) the RDC window fills the monitor, but the remote system's desktop remains the same size it was before. Thus, if I move from the larger to the smaller monitor I have scrollbars to see the whole remote desktop, while if I move from the smaller to the larger monitor the remote desktop occupies only a portion of the monitor. My workaround is to close the RDC window completely then re-establish it on the other monitor. Is there a way to avoid this overhead and just resize the remote desktop to fit?

    Read the article

  • Is there a way to force lubuntu screen resolution?

    - by za1
    I have a Dell Inspiron 4000, 900 MHz Pentium III, 192 MB RAM, 10.1 GB disk running Lubuntu 12.04. When I start the computer and check the display resolution, Monitor Settings claims that the max resolution is 1024x768. I then connected another monitor I had lying around, rebooted, and now the laptop monitor looks normal, and Monitor Settings claims that 1280x1024 is the max resolution (it is), but the other monitor doesn't turn on (black screen). The external monitor is not broken. Is there a way to, without the external monitor, force the laptop monitor resolution to 1280x1024? (The following is another question) - How can you run commands automatically at startup? Specifically "xinput 18 118 0" (without quotes). The first question is important; I can live with typing the second one at every boot. Thanks

    Read the article

  • Laptop Asus P50IJ with Intel 4500M GMA output going to a Dell 1907FP external monitor will not allow

    - by ProfessionalAmateur
    Hello - I just purchased an Asus P50IJ-X2 laptop which has an Intel GMA 4500M video card, running Windows 7. At work I output this laptop to a Dell 1907FP LCD which has a maximum resolution of 1280x1024. No matter what I do, Windows will not allow the laptop to set a resolution higher than 1024x768 on this LCD monitor. I've even gone to the extent of downloading PowerStrip (I'd post a link but I'm new and can only enter 1 URL; if you google for PowerStrip it's the first option) to create a custom driver for my monitor, thinking Windows was having a hard time seeing the available resolutions it would accept. However, PowerStrip read the registry and properly sees the monitor and what it's capable of, so I'm now at a complete loss as to why Windows 7 will not allow me to set/use a 1280x1024 resolution for this external monitor (as my last laptop did running Vista). The Intel documentation (http://software.intel.com/en-us/articles/quick-reference-guide-to-intel-integrated-graphics/) indicates that the GMA 4500M should be able to run up to a 2560x1600 max res. The Dell 1907FP specification states it can run up to a 1280x1024 res. But no matter what, the computer will not allow me to set anything higher than 1024x768. I'm completely baffled, but I would really like to be able to output this laptop at a reasonable resolution; 1024x768 makes me feel like I'm using my mom's computer. Any help would be greatly appreciated! Here are some attached images (I apologize for the links; being new I cannot post images) that should help explain this better:

    Image 1 - From PowerStrip, showing the monitor's max accepted resolution and, at the top right, the max res my PC currently allows. (http://imgur.com/agrno.png)
    Image 2 - My Windows 7 resolution picker. (http://imgur.com/3nv6q.png)
    Image 3 - The 'List all modes' option from Screen Resolution > Advanced Settings > List All Modes. (http://imgur.com/AMREh.png)
    Image 4 - Monitor information from the registry read by PowerStrip; this shows the laptop is able to read the necessary info from the LCD monitor. (http://imgur.com/hUX4D.png)

    Read the article

  • There's a stray current flowing from my monitor through the VGA cable to the PC. Is this safe?

    - by EApubs
    I have two monitors on my machine. One is an old Samsung LCD monitor. Recently, I started to hear a small hum in my speakers (subwoofer) and replaced them. The new one also has the issue, and then I found out that it's a grounding issue. I unplugged the PC's power cord while the monitor was still switched on. When I checked, there was current on the earth pin (ground pin). When I unplug the monitor, there's no current and the speaker is normal. Now I have moved that monitor to my dad's machine and taken his monitor. My question is, is this a big issue? The house's earthing system is working and it's grounding the current. I won't feel it if I touch the machine like in many other cases. But still, is it good to keep that monitor attached to my machine? Can it harm the computer? What should I do?

    Read the article

  • 2560 x 1600 screen resolution not available when a second monitor is attached.

    - by sgmoore
    I am running Windows 7 (64-bit edition) and have a 30" Dell 3007WFP monitor which runs at a screen resolution of 2560 x 1600. This works perfectly until I try to connect a second monitor, and then the screen resolution on the main monitor immediately drops to 1280x800 and I can't change it back up to the correct resolution until I disconnect the second monitor. The graphics card is a Nvidia Quadro FX 370. This has a dual link DVI connector (to which the 30" is connected) and a single link DVI connector. The second monitor can run at 1920x1080 and is connected using a VGA to DVI connector. Note, it does not seem to matter whether the second monitor is running at 1920x1080 or even at 800x600. Windows reports Total Available Graphics Memory: 3839MB Dedicated Video Memory: 256MB System Video Memory: 0MB Shared System Memory: 3583MB Does anyone know if this a limitation with the video card, memory, drivers, connectors or something else? If this is a limitation with the video card, can anyone recommend a PCI Express 16 card that would support at least this setup, but preferably support two 30" monitors both running 2560 x 1600. (I'm not into gaming etc, so it doesn't need to be very powerful)

    Read the article

  • How to make firefox to spellcheck in multiple languages simultaneously?

    - by Vi
    I want it to assume that text may be in a mixture of languages and that words should be looked up in multiple dictionaries. (E.g. everything in en-GB, en-US, ru, be and be-classic should be considered good; everything else should be underlined and corrections from all dictionaries should be offered.) Is there an add-on for "multi-language spell-check"? Alternatively, can I merge all dictionaries into one big combined dictionary?

    Read the article

  • How to limit a process to a single CPU core?

    - by Jonathan
    How do you limit a single-process program run in a Windows environment to run only on a single CPU core on a multi-core machine? Is it the same for a windowed program and a command-line program? UPDATE: Reason for doing this: benchmarking various aspects of programming languages. I need something that works from the very start of the process; therefore @akseli's answer, although great for other cases, doesn't solve my case.
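    One possible sketch, in Python with the third-party psutil package: launch the program and immediately pin it to a single core. Note that the mask is applied just after process creation rather than literally at start; on recent Windows versions, `start /affinity 1 benchmark.exe` in cmd.exe applies the mask at creation time instead (benchmark.exe is a made-up name here):

        import subprocess
        import psutil

        def run_on_one_core(cmd, core=0):
            """Start cmd and restrict it to a single CPU core via psutil."""
            child = subprocess.Popen(cmd)
            psutil.Process(child.pid).cpu_affinity([core])   # pin to the chosen core
            return child.wait()

        if __name__ == "__main__":
            # "benchmark.exe" is a placeholder for whatever is being measured.
            run_on_one_core(["benchmark.exe"], core=0)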

    Read the article

  • WCF Windows Service Monitor and process emails

    - by acadia
    Hello, I need your suggestions for solving this issue. Here is the requirement. We have a Microsoft Exchange server and a service email account, [email protected]. We have scanners all over the company; when a user scans a document, an email is sent to [email protected] with the document as an attachment. Now I need to write a Windows service which monitors that email account and, whenever an email is received, reads the attachment and stores it in the database. My question is: is it possible to do something of this sort? Any suggestions greatly appreciated. Thanks
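    For illustration only, a rough Python sketch of the polling approach (the question itself targets a .NET Windows service): connect to the mailbox, fetch unseen messages, and pull out the attachments. It assumes the Exchange account is reachable over IMAP and uses placeholder host, credentials and storage:

        import email
        import imaplib

        def poll_mailbox(host="mail.example.com", user="scans", password="secret"):
            """Fetch unseen messages and save their attachments (placeholder storage)."""
            imap = imaplib.IMAP4_SSL(host)
            imap.login(user, password)
            imap.select("INBOX")
            _, data = imap.search(None, "UNSEEN")        # only messages not processed yet
            for num in data[0].split():
                _, msg_data = imap.fetch(num, "(RFC822)")
                msg = email.message_from_bytes(msg_data[0][1])
                for part in msg.walk():
                    filename = part.get_filename()
                    if not filename:
                        continue                          # not an attachment
                    payload = part.get_payload(decode=True)
                    with open(filename, "wb") as fh:      # stand-in for the database insert
                        fh.write(payload)
            imap.logout()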

    Read the article

  • Forcing a mixed ISO-8859-1 and UTF-8 multi-line string into UTF-8

    - by knorv
    Consider the following problem: A multi-line string $junk contains some lines which are encoded in UTF-8 and some in ISO-8859-1. I don't know a priori which lines are in which encoding, so heuristics will be needed. I want to turn $junk into pure UTF-8 with proper re-encoding of the ISO-8859-1 lines. Also, in the event of errors in the processing I want to provide a "best effort result" rather than throwing an error. My current attempt looks like this:

        $junk = &force_utf8($junk);

        sub force_utf8 {
            my $input = shift;
            my $output = '';
            foreach my $line (split(/\n/, $input)) {
                if (utf8::valid($line)) {
                    utf8::decode($line);
                }
                $output .= "$line\n";
            }
            return $output;
        }

    While this appears to work I'm certain this is not the optimal solution. How would you improve my force_utf8(...) sub?
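    The same heuristic expressed in Python for comparison: try a strict UTF-8 decode first and fall back to ISO-8859-1 per line (latin-1 decoding never fails, so this is inherently "best effort"); it is a sketch of the technique, not a drop-in replacement for the Perl sub:

        def force_utf8(junk: bytes) -> str:
            """Decode a mixed UTF-8 / ISO-8859-1 byte string line by line."""
            out = []
            for line in junk.split(b"\n"):
                try:
                    out.append(line.decode("utf-8"))
                except UnicodeDecodeError:
                    out.append(line.decode("iso-8859-1"))   # fallback, never raises
            return "\n".join(out)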

    Read the article

  • Way to Remove Invite Limit on FBML Multi-Friend Selector

    - by David
    Hi there, I tried to look through various resources before posting here, but was having a surprisingly difficult time finding an answer to my question. Sorry in advance if I overlooked it. I'm currently trying to add the FBML Multi-Friend Selector to my Facebook page. It has a limit on the number of friends you can invite at a time ("Add up to 20 of your friends by clicking on their pictures below"). From what I've looked through, it sounds like 20 is the max number of friends a user can invite, but then, looking at Mint's page, they have a 22-invite max (http://www.facebook.com/mint?ref=ts). I thought it might be based on the number of page fans, as Mint has 56,000, but that doesn't seem to be the case, as this page only has 256 fans and has a max of 26 friend invites (http://www.facebook.com/tivix?v=app_106437999388442). Therefore, I don't really understand how this system works. Is there a way for me to increase to 26? Unlimited? Thanks for your help!

    Read the article

  • Multi page forms on ASP.NET MVC

    - by Jay
    Hi, I have decided to use ASP.NET MVC to develop multi-page (registration) forms in ASP.NET. There will be two buttons on each page that allow the user to navigate to the previous and next page. When the user navigates back to a page they recently filled out, the data should be displayed to them. I understand ASP.NET MVC should remain stateless, but how should I maintain page information when the user navigates back and forth? Should I:

    - Save the information to a database and retrieve it on each page change?
    - Save the information to the session?
    - Load all the fields and display only what's needed with JavaScript?

    This registration form is going to be used on multiple sites but with different sets of questions (some may be the same). If performance is a main concern, should I avoid generating these forms dynamically? Jay

    Read the article

  • JavaScript Multi-Dimensional Arrays

    - by JasonS
    This wasn't the question I was going to ask but I have unexpectedly run aground with JavaScript arrays. I come from a PHP background and after looking at a few websites I am none the wiser. I am trying to create a multi-dimensional array.

        var photos = new Array;
        var a = 0;
        $("#photos img").each(function(i) {
            photos[a]["url"] = this.src;
            photos[a]["caption"] = this.alt;
            photos[a]["background"] = this.css('background-color');
            a++;
        });

    Error message: photos[a] is undefined. How do I do this? Thanks.

    Read the article
