Search Results

Search found 6912 results on 277 pages for 'assembly resolution'.


  • Combining multiple UIImageViews and preserving resolution

    - by plspl
    I am designing an application where a user places multiple UIImageViews one over another. When the user decides to save this to the photo album, I have to combine all these UIImageViews and save the result to the Photo Library. While combining them I need to preserve their positions, resolutions, and z-order. The approach I tried was to have a parent UIView which acts as a container; the user places all the UIImageViews inside this UIView. While saving, I take a screenshot of the UIView and save it. Although this approach works, it does not preserve the resolution: the final image comes out at the size of the parent UIView (width and height being 300 pixels). Is there a way to preserve the resolution, or at least produce a higher resolution, up to 1024x1024 pixels? Any pointers/code examples would be appreciated!

    Read the article

  • How do I re-set a BMP file's resolution (DPI) indicator?

    - by Joshua Fox
    I have a BMP tagged as 299 DPI resolution. I'd like to change that to 99 DPI. Importantly, the DPI marker in a BMP has no structural meaning: an image has a certain width and height in pixels, and the displaying application can show the image at any width in inches, so the DPI is just a hint. However, I am dealing with some third-party software which behaves differently depending on this marker, so I need to re-set it. I would appreciate suggestions on how to do this programmatically (especially in Java) as well as in GUI graphics tools (e.g. GIMP).
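    For reference, the value lives in the BITMAPINFOHEADER and is stored as pixels per meter rather than DPI. A minimal C++ sketch of patching it in place, assuming the common 40-byte BITMAPINFOHEADER (biXPelsPerMeter at file offset 38, biYPelsPerMeter at 42); the helper name is illustrative, and the same offsets can be rewritten from Java with RandomAccessFile:

        #include <cmath>
        #include <cstdint>
        #include <fstream>

        // Rewrite the pixels-per-meter fields of a BMP in place.
        bool set_bmp_dpi(const char* path, double dpi) {
            std::fstream f(path, std::ios::in | std::ios::out | std::ios::binary);
            if (!f) return false;

            char magic[2];
            f.read(magic, 2);
            if (!f || magic[0] != 'B' || magic[1] != 'M') return false;  // not a BMP

            // The header stores resolution in pixels per meter, not DPI.
            int32_t ppm = static_cast<int32_t>(std::lround(dpi / 0.0254));
            unsigned char le[4] = {                       // little-endian encoding
                static_cast<unsigned char>(ppm & 0xFF),
                static_cast<unsigned char>((ppm >> 8) & 0xFF),
                static_cast<unsigned char>((ppm >> 16) & 0xFF),
                static_cast<unsigned char>((ppm >> 24) & 0xFF),
            };
            f.seekp(38); f.write(reinterpret_cast<char*>(le), 4);  // biXPelsPerMeter
            f.seekp(42); f.write(reinterpret_cast<char*>(le), 4);  // biYPelsPerMeter
            return static_cast<bool>(f);
        }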

    Read the article

  • Sort an array via x86 assembly (embedded in C++)? Possible?

    - by Mark V.
    I am playing around with x86 assembly for the first time and I can't figure out how to sort an array (via insertion sort). I understand the algorithm, but assembly is confusing me, as I primarily use Java and C++. Here's all I have so far:

        int ascending_sort( char arrayOfLetters[], int arraySize )
        {
            char temp;
            __asm{
                push eax
                push ebx
                push ecx
                push edx
                push esi
                push edi
                //// ???
                pop edi
                pop esi
                pop edx
                pop ecx
                pop ebx
                pop eax
            }
        }

    Basically nothing. :( Any ideas? Thanks in advance.
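    For reference, a minimal sketch of one way the body could be filled in, assuming the MSVC 32-bit __asm syntax the skeleton above already uses; the label names are illustrative and the register bookkeeping is kept simple rather than optimal:

        int ascending_sort(char arrayOfLetters[], int arraySize)
        {
            __asm {
                push esi
                push edi
                push ebx
                mov  esi, arrayOfLetters    ; esi = base address of the array
                mov  ecx, 1                 ; i = 1
            outer_loop:
                cmp  ecx, arraySize
                jge  done
                mov  al, [esi + ecx]        ; key = a[i]
                mov  edi, ecx               ; j = i
            inner_loop:
                cmp  edi, 0
                jle  place_key
                mov  bl, [esi + edi - 1]    ; bl = a[j - 1]
                cmp  bl, al
                jle  place_key              ; a[j - 1] <= key, stop shifting
                mov  [esi + edi], bl        ; a[j] = a[j - 1]
                dec  edi
                jmp  inner_loop
            place_key:
                mov  [esi + edi], al        ; a[j] = key
                inc  ecx
                jmp  outer_loop
            done:
                pop  ebx
                pop  edi
                pop  esi
            }
            return 0;   // the declared return type is int, so return something
        }

    The loop itself is the ordinary insertion sort; the only assembly-specific parts are keeping the array's base address in esi and moving bytes through al and bl.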

    Read the article

  • Running a 2048 x 1152 monitor on a Dell Vostro 220s with HD3450 video card

    - by Jon
    I recently bought a new monitor which has a resolution of 2048 x 1152. However, this resolution is not in the options list. I have tried installing ATI Catalyst but to no avail. I am running the monitor over the VGA cable. How can I add the resolution to the settings drop-down?

    Additional info:

      - VGA is through a splitter cable from DVI
      - The outputs on my graphics card are DVI and VGA
      - The monitor only has VGA input

    Read the article

  • Sharp flat panel scaling with Nvidia drivers

    - by Brecht Machiels
    I have a Samsung 226BW flat panel with a 1680x1050 native resolution. As my PC is rather dated (Athlon XP 2600+ and GeForce 6600 GT), I need to run more recent (but still old) games at a lower resolution. Unfortunately, scaling low resolutions to 1680x1050 results in a very blurry image (bilinear scaling). I have created a custom resolution of 840x525 in the Nvidia control panel. Technically, this resolution allows perfect upscaling to 1680x1050 without the need for bilinear interpolation. Unfortunately, the Nvidia driver always seems to do bilinear scaling, again resulting in a blurry image. However, I seem to remember that I did obtain crisp images using this resolution in the past (before a Windows re-install). Maybe only some driver versions support integer upscaling without bilinear filtering? Or perhaps there are other solutions?

    Read the article

  • How to force a resolution in Linux?

    - by hi
    I have an HP LP3065 that requires a dual-link DVI cable to go up to its 2560x1600 resolution. I do not have that cable, but I want to use a 1920x1200 resolution. However, my nVidia Display Settings (and the system display settings) will only go up to 1280x1024, which looks horribly pixelated on the 30". How do I force the 1920x1200 resolution? I tried adding the mode to my xorg.conf file, but it still would not take it. I know my video card can do 1920x1200, since it works on a 24" monitor with the same DVI cable. Here are my specs:

      - Fedora 12
      - Nvidia Quadro NVS 420
      - Intel Xeon E5530 CPU
      - 6 GB memory

    Thanks

    Read the article

  • How to get monitor resolution of 1680x1050 using ATI Radeon HD 2600 PRO in a dual monitor setup

    - by pratikk
    I have a Sapphire ATI Radeon HD 2600 PRO graphics card in my Windows Server 2008 x64 machine. It has two outputs, a VGA and a DVI. I have connected the DVI to my Dell 24" monitor with 1900x1200 resolution and it works 100%. The VGA I have connected to my second monitor, a Samsung 22" with a native resolution of 1680x1050. But the ATI driver and Catalyst Control Center don't show this resolution as an option. If I choose a lower resolution like 1280x1024, it looks really bad and fuzzy. I searched Google and downloaded the PowerStrip tool, which allowed me to create a custom resolution of 1680x1050; this option now shows up in Catalyst Control Center and my second monitor works fine. But I don't want to pay for an application just to choose a display resolution. Why doesn't ATI show me that option by default, even though it has no problem actually driving the display at that resolution? Is there a way to get the 1680x1050 resolution using ATI drivers only?

    Read the article

  • Monitors - inches vs resolution

    - by Vnuk
    I'm currently moving from five years of living only on a laptop to a desktop setup. While browsing for monitors I've noticed something strange. On my laptop I have 1920x1200 on 17". To get the same resolution on a monitor I have to get a Dell U2410 24" or a Samsung SM2443NW 24". I do not need (or want) 7 more inches of screen, I just want the 1920x1200 resolution. Why is this setup (high resolution on fewer inches of screen) available on a laptop but not on a regular monitor? I'm setting this as a community wiki because I think there is no right answer here...

    Read the article

  • How to eliminate black bars from PowerPoint videos?

    - by appu
    Hi, I am running a digital signage system for my client. The basic installation is a vertically oriented 42" LCD TV with a 1920x1080 panel (used in portrait, i.e. the reverse of the normal landscape setup). Please check out the following link for the basic screen-division layout I want to set up: http://flickr.com/photos/55097319@N03/5410208856. In the division labeled "ppt" I plan to run a PowerPoint presentation. That screen division is 360x1476 pixels. Since there isn't an option in PowerPoint to specify slide size in terms of resolution, I followed this Indezine article, http://indezine.com/products/powerpoint/books/perfectmedicalpres02.html, and divided 360 and 1476 each by 72, which gives me 5" x 20.5" as the slide size for my presentation. After setting up the slide size with those dimensions, I used Sizer (http://brianapps.net) to resize my PowerPoint window to 360x1476 so that the recording would not have any black bars. But after starting the recording there are side black bars visible, which Camtasia records and brings into Camtasia Studio with the black bars: http://www.flickr.com/photos/55097319@N03/5409597049. My question is: after doing the above, and as explained in the following video link, why do I still get black bars? http://feedback.techsmith.com/techsmith/topics/eliminate_black_bars_in_your_powerpoint_recordings Is there an option in Camtasia to stretch the recording to cover up the black bars, or any other way I can get rid of them while recording the presentation at my preferred dimensions?

    Notes:

      - The TechSmith video above asks me to adjust my desktop screen resolution, which my display chipset does not allow me to set.
      - The PowerPoint show is set up to be "browsed by an individual (window)".
      - The signage software only supports SWFs and video file formats natively, not ppt, pptx, etc.

    Thanks, bhavani.

    Read the article

  • Secondary Monitor won't Display at Full Resolution with Primary Display Disconnected

    - by Laramie
    I have had a bad day. Among today's events, the LCD display on my Dell E6500 notebook failed. It had been working with a secondary monitor connected through a VGA cable at 1600x900 resolution until I removed the LCD from my notebook for repair. Now the maximum resolution available on the secondary display is 1024x768. If I reconnect the burned-out primary display, the secondary monitor is again available at 1600x900. The video card is an onboard Mobile Intel 4 Series. How can I keep the secondary display at 1600x900 with the primary display disconnected (for repair)?

    Edit: I just noticed that when Windows boots, it displays at the correct resolution and I can see my desktop in all its glory for a couple of seconds before it reverts to 1024x768.

    Read the article

  • RDP for High DPI Monitors?

    - by Joey
    A client is having some problems with their laptop. They use RDP to remote into their work PC, but the laptop is a small 13" Sony Vaio with a 1920x1080 resolution. Everything is pretty small on the laptop anyway, but the problem is much worse after connecting with RDP, where everything is almost unreadable. I have done the obvious: changing the resolution on the server, the RDP size, forced scaling on the terminal server, etc., but nothing has worked. Something else I would normally do is change the laptop resolution to something a little lower, but the laptop only has two resolution settings: the big native one and 1024x768 (the wrong aspect ratio). Any ideas?

    Read the article

  • Display Resolution Getting Changed Automatically (sometimes) - Win7x64 - ASUS M3A78-EM

    - by kamleshrao
    Hi guys,

      - AMD PC, motherboard: ASUS M3A78-EM (installed VGA drivers from the ASUS website - AMD_VGA_V863200_XPVistaWin7.zip)
      - It has built-in VGA and HDMI ports (ATI Radeon HD 3200 Graphics, 256 MB)
      - VGA is connected to my ViewSonic monitor; HDMI is connected to an LG LCD TV
      - OS: Windows 7 x64 Ultimate

    Sometimes, when I reboot/start my PC, my ViewSonic display resolution changes and the recommended value (1440x900) does not show in Resolution Settings. The HDMI display always works well. But after multiple reboots/starts, the problem fixes itself and I get my recommended resolution back on the ViewSonic display. I am not making any changes to the existing drivers. It sometimes works well and sometimes doesn't. Can someone assist me in resolving this permanently?

    Read the article

  • Wrong screen resolution with HDMI output to HDTV

    - by bruno077
    I own an HDTV with a native resolution of 1360x768. Sometimes I plug in my laptop to watch movies, and I used to do it with a VGA cable; I've had no problem setting the laptop to the TV's native resolution (1360x768) over VGA. Having acquired an HDMI cable recently, I can't go past 1280x720, and if I do, the TV displays "invalid format". At 1280x720 the image is viewable but not centered, so for example I can't see the Windows taskbar or title bars. I've tried both cables (VGA and HDMI) with the same settings: the laptop's display turned off and the TV's native resolution selected. What could I try to fix this issue? Could it be a faulty cable? (I'm using a ThinkPad with Windows 7.)

    EDIT: My graphics adapter is listed as Mobile Intel(R) 4 Series Express Chipset Family (searching for my laptop's model yields Intel GMA 4500MHD as the graphics card). I'm downloading a driver update from Intel's website and will report my progress when I check its behavior tomorrow.

    Read the article

  • Virtual PC 2007 Full Screen Resolution

    - by Swami
    I have a laptop (1440 x 900) and an external monitor (1920 x 1200), and I'm running Virtual PC 2007. Initially, my VPC could not switch to fullscreen mode on my external monitor because the maximum resolution for VPC was only 1600 x 1200. I installed a hotfix (http://support.microsoft.com/kb/958162), and after the hotfix I was able to view VPC in fullscreen mode on my external monitor, but now I'm unable to go fullscreen on my laptop. It says "please check that the resolution of the guest is not higher than that of the host". I even tried to decrease the resolution of my Virtual PC to less than that of my laptop, but it always resets itself back up. So now, after the hotfix, I can use fullscreen mode only when my external monitor is plugged in. Any way I can resolve this?

    Read the article

  • Monitor does not work past a certain resolution

    - by Steve Stifler
    I have a MacBook 5.1 with Snow Leopard, and a Samsung SyncMaster 2253BW that I use as an external monitor. I connect it to my laptop via a DVI-to-Mini DisplayPort adapter. Whenever I try to set the resolution of the monitor to 1680x1050, its maximum resolution (at least the only one that fills the entire screen), the screen flickers once or twice before claiming there is no source attached. I've had this issue since the first time I tried to connect the two; however, for some reason, for a brief period the monitor would successfully display at full resolution. All lower resolutions work fine too; it's just at 1680x1050 that the problems occur. Any ideas as to why this is happening? Thanks.

    Read the article

  • VirtualBox: stretch to fit resolution

    - by Scarface
    Hey guys, I have a really annoying problem that I hope someone has figured out. I just installed Ubuntu in VirtualBox and installed the Guest Additions, so everything was great: I had a resolution that stretched across my screen from left to right, and the only VirtualBox components that were visible were the Windows Vista title bar (minimize/maximize/close buttons) and the VirtualBox controls at the bottom. Now, all of a sudden, after installing Ubuntu's 170 MB of automatic updates, I see vertical and horizontal scroll bars that are part of VirtualBox, and the Ubuntu resolution will not stretch across my screen anymore. What I want is an Ubuntu resolution that stretches to fit the maximized VirtualBox window, with no scroll bars. If anyone has any ideas, I would really appreciate it.

    Read the article

  • Mac webcam photo application with access to camera settings (resolution, camera selection, color balance, focus)

    - by Pascal T.
    Does anyone know of a webcam photo application (i.e. an alternative to Photo Booth) which would allow changing the settings on the camera, such as:

      - Select the camera (I want to use an external webcam)
      - Change the camera resolution
      - Change camera settings (autofocus, aperture, color balance, etc.)

    I did a lot of research on the internet with no success. I am looking for a very simple app (such as wmcap.exe on Windows). What I have tried so far:

      - Photo Booth: works with an external camera, but there is no way to change the resolution or the color/focus settings
      - ManyCam: a virtual webcam driver; you can add special effects to your camera and pass those effects to any app, but not change your camera settings
      - iGlasses: lets you change the camera settings inside Photo Booth and other apps, but you cannot control the focus or the video resolution
      - macam: did not work on my MacBook Pro

    Does anyone know better than me? Note: my only solution now is to launch a virtual machine (with Parallels Desktop) and take the pictures from there!

    Read the article

  • Win 8 start screen resolution

    - by Abhijith
    My screen resolution is 1280x1024, running Win 8 RP. I formatted my computer and reinstalled Win 8 CP because I had too many BSODs. When I installed Win 8 CP and created a local account, I had 5 (or 6) tiles per column. But once I switched to the Microsoft account to get my synced wallpaper and lock screen, the Start screen resolution changed and I got a maximum of 3 tiles per column. The size of all Metro apps, including the Settings app, changed and became awkwardly bigger. Is there a way to get back 5 tiles per column, essentially changing the resolution of the Start screen?

    Read the article

  • Issues with new graphics card and drivers

    - by Ortund
    With my onboard graphics (Intel HD 4000) I was able to use a 1680x1050 screen resolution. Using the same screen and an Asus ENGTS250 graphics card with the 296.10 drivers installed, I'm not able to use that resolution anymore. I've also been having other trouble with driver installations: using 340.52 or 337.88, I can't boot into Windows 7. I see the Windows Welcome screen, then the screen goes black, loses signal, and the computer locks up. I think the driver issues and the resolution problem go hand in hand. I have a feeling a newer driver than 296.10 would give me access to the 1680x1050 resolution I want, but I'm terrified that if I update or install another driver the computer will lock up again. Can anyone recommend a [better] driver for me to use?

    Read the article

  • In C++, what is the scope resolution ("order of precedence") for shadowed variable names?

    - by Emile Cormier
    In C++, what is the scope resolution ("order of precedence") for shadowed variable names? I can't seem to find a concise answer online. For example:

        #include <iostream>

        int shadowed = 1;

        struct Foo
        {
            Foo() : shadowed(2) {}

            void bar(int shadowed = 3)
            {
                std::cout << shadowed << std::endl; // What does this output?
                {
                    int shadowed = 4;
                    std::cout << shadowed << std::endl; // What does this output?
                }
            }

            int shadowed;
        };

        int main()
        {
            Foo().bar();
        }

    I can't think of any other scopes where a variable might conflict. Please let me know if I missed one. What is the order of priority for all four shadow variables when inside the bar member function?
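    For reference, a small sketch (separate from the question's code) of the general rule: an unqualified name resolves to the innermost declaration in scope, while the member and the global can still be reached explicitly through this-> and the scope-resolution operator:

        #include <iostream>

        int shadowed = 1;

        struct Foo
        {
            Foo() : shadowed(2) {}

            void bar(int shadowed = 3)
            {
                std::cout << shadowed << std::endl;       // the parameter: innermost declaration here
                std::cout << this->shadowed << std::endl; // the data member, named explicitly
                std::cout << ::shadowed << std::endl;     // the global, named explicitly
                {
                    int shadowed = 4;
                    std::cout << shadowed << std::endl;   // the block-local: now the innermost
                    // The parameter has no qualified name, so it cannot be
                    // referred to from inside this inner block.
                }
            }

            int shadowed;
        };

        int main()
        {
            Foo().bar();
        }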

    Read the article

  • Were the first assemblers written in machine code?

    - by The111
    I am reading the book The Elements of Computing Systems: Building a Modern Computer from First Principles, which contains projects encompassing the build of a computer from boolean gates all the way up to high-level applications (in that order). The current project I'm working on is writing an assembler in a high-level language of my choice, to translate from Hack assembly code to Hack machine code (Hack is the name of the hardware platform built in the previous chapters). Although the hardware has all been built in a simulator, I have tried to pretend that I am really constructing each level using only the tools available to me at that point in the real process.

    That said, it got me thinking. Using a high-level language to write my assembler is certainly convenient, but for the very first assembler ever written (i.e. in history), wouldn't it need to be written in machine code, since that's all that existed at the time?

    And a related question: how about today? If a brand new CPU architecture comes out, with a brand new instruction set and a brand new assembly syntax, how would the assembler be constructed? I'm assuming you could still use an existing high-level language to generate binaries for the assembler program: if you know the syntax of both the assembly and machine languages for your new platform, then the task of writing the assembler is really just a text-analysis task and is not inherently tied to that platform (i.e. it doesn't need to be written in that platform's machine language). Which is the very reason I am able to "cheat" while writing my Hack assembler in 2012 and use a preexisting high-level language to help me out.
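    As a toy illustration of the "text analysis" point, a minimal C++ sketch that translates just the Hack A-instruction (@value) into its 16-bit binary form; a real assembler would also handle C-instructions, labels, and a symbol table, but none of that has to run on the Hack machine itself:

        #include <bitset>
        #include <iostream>
        #include <string>

        // Translate a Hack A-instruction such as "@21" into its 16-bit
        // machine form: a leading 0 bit followed by a 15-bit constant.
        std::string assemble_a_instruction(const std::string& line)
        {
            int value = std::stoi(line.substr(1));   // drop the '@'
            return std::bitset<16>(value).to_string();
        }

        int main()
        {
            std::cout << assemble_a_instruction("@21") << std::endl;  // 0000000000010101
        }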

    Read the article

  • What does the ".align" x86 Assembler directive do exactly? [migrated]

    - by Sinister Clock
    I will list exactly what I do not understand, and show you the parts I cannot make sense of.

    First off, the .align directive:

        .align integer, pad

    The documentation says the .align directive "causes the next data generated to be aligned modulo integer bytes."

    What is implied by "causes the next data generated to be aligned modulo integer bytes"? I can surmise that the next data generated is a memory-to-register transfer, no? Modulo would imply the remainder of a division. I do not understand "to be aligned modulo integer bytes". What would be the remainder of a simple data declaration, and how would the next data generated being aligned by a remainder be useful? If the next data is aligned modulo, that is saying the next generated data, whatever that means exactly, is the remainder of an integer? That makes absolutely no sense to me.

    What specifically would an .align directive, say .align 8, issued in x86 for a data byte compiled from a C char (i.e., char CHARACTER = 0;) be for? Or one coded directly with that directive, rather than in assembly generated from compiling C?

    I have debugged in assembly and noticed that any C/C++ data declarations, like chars, ints, floats, etc., get the directive .align 8 attached to them, along with other directives like .bss, .zero, .globl, .text, .Letext0, .Ltext0. What are all of these directives for, or at least the ones central to my question? I have learned a lot of the main x86 assembly instructions, but was never introduced to or pointed at all of these strange directives. How do they affect the opcodes, and are all of them necessary?
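    For what it's worth, the "modulo" wording just means the assembler pads the current location so that the next item's offset is a multiple of the given integer (with the x86/ELF flavour of .align the argument is a byte count; on some other targets it is a power-of-two exponent, so the assembler manual is worth checking). A small C++ sketch of the arithmetic:

        #include <cstdint>
        #include <iostream>

        // Pad an offset up to the next multiple of 'alignment'.
        uint64_t align_up(uint64_t offset, uint64_t alignment)
        {
            uint64_t padding = (alignment - offset % alignment) % alignment;
            return offset + padding;
        }

        int main()
        {
            // A datum that would land at offset 13 while ".align 8" is in
            // effect gets 3 bytes of padding so it starts at offset 16.
            std::cout << align_up(13, 8) << std::endl;  // prints 16
        }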

    Read the article
