Search Results

Search found 6912 results on 277 pages for 'assembly resolution'.

Page 25 of 277

  • Change screen resolution from terminal

    - by Keith
    When I enabled my Nvidia graphics card, it changed my screen resolution to one larger than the screen itself. As a result, I cannot access anything that was previously on the right side of the screen. How do I undo this? I originally had 8.04 and was able to change the resolution to whatever I wanted. I'm a new user: I can copy and paste commands from a post into a terminal, but that's about it; I have no idea what they are or what they mean.
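
    A minimal sketch of the usual fix, assuming an xrandr-capable setup (the output name DVI-0 is a placeholder; take the real one from the first command's output):

        xrandr                                   # list outputs and the modes each one supports
        xrandr --output DVI-0 --mode 1280x1024   # switch the named output back to a mode that fits the panel

    With the proprietary Nvidia driver, the nvidia-settings GUI is the other common route.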

    Read the article

  • How can I change the display resolution? (GMA 900)

    - by Paulo.woo
    How can I change the display resolution with a GMA 900 under Ubuntu 11.10? I also wonder how to set up the graphics driver on an Ubuntu system. I just installed Ubuntu 11.10 on my desktop (a Hewlett-Packard dc7100 USDT). I checked the system configuration and it seems Ubuntu doesn't recognize my graphics chipset: I can only choose 800x600 and 1024x768, and I want 1280x800. Can anyone help me? Thanks!
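
    For what it's worth, a hedged sketch of the standard xrandr recipe for adding a missing mode (the output name VGA1 is an assumption; read it from a bare xrandr call, and note that 1280x800 may simply exceed what the GMA 900 reports for that port):

        cvt 1280 800 60        # prints the modeline used on the next line
        xrandr --newmode "1280x800_60.00" 83.50 1280 1352 1480 1680 800 803 809 831 -hsync +vsync
        xrandr --addmode VGA1 1280x800_60.00
        xrandr --output VGA1 --mode 1280x800_60.00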

    Read the article

  • Windows screen resolution change: lengthen timeout

    - by Jonathan J
    Occasionally I need to change the screen resolution of the console of a Windows virtual machine using XenCenter. When you attempt this change, Windows will revert if you don't accept the changed resolution within 15 seconds of clicking 'apply.' (Normally, this is a good thing.) The problem is that if I have a slow connection between XenCenter and the Xen hypervisor, the virtual console display may not refresh quickly enough to allow me to respond within 15 seconds. As a result, I can't change the resolution, even though the new resolution is valid. Is there a way to increase the timeout before reverting to the original display resolution?

    Read the article

  • Force Xorg to run at a specific resolution

    - by Z9iT
    From my past experience (using Windows XP), this particular monitor works only at 60 Hz, the best resolution being 1024x768. I have installed and configured Ubuntu 12.04 Minimal (on a USB stick) so that most of the time only the terminal is used; whenever there's a need for a GUI, I issue the startx command to go into GNOME. The problem is that on this particular system, issuing that command causes trouble, because the default refresh rate doesn't match the monitor: the display keeps flickering and is utterly unreadable. It is visible that GNOME has loaded, and the default wallpaper and desktop items can be made out, but the refresh rate is wrong. I am looking for an option to the startx command that will force the refresh rate to 60 Hz and the resolution, preferably, to 1024x768.
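
    startx itself has no resolution switch; the mode and refresh rate live in the X configuration. A minimal sketch of an /etc/X11/xorg.conf that pins the mode (the sync ranges are assumptions for a typical 1024x768@60 monitor; wrong values can blank a display, so use the panel's documented limits if you have them):

        Section "Monitor"
            Identifier  "Monitor0"
            HorizSync   30.0 - 62.0
            VertRefresh 56.0 - 61.0    # capping near 60 Hz keeps X from picking a faster mode
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Monitor    "Monitor0"
            SubSection "Display"
                Modes "1024x768"
            EndSubSection
        EndSection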

    Read the article

  • What is the significance of these different width, height and resolution parameters?

    - by (author name garbled)
    An image with a pixel resolution of 640x480 has additional dimension and resolution parameters according to exiftool, and I'm unsure what they mean. Why are the X/Y Resolution parameters the same? Should they not reflect the pixel dimensions of the image? What does Exif Image Size mean, and how is it different from the pixel dimensions? What is the focal plane? Does it have any relation to the device used to capture this image?

        $ exiftool evil1.jpg | egrep 'Width|Height|Resolution'
        X Resolution                 : 180
        Y Resolution                 : 180
        Resolution Unit              : inches
        Exif Image Width             : 400
        Exif Image Height            : 300
        Focal Plane X Resolution     : 8114.285714
        Focal Plane Y Resolution     : 8114.285714
        Focal Plane Resolution Unit  : inches
        Image Width                  : 640
        Image Height                 : 480

    If needed, the original image can be obtained with:

        here=http://www.pythonchallenge.com/pc/return/evil1.jpg
        wget --user=huge --password=file $here
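
    A worked reading of those tags, assuming the usual EXIF semantics: X/Y Resolution are a print density, not pixel counts, so they describe how large the image prints (640 px / 180 dpi ≈ 3.6 in wide, 480 px / 180 dpi ≈ 2.7 in tall), and there is no reason for the two axes to differ. Focal Plane X/Y Resolution describe the camera's sensor in pixels per inch of sensor surface, so 640 px / 8114.29 px/in ≈ 0.079 in ≈ 2 mm of sensor width. Exif Image Width/Height (400x300 here) often records the size the camera originally wrote, or metadata left stale by a later resize, which is why it can disagree with the actual 640x480 pixel data.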

    Read the article

  • .xprofile isn't enough

    - by BrianXP7
    I've been trying to find out how to change the screen resolution on the login screen and for other users; .xprofile only affects my own account. I've been searching for a few months but have found nothing. Please help. It would be easier if the "Default Resolution" option were still there in the monitor settings. Also, I'm afraid of editing xorg.conf; last time was ugly... My specs: HP dc5000 Small Form Factor, Ubuntu 11.10 Oneiric Ocelot i386, Intel Pentium 4 2.40 GHz, Intel 82865G x86/MMX/SSE2 integrated graphics (Standard Experience, OpenGL 1.3, and it runs Unity 3D :P), SoundMAX integrated audio, Broadcom integrated Ethernet, 994.1 MiB RAM, 38.3 GB HDD, Acer X203H (maximum resolution 1600x900).
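
    On 11.10 the login screen is drawn by LightDM, which can run an xrandr command before the greeter appears. A sketch, assuming a LightDM setup, with the output name VGA1 and the script path as placeholders:

        # /etc/lightdm/lightdm.conf
        [SeatDefaults]
        display-setup-script=/usr/local/bin/fix-resolution.sh

        #!/bin/sh
        # /usr/local/bin/fix-resolution.sh -- runs as root when X starts, before the greeter
        xrandr --output VGA1 --mode 1600x900

    Make the script executable (chmod +x); because it runs for the X server itself rather than for one account, it covers the login screen and every user.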

    Read the article

  • Screen Resolution Change

    - by user75997
    Good evening, members. 1) I recently installed Plymouth Manager and accidentally set my resolution to 1024x768-24, which does not fit my laptop; every time my system reboots, an error message appears: "error: incorrect settings 1024X768-24". I tried to change the resolution from System Settings > Displays, but now only 1024x768 is shown in the drop-down. Kindly help me reset the display properties to their defaults. 2) Additionally, Plymouth Manager no longer responds when I click on it, and I want to uninstall it; kindly help me with that too. Thank you, K. Arun Kumar
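
    A hedged sketch of the uninstall half, assuming the tool is the third-party "Plymouth Manager" package; check the real package name first, since removing the wrong plymouth packages can break boot:

        dpkg -l | grep -i plymouth            # find the exact package name
        sudo apt-get purge plymouth-manager   # package name assumed from the question
        sudo update-grub                      # regenerate the boot config afterwards

    The boot resolution such tools set usually lives in GRUB_GFXMODE in /etc/default/grub; setting it back to auto there and re-running update-grub is the typical reset.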

    Read the article

  • NVIDIA X Server Settings: no custom resolution option

    - by WiiTold
    I just did a fresh install of Ubuntu 14.04 and did only three things:

    1) Added the xorg-edgers PPA and installed the driver:

        $ sudo add-apt-repository ppa:xorg-edgers/ppa
        $ sudo apt-get update
        $ sudo apt-get install nvidia-340

    2) Installed the updates packages:

        $ sudo apt-get install nvidia-current-updates nvidia-settings-updates

    3) Went to Software & Updates > Additional Drivers and chose "Using NVIDIA binary driver - version 340.32 from nvidia-340 (open source)". I had to do step 3 because after step 1 I still had driver version 304.

    Now to the main part: I can't set up a custom resolution. When I ran Ubuntu 12.04 a year ago, there was an option in NVIDIA X Server Settings called "Add custom resolution" or something like that, and it worked fine. Now this option is gone. How can I change/add a custom resolution?
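
    With the NVIDIA binary driver, xrandr --newmode is often rejected, so the usual workaround is a Modeline in xorg.conf plus a ModeValidation override. A sketch, assuming a 1920x1080@60 target (generate your own timings with cvt):

        Section "Monitor"
            Identifier "Monitor0"
            # from `cvt 1920 1080 60`
            Modeline "1920x1080_60" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Monitor    "Monitor0"
            Option     "ModeValidation" "AllowNonEdidModes"
            SubSection "Display"
                Modes "1920x1080_60"
            EndSubSection
        EndSection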

    Read the article

  • Azure git deployment - missing references in 2nd assembly

    - by Dan
    I'm trying to set up Bitbucket deployment to an Azure website. I have Bitbucket and Azure linked successfully, but when I push to Bitbucket, the deployment fails on the Azure site. If I click on 'View Log', it shows the following compile errors:

        D:\Windows\Microsoft.NET\Framework\v4.0.30319\Microsoft.Common.targets(1578,5): warning MSB3245: Could not resolve this reference. Could not locate the assembly "System.Web.Mvc, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors. [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]
        D:\Windows\Microsoft.NET\Framework\v4.0.30319\Microsoft.Common.targets(1578,5): warning MSB3245: Could not resolve this reference. Could not locate the assembly "WebMatrix.WebData, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors. [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]
        CustomMembershipProvider.cs(5,7): error CS0246: The type or namespace name 'WebMatrix' could not be found (are you missing a using directive or an assembly reference?) [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]
        CustomMembershipProvider.cs(9,38): error CS0246: The type or namespace name 'ExtendedMembershipProvider' could not be found (are you missing a using directive or an assembly reference?) [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]
        Models\AccountModels.cs(3,18): error CS0234: The type or namespace name 'Mvc' does not exist in the namespace 'System.Web' (are you missing an assembly reference?) [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]
        CustomMembershipProvider.cs(198,37): error CS0246: The type or namespace name 'OAuthAccountData' could not be found (are you missing a using directive or an assembly reference?) [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]
        Models\AccountModels.cs(40,10): error CS0246: The type or namespace name 'Compare' could not be found (are you missing a using directive or an assembly reference?) [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]
        Models\AccountModels.cs(40,10): error CS0246: The type or namespace name 'CompareAttribute' could not be found (are you missing a using directive or an assembly reference?) [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]
        Models\AccountModels.cs(73,10): error CS0246: The type or namespace name 'Compare' could not be found (are you missing a using directive or an assembly reference?) [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]
        Models\AccountModels.cs(73,10): error CS0246: The type or namespace name 'CompareAttribute' could not be found (are you missing a using directive or an assembly reference?) [C:\DWASFiles\Sites\<projname>\VirtualDirectory0\site\repository\<projname>.Common\<projname>.Common.csproj]

    Note that these compile errors are against another assembly in my project (the assembly where I put the business logic). When Googling, the only mention I found was about having to set the "Copy Local" flag to true for those references. I've tried this, but still got the same errors. This all compiles fine locally. Any ideas?
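
    A common cause on Azure's build servers: System.Web.Mvc and WebMatrix.WebData are present locally (installed by Visual Studio) but never reach the server. One hedged fix is referencing them as NuGet packages with package restore enabled, so the server downloads them at build time; a sketch of the packages.config entries (pin the versions to whatever the project actually compiles against):

        <?xml version="1.0" encoding="utf-8"?>
        <packages>
          <package id="Microsoft.AspNet.Mvc" version="4.0.20710.0" targetFramework="net40" />
          <package id="Microsoft.AspNet.WebPages.WebData" version="2.0.20710.0" targetFramework="net40" />
        </packages>

    Microsoft.AspNet.WebPages.WebData is the package that carries WebMatrix.WebData; "Copy Local" alone doesn't help when the DLL was never committed or restored on the server.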

    Read the article

  • InvalidCastException when creating an instance using assembly.CreateInstance

    - by Yossi Dahan
    I'm looking for an explanation for the following. I have an assembly that I load using:

        Assembly assembly = Assembly.LoadFrom(filename);

    I then loop over all the types in the assembly, wanting to find out whether a type implements a particular interface and, if so, to get an instance of that type. I've tried several things that did not work, but when I fell back to the most basic (and probably inefficient) way, I realised there's something more fundamental I don't understand:

        foreach (Type t in assembly.GetTypes())
        {
            foreach (Type i in t.GetInterfaces())
            {
                if (i.FullName == pluginInterfaceType.FullName)
                {
                    object o = assembly.CreateInstance(t.ToString());
                    IInterface plugin = (IInterface)o;

    That last line causes an InvalidCastException, despite the fact that the type created definitely implements that interface. Furthermore, if I use Activator.CreateInstance instead of Assembly.CreateInstance (which I don't want to do), casting to the interface works just fine.
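
    For context, the classic cause here is assembly load contexts: Assembly.LoadFrom can bring in a second copy of the assembly that defines IInterface, and a type compiled against one copy cannot be cast to the name-identical interface from the other (which is also why the name comparison above succeeds while the cast fails). A hedged diagnostic sketch, reusing the question's own names:

        // hypothetical check, assuming `t` is the matched plugin type from the loop above
        Type hostSide   = typeof(IInterface);
        Type pluginSide = t.GetInterface(hostSide.FullName);  // matched by name, like the original loop

        // two different locations here mean two copies of the interface assembly are
        // loaded -- exactly the condition under which (IInterface)o throws
        Console.WriteLine(hostSide.Assembly.Location);
        Console.WriteLine(pluginSide.Assembly.Location);

    The usual cure is to keep the interface in one shared assembly next to the host and make sure the plugin folder does not ship its own duplicate copy of that interface DLL.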

    Read the article

  • Relationship between .NET ClassLoader and Assembly

    - by smwikipedia
    I am wondering about the relationship between the .NET ClassLoader and Assembly. I used the "!dumpdomain xxxx" command and got the following output:

        Domain 1: 00522108
        LowFrequencyHeap: 0052212c
        HighFrequencyHeap: 00522178
        StubHeap: 005221c4
        Stage: OPEN
        SecurityDescriptor: 00523430
        Name: BoxUnbox.exe
        Assembly: 0056eb88 [C:\Windows\assembly\GAC_32\mscorlib\2.0.0.0__b77a5c561934e089\mscorlib.dll]
        ClassLoader: 0056ec08
        SecurityDescriptor: 0056c078
        Module Name
        56d71000 C:\Windows\assembly\GAC_32\mscorlib\2.0.0.0__b77a5c561934e089\mscorlib.dll
        Assembly: 005794f0 []
        ClassLoader: 00579570
        SecurityDescriptor: 0057a018
        Module Name
        00152c5c

    I noticed that "ClassLoader" and "Assembly" appear in pairs; it seems each Assembly is accompanied by its own ClassLoader. Why is it like this? Isn't it a little noisy? Why not just assign one ClassLoader to an AppDomain and use it to load all the needed assemblies into the AppDomain? Wouldn't that design be more elegant? Many thanks.

    Read the article

  • What should be the image resolution for Nexus One or Droid?

    - by sunil
    Hi, As Android supports multiple devices from different manufacturers, there are different screen resolutions to support. The table available at http://developer.android.com/intl/fr/guide/practices/screens_support.... is not very clear to me. It shows WVGA and FWVGA under MDPI for large screens and under HDPI for normal screens. So if an image is kept in drawable-mdpi and its resolution is 320x480, which image will be used by a large-screen MDPI device? Moreover, there are two screen resolutions for HDPI, i.e. 480x800 and 480x854, so for which screen resolution should the image be built? I want to place a background image, but it looks distorted in the WVGA emulator since the image's resolution is 320x480. I have read about nine-patch images, but I think they are better suited to button and EditText backgrounds, which stretch according to their content. Can someone please guide me? Regards, Sunil

    Read the article

  • How to make a resolution-independent camera preview on Android?

    - by histeria
    Hi all, I'm making an Android 1.6 app that uses the phone's camera. To make this app resolution independent, I need to set a compatible aspect ratio for the camera preview on a SurfaceView. In the 1.6 SDK there is no way to get the supported preview sizes for the camera. Is it possible to use a 4:3 or 3:2 aspect ratio and get no errors with that? On the other hand, I need a way to write an XML layout that gives this SurfaceView that (unknown) aspect ratio at every resolution. I assume it is not possible to change the SurfaceView size at runtime. Can I do it with "dp" units? Or is the alternative to build the layout programmatically? Some apps, like Vignette or the stock Android camera application, use tricks like black bars (Vignette) or a fixed button bar to achieve something similar, but I don't know how to do that for arbitrary resolutions. Any ideas? Thanks!

    Read the article

  • Linux Mint reset display resolution from console

    - by wullxz
    I have Linux Mint 13 Xfce in a VMware Workstation 8 VM. I changed the resolution from 800x600 to 1280x768, and now I get logged out immediately whenever I try to log in. I knew how to get back to my old resolution in the xorg.conf days, but Linux Mint now uses xrandr, which won't list any displays when run, because X is not running (of course not: I can't log in through the GUI). I know there are configuration files in /etc/X11/Xsession.d/, because I once configured a Debian-based thin client's resolution in a file called /etc/X11/Xsession.d/91configure_display, but that file doesn't exist in my Linux Mint VM. So, how do I reset my X screen resolution from the console? Edit: I forgot to mention that I can't change the resolution from the console either:

        # xrandr -s 800x600
        Can't open display

    This message appears every time I use xrandr or xrandr -s *resolution*. Update: I tried what bWowk suggested:

        # export DISPLAY=:0.0
        # xrandr -s 800x600
        No protocol specified
        No protocol specified
        Can't open display :0.0

    So that doesn't work either. Isn't there a configuration file that is executed every time X starts? X is running, by the way: ps aux | grep X shows one process, /usr/bin/X.
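
    A hedged sketch of the console-side reset for this setup, assuming Xfce's per-user display cache is what keeps applying the bad mode; move it aside rather than deleting it outright, then log in again:

        # run from the console, logged in as the affected user
        mv ~/.config/xfce4/xfconf/xfce-perchannel-xml/displays.xml ~/displays.xml.bak

    (The "Can't open display" errors are expected: xrandr only talks to a running X session via $DISPLAY plus its auth cookie, which a bare console login doesn't have.)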

    Read the article

  • How were the first compilers made?

    - by Sauron
    I always wonder this, and perhaps I need a good history lesson on programming languages. Since most compilers nowadays are written in C, how were the very first compilers made (i.e., before C)? Or were all those languages just interpreted? That said, I still don't understand how even the first assembly language was done. I understand what assembly language is, but I don't see how they got the VERY first assembly language working: how did they map the first commands (like mov R21, or whatever) to their binary equivalents?

    Read the article

  • Artificial Intelligence implemented in x86 Assembly? [closed]

    - by Bigyellow Bastion
    Okay, so I decided that for my upcoming operating system, I'll do basically everything in x86 assembly, using only 16-bit mode. I will need to write the software hosted on it once I have something up and running, and I'll definitely post the source and a VM-executable file. But for now I'm stuck on how to implement the AI code for some of the games I'm making to host on it. AI in assembly is tedious and sometimes seems almost impossible, especially complex AI (I'm talking SNES Super Mario World 2: Yoshi's Island AI here, by the way, not Pong AI). I was thinking it would be such a hassle that I'd have to bring in a higher-level language, like maybe C++ or C#, but then I'd have extra work linking it into a binary my OS can host, and that adds unnecessary work I wanted to avoid. (I don't want a complex system; I want everything as bare-bones as possible, avoiding libraries, APIs, and linkable formats for now, to keep everything directly accessible to the kernel's API.)

    Read the article

  • MIPS assembly: how to declare integer values in the .data section?

    - by Barney
    I'm trying to get my feet wet with MIPS assembly language using the MARS simulator. My main problem is: how do I initialize a set of memory locations so that I can access them later via assembly language instructions? For example, I want to initialize addresses 0x10010000 - 0x10010003 with the values 0x99, 0x87, 0x23, 0x45. I think this can be done in the data declaration (.data) section of my program, but I'm not sure of the syntax. Is this possible? Alternatively, in the .data section, how do I store integer values at some memory location (I don't care where; I just want to be able to reference them)? So I'm looking for the equivalent of the C declaration "int x = 20, y = 30, z = 90;". I know how to do that using MIPS instructions, but is it possible to declare something like that in the .data section of a MIPS assembly program?
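
    A minimal sketch of both declarations in MARS syntax (label names are arbitrary; MARS places .data at 0x10010000 by default, so the first directive lands exactly on that address):

        .data
        bytes:  .byte 0x99, 0x87, 0x23, 0x45   # four bytes at consecutive addresses
        x:      .word 20                        # the C "int x = 20, y = 30, z = 90;"
        y:      .word 30
        z:      .word 90

        .text
        main:   la   $t0, bytes                 # address of the byte array
                lbu  $t1, 0($t0)                # $t1 = 0x99
                lw   $t2, x                     # $t2 = 20 (MARS expands this pseudo-instruction)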

    Read the article

  • Maven 2 assembly with dependencies: jar under scope "system" not included.

    - by YuppieNetworking
    Hello, I am using the maven-assembly plugin to create a jar of my application, including its dependencies, as follows:

        <assembly>
          <id>macosx</id>
          <formats>
            <format>tar.gz</format>
            <format>dir</format>
          </formats>
          <dependencySets>
            <dependencySet>
              <includes>
                <include>*:jar</include>
              </includes>
              <outputDirectory>lib</outputDirectory>
            </dependencySet>
          </dependencySets>
        </assembly>

    (I omitted some other stuff that is not related to the question.) So far this has worked fine: it creates a lib directory with all dependencies. However, I recently added a new dependency whose scope is system, and it does not get copied to the lib output directory. I must be missing something basic here, so I call for help. The dependency I just added is:

        <dependency>
          <groupId>sourceforge.jchart2d</groupId>
          <artifactId>jchart2d</artifactId>
          <version>3.1.0</version>
          <scope>system</scope>
          <systemPath>${project.basedir}/external/jchart2d-3.1.0.jar</systemPath>
        </dependency>

    The only way I was able to include this dependency was by adding the following to the assembly element:

        <files>
          <file>
            <source>external/jchart2d-3.1.0.jar</source>
            <outputDirectory>lib</outputDirectory>
          </file>
        </files>

    However, this forces me to change the pom and the assembly file whenever this jar is renamed, if ever. It also just seems wrong. I have tried <scope>runtime</scope> in the dependencySet and <include>sourceforge.jchart2d:jchart2d</include>, with no luck. So how do you include a system-scoped jar in your assembly file in Maven 2? Thanks a lot.
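
    One hedged alternative worth noting: the assembly plugin generally treats system-scoped dependencies as supplied by the platform and skips them, so the usual cure is to drop system scope altogether and install the jar into the local repository, making it an ordinary compile-scoped dependency the dependencySet will pick up:

        mvn install:install-file \
            -Dfile=external/jchart2d-3.1.0.jar \
            -DgroupId=sourceforge.jchart2d \
            -DartifactId=jchart2d \
            -Dversion=3.1.0 \
            -Dpackaging=jar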

    Read the article

  • Help understanding differences between #define, const and enum in C and C++ at the assembly level

    - by martin
    Recently, I have been looking at the assembly code generated for #define, const and enum:

    C code (#define):

        3 #define pi 3
        4 int main(void)
        5 {
        6     int a, r = 1;
        7     a = 2*pi*r;
        8     return 0;
        9 }

    Assembly code (for lines 6 and 7 in the C code) generated by GCC:

        6 mov $0x1, -0x4(%ebp)
        7 mov -0x4(%ebp), %edx
        7 mov %edx, %eax
        7 add %eax, %eax
        7 add %edx, %eax
        7 add %eax, %eax
        7 mov %eax, -0x8(%ebp)

    C code (enum):

        2 int main(void)
        3 {
        4     int a, r = 1;
        5     enum { pi = 3 };
        6     a = 2*pi*r;
        7     return 0;
        8 }

    Assembly code (for lines 4 and 6 in the C code) generated by GCC:

        6 mov $0x1, -0x4(%ebp)
        7 mov -0x4(%ebp), %edx
        7 mov %edx, %eax
        7 add %eax, %eax
        7 add %edx, %eax
        7 add %eax, %eax
        7 mov %eax, -0x8(%ebp)

    C code (const):

        4 int main(void)
        5 {
        6     int a, r = 1;
        7     const int pi = 3;
        8     a = 2*pi*r;
        9     return 0;
        10 }

    Assembly code (for lines 7 and 8 in the C code) generated by GCC:

        6 movl $0x3, -0x8(%ebp)
        7 movl $0x3, -0x4(%ebp)
        8 mov -0x4(%ebp), %eax
        8 add %eax, %eax
        8 imul -0x8(%ebp), %eax
        8 mov %eax, 0xc(%ebp)

    I found that with #define and enum the assembly code is the same: the compiler uses three add instructions to perform the multiplication. However, when const is used, an imul instruction appears. Does anyone know the reason behind this?
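
    One plausible explanation, assuming these were compiled without optimization: #define and enum both yield compile-time integer constants, so the compiler sees a = 2*3*r and strength-reduces the multiply-by-six into the three add instructions. A const int, by contrast, is a real object with storage; at -O0 the compiler does not propagate its value, so it loads pi from the stack and must emit a genuine imul against that memory operand. With optimization enabled, all three versions typically compile to the same code.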

    Read the article

  • Do assembly strong names change when new versions of .Net are released?

    - by Ryan Michela
    I'm trying to load an assembly that was installed as part of .NET 3.5 SP1 using Assembly.Load() by referencing its strong name. This works fine on my computer right now, but is it future-proof? Will the strong names of core .NET assemblies change when patches are installed or new versions of the .NET Framework are released? If so, how can I load an assembly from the GAC without using its strong name?

    Read the article

  • Laptop Asus P50IJ with Intel 4500M GMA output going to a Dell 1907FP external monitor will not allow

    - by ProfessionalAmateur
    Hello - I just purchased an Asus P50IJ-X2 laptop, which has an Intel GMA 4500M video card and runs Windows 7. At work I connect this laptop to a Dell 1907FP LCD, which has a maximum resolution of 1280x1024. No matter what I do, Windows will not allow the laptop to set a resolution higher than 1024x768 on this LCD monitor. I've even gone to the extent of downloading PowerStrip (I'd post a link, but I'm new and can only enter one URL; if you google for powerstrip it's the first option) to create a custom driver for my monitor, thinking Windows was having a hard time seeing the resolutions it would accept. However, PowerStrip read the registry and properly sees the monitor and what it's capable of, so I'm now at a complete loss as to why Windows 7 will not allow me to set/use a 1280x1024 resolution on this external monitor (as my last laptop did, running Vista). The Intel documentation (http://software.intel.com/en-us/articles/quick-reference-guide-to-intel-integrated-graphics/) indicates that the GMA 4500M should be able to run up to a 2560x1600 maximum resolution, and the Dell 1907FP specification states it can run up to 1280x1024, but no matter what, the computer will not allow anything higher than 1024x768. I'm completely baffled, but I would really like to output this laptop at a reasonable resolution; 1024x768 makes me feel like I'm using my mom's computer. Any help would be greatly appreciated! Here are some attached images (I apologize for the links; being new, I cannot post images) that should help explain this better: Image 1 - from PowerStrip, showing the monitor's maximum accepted resolution and, at the top right, the maximum resolution my PC currently allows (http://imgur.com/agrno.png). Image 2 - my Windows 7 resolution picker (http://imgur.com/3nv6q.png). Image 3 - the 'List all modes' option taken from Screen Resolution > Advanced Settings > List All Modes (http://imgur.com/AMREh.png). Image 4 - monitor information from the registry read by PowerStrip, showing the laptop is able to read the necessary info from the LCD monitor (http://imgur.com/hUX4D.png).

    Read the article

  • HTML/CSS: What should I use to define image height/width to make it resolution independent?

    - by Tedy
    I've read all over the Internet that I should not define fonts (or anything) with absolute pixel height/width/size, and should instead use em, so that my web site can scale appropriately on higher-resolution displays. But what do I use to define IMAGE height/width? Images won't scale well; they look pixelated. UPDATE: To clarify, I'm not referring to page zoom. I'm asking how to make my web application resolution independent so that it will look correct on higher-DPI displays.
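
    A sketch of the em-based approach for images, with the class name as a placeholder: size the box in em so it tracks the font size, and let the browser scale the bitmap into it:

        img.scalable {
            width: 30em;     /* grows and shrinks with the inherited font size */
            height: auto;    /* preserves the aspect ratio */
        }

    Pixelation then becomes a source-resolution problem rather than a CSS one; the usual mitigation is serving the bitmap at roughly twice the largest size it will be displayed, or using a vector format (SVG) where possible.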

    Read the article

  • Are there algorithms for increasing the resolution of an image?

    - by David
    Are there any algorithms or tools that can increase the resolution of an image - besides just a simple zoom that makes each individual pixel in the image a little larger? I realize that such an algorithm would have to invent pixels that don't really exist in the original image, but I figured there might be some algorithm that could intelligently figure out what pixels to add to the image to increase its resolution.
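
    They exist: the classic family is interpolation kernels (nearest-neighbour, bilinear, bicubic, Lanczos), which synthesize the new pixels as weighted blends of their neighbours, while more recent "super-resolution" methods go further and infer detail from learned statistics. A minimal sketch with the Pillow imaging library (an assumption; the question names no language or toolkit):

        from PIL import Image

        img = Image.open("photo.png")
        # Lanczos resampling invents the new pixels by weighting nearby ones
        # with a windowed-sinc kernel -- much smoother than naive pixel doubling
        big = img.resize((img.width * 2, img.height * 2), Image.LANCZOS)
        big.save("photo_2x.png")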

    Read the article

  • Window resolution handling without JavaScript

    - by Sai Sasdhar
    Is there any CSS property to deal with window resolution? I created an HTML page assuming a total width of 1424px. When I open it at a different resolution, I see a scroll bar and the HTML does not adjust automatically. I don't want the scrollbar to be visible when the page is maximized. Is this possible without using JavaScript? And likewise, how do I scale things up when the page is viewed at a higher resolution than the one it was designed for?
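
    A sketch of the usual CSS-only fix: replace the fixed 1424px with a fluid width so the page tracks the viewport instead of assuming one resolution (the selector name is a placeholder):

        #page {
            width: 90%;          /* follows the window instead of assuming 1424px */
            max-width: 1424px;   /* never wider than the original design */
            margin: 0 auto;      /* stay centred on large screens */
        }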

    Read the article

  • Configure 27" 2560x1440 for a monitor with corrupt EDID

    - by Aras
    I am trying to get a monitor working with my Ubuntu laptop. The monitor is one of those cheap 27" Korean monitors with a 2560x1440 resolution -- and nothing else. Here are some of its specifications: 2560x1440 @ 60 Hz; only one dual-link DVI-D input -- no other input port (no HDMI or DisplayPort); no OSD; no scaler; reports a corrupt EDID; does 2560x1440 @ 60 Hz, did I say that already? Anyway, the monitor works beautifully with my Ubuntu desktop, which has an Nvidia card with DVI output. However, I am having problems using this monitor with my laptop. After some searching I found a few posts suggesting an active adapter for Mini DisplayPort, so I went and bought a Mini DisplayPort to dual-link DVI-D adapter. When using this adapter the monitor is recognized by nvidia-settings, but with incorrect resolution information: the monitor is misidentified and no other resolutions are available to set. This post on the Ubuntu forums and this other post on Overclock both suggest that the monitor is reporting a corrupt EDID. I have tried following their instructions, but so far I have not been able to display any image on the monitor from my laptop. The laptop I am using is an ASUS G75VW with a 1920x1080 screen. It has a VGA port, an HDMI 1.4a port, and a Mini DisplayPort. The graphics card is an Nvidia GeForce GTX 660M with 2 GB of dedicated memory. I am running Ubuntu 12.10, upgraded from 12.04 a few weeks ago. As I said, I have tried several suggestions, including specifying a Modeline in xorg.conf and linking to EDID files I found in the forum posts above, but I am not sure those EDID files suit my monitor. I think the solution consists of obtaining my monitor's EDID, fixing it, and modifying xorg.conf to force the Nvidia driver to load the correct resolution, but I am not sure what steps to take. Here is the part of the sudo xrandr --prop output related to this monitor:

        DP-1 connected 800x600+1920+0 (normal left inverted right x axis y axis) 0mm x 0mm
            SignalFormat: DisplayPort
                supported: DisplayPort
            ConnectorType: DisplayPort
            ConnectorNumber: 3 (0x00000003)
            _ConnectorLocation: 3 (0x00000003)
            800x600 60.3*+

    I was expecting to see the EDID block in this output, as mentioned in that post, but it is not there. After several hours of tweaking X configurations, I decided it was time to ask for help here. I would really appreciate it if someone with experience with EDID and X configuration could give me a hand.
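
    For reference, the read-edid package's sudo get-edid | parse-edid will dump what the monitor actually sends, and a hedged xorg.conf sketch for forcing the panel's one real mode looks like this (the timings below are the standard CVT reduced-blanking ones for 2560x1440@60, the usual choice for these single-mode panels; the ModeValidation line tells the NVIDIA driver to ignore the corrupt EDID):

        Section "Monitor"
            Identifier "Monitor0"
            # cvt -r 2560 1440 60
            Modeline "2560x1440R" 241.50 2560 2608 2640 2720 1440 1443 1448 1481 +hsync -vsync
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Monitor    "Monitor0"
            Option     "ModeValidation" "AllowNonEdidModes, NoEdidModes"
            SubSection "Display"
                Modes "2560x1440R"
            EndSubSection
        EndSection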

    Read the article
