Search Results

Search found 6318 results on 253 pages for 'hybrid graphics'.


  • How should I draw text and shapes in WPF and C#?

    - by peterhu
    I want to do basic WPF graphics, i.e. rectangles, lines, circles and text. When should I use a Drawing and when should I use a DrawingVisual? I have some code that uses Drawings and renders them to a DrawingImage, which is displayed in an Image control. Is this the right way? I could not see how to add text to a Drawing, and I had trouble positioning it too. Should I be rendering to a Canvas instead? I also have some code that uses a DrawingVisual and writes to a DrawingContext, which feels like WinForms. Is this the recommended way? Do you have any high-level advice on which APIs to use for basic graphics and labels? Will these options work on both Silverlight and the desktop?
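
    For the shapes-plus-labels case, here is a minimal sketch of the DrawingVisual/DrawingContext route the question describes, assuming the target is an off-screen bitmap shown in an Image control; the font, brushes and sizes are arbitrary, and text is positioned explicitly via FormattedText and a Point:

        using System.Globalization;
        using System.Windows;
        using System.Windows.Media;
        using System.Windows.Media.Imaging;

        static class BasicDrawing
        {
            // Draws a rectangle, a line and a positioned label into a DrawingVisual,
            // then rasterizes it so it can be assigned to Image.Source.
            public static BitmapSource RenderLabelledShapes(int width, int height)
            {
                var visual = new DrawingVisual();
                using (DrawingContext dc = visual.RenderOpen())
                {
                    dc.DrawRectangle(Brushes.White, new Pen(Brushes.Black, 1),
                                     new Rect(10, 10, width - 20, height - 20));
                    dc.DrawLine(new Pen(Brushes.Red, 2),
                                new Point(10, 10), new Point(width - 10, height - 10));

                    var label = new FormattedText("Hello WPF",
                        CultureInfo.InvariantCulture, FlowDirection.LeftToRight,
                        new Typeface("Segoe UI"), 16, Brushes.Blue);
                    dc.DrawText(label, new Point(20, 20));   // explicit positioning
                }

                var bitmap = new RenderTargetBitmap(width, height, 96, 96, PixelFormats.Pbgra32);
                bitmap.Render(visual);
                return bitmap;
            }
        }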

    Read the article

  • Is there any way I can use two monitors in the console in Linux?

    - by Alex
    I have recently become the proud owner of two monitors in my workspace (OK, not owner, but you know what I mean) and I'd like to use both of them at once. The problem is, I much prefer to use a Linux server console over a desktop environment. The graphics card in the machine is a GTX 295 (don't ask why; it's a long story), so I essentially have two graphics cards, each with a DVI output. Is there any way I can get the console to stretch across two screens, or will I have to install desktop Ubuntu for this to work?

    Read the article

  • What are the various approaches for generating PDFs?

    - by andthereitgoes
    I have an idea for an app that would take some Flash content containing graphics and images - various geometric shapes, polygons and some arbitrary images - and convert it to PDF. Since I envision this app being used by multiple users, I want the process to be quick and scalable. One possible solution I could think of is a small Flash client capable of assembling the above-mentioned graphics and images, generating some sort of XML, and sending it to a server running a Java process that renders the PDF using iText. I was wondering what the other possible ways to do it are, or the best practices. Technology isn't an issue; open source or commercial. I am looking for ideas to make the process fast and scalable. Most importantly, I don't want the back-end, server-side PDF rendering engine to constrain the Flash client's capabilities. I would appreciate it if you could share your tech stack ideas. Thanks a lot!
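
    Purely as a sketch of the server half of that pipeline: assuming the client posts a small XML scene description, the server can replay it onto a PDF canvas. The example below uses iTextSharp (the .NET port of iText) rather than iText/Java, and the rect element and its attributes are invented for illustration:

        using System.IO;
        using System.Xml.Linq;
        using iTextSharp.text;
        using iTextSharp.text.pdf;

        static class PdfRenderer
        {
            // Turns a tiny, hypothetical XML scene description into a one-page PDF.
            public static byte[] Render(string sceneXml)
            {
                var doc = new Document(PageSize.A4);
                using (var ms = new MemoryStream())
                {
                    PdfWriter writer = PdfWriter.GetInstance(doc, ms);
                    doc.Open();
                    PdfContentByte canvas = writer.DirectContent;

                    // <rect x="..." y="..." w="..." h="..."/> elements become stroked rectangles.
                    foreach (XElement rect in XDocument.Parse(sceneXml).Descendants("rect"))
                    {
                        canvas.Rectangle(
                            (float)rect.Attribute("x"), (float)rect.Attribute("y"),
                            (float)rect.Attribute("w"), (float)rect.Attribute("h"));
                        canvas.Stroke();
                    }

                    doc.Close();
                    return ms.ToArray();
                }
            }
        }

    Keeping the XML vocabulary the only contract between the Flash client and the renderer is what makes the back end swappable and horizontally scalable.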

    Read the article

  • Problem using GDI+ with multiple threads (VB.NET)

    - by Joe B
    I think it would be best if I just copied and pasted the code (it's very trivial):

        Private Sub Main() Handles MyBase.Shown
            timer.Interval = 10
            timer.Enabled = True
        End Sub

        Private Sub Form1_Paint(ByVal sender As System.Object, ByVal e As System.Windows.Forms.PaintEventArgs) Handles MyBase.Paint
            e.Graphics.DrawImage(image, 0, 0)
        End Sub

        Private Sub tick() Handles timer.Elapsed
            Using g = Graphics.FromImage(image)
                g.Clear(Color.Transparent)
                g.DrawLine(Pens.Red, 0 + i, 0 + i, Me.Width - i, Me.Height - i)
            End Using
            Me.Invalidate()
        End Sub

    An exception, "The object is currently in use elsewhere", is raised during the tick event. Could someone tell me why this happens and how to solve it? Thanks.
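
    The Elapsed event of a System.Timers.Timer fires on a thread-pool thread while Paint runs on the UI thread, and GDI+ objects such as the shared image are not safe for concurrent access - hence "The object is currently in use elsewhere". Two common ways out are switching to a System.Windows.Forms.Timer (whose Tick fires on the UI thread) or serializing access with a lock. A hedged C# sketch of the lock approach, assuming image and i are fields as in the original:

        private readonly object imageLock = new object();

        // Timer callback (worker thread): draw into the shared off-screen image.
        private void Tick(object sender, System.Timers.ElapsedEventArgs e)
        {
            lock (imageLock)
            {
                using (var g = Graphics.FromImage(image))
                {
                    g.Clear(Color.Transparent);
                    g.DrawLine(Pens.Red, i, i, Width - i, Height - i);
                }
            }
            Invalidate();   // ask the UI thread to repaint
        }

        // Paint handler (UI thread): read the same image under the same lock.
        private void Form1_Paint(object sender, PaintEventArgs e)
        {
            lock (imageLock)
            {
                e.Graphics.DrawImage(image, 0, 0);
            }
        }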

    Read the article

  • Obtaining kerning information

    - by chadb
    How can I obtain kerning information from GDI to then use with GetKerningPairs? The documentation says of nNumPairs: "The number of pairs in the lpkrnpair array. If the font has more than nNumPairs kerning pairs, the function returns an error." However, I do not know how many pairs to pass in, and I don't see a way to query for it.

    EDIT: I tried the following, but it still gives me 0:

        Font* myFont = new Font(L"Times New Roman", 10);
        Bitmap* bitmap = new Bitmap(256, 256, PixelFormat32bppARGB);
        Graphics* g = new Graphics(bitmap);
        SelectObject(g->GetHDC(), myFont);
        //DWORD numberOfKerningPairs = GetKerningPairs( g->GetHDC(), -1, NULL );
        DWORD numberOfKerningPairs = GetKerningPairs( g->GetHDC(), INT_MAX, NULL );
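
    Two observations, offered as a sketch rather than a definitive answer: GetKerningPairs documents that passing NULL for the pair array makes it return the total number of kerning pairs of the font currently selected into the DC, and SelectObject needs a real GDI HFONT (a GDI+ Font object is not one; in GDI+/C++ you would go through GetLogFontW and CreateFontIndirect). The same two-call pattern is shown below from C# with the usual P/Invoke declarations - treat the signatures as assumptions to verify:

        using System;
        using System.Drawing;
        using System.Runtime.InteropServices;

        static class Kerning
        {
            [StructLayout(LayoutKind.Sequential)]
            struct KERNINGPAIR { public ushort wFirst; public ushort wSecond; public int iKernAmount; }

            [DllImport("gdi32.dll")]
            static extern uint GetKerningPairs(IntPtr hdc, uint nNumPairs, [Out] KERNINGPAIR[] pairs);

            [DllImport("gdi32.dll")]
            static extern IntPtr SelectObject(IntPtr hdc, IntPtr hgdiobj);

            [DllImport("gdi32.dll")]
            static extern bool DeleteObject(IntPtr hObject);

            public static uint CountKerningPairs(Font font)
            {
                using (var bmp = new Bitmap(1, 1))
                using (var g = Graphics.FromImage(bmp))
                {
                    IntPtr hdc = g.GetHdc();
                    IntPtr hFont = font.ToHfont();            // a real HFONT, unlike the GDI+ Font object
                    IntPtr old = SelectObject(hdc, hFont);

                    uint count = GetKerningPairs(hdc, 0, null);   // NULL array: returns the pair count
                    var pairs = new KERNINGPAIR[count];
                    if (count > 0)
                        GetKerningPairs(hdc, count, pairs);       // second call fills the array

                    SelectObject(hdc, old);
                    DeleteObject(hFont);
                    g.ReleaseHdc(hdc);
                    return count;
                }
            }
        }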

    Read the article

  • Resizing gives me an overly heavy image

    - by phenevo
    Hi, I'm resizing a 1200x900, 556 KB JPEG with this method (height = 400):

        public static Image ResizeImage(Image imgToResize, int height)
        {
            int destWidth;
            int destHeight;
            int sourceWidth = imgToResize.Width;
            int sourceHeight = imgToResize.Height;
            float nPercent = 0;
            float nPercentH = 0;
            nPercentH = ((float)height / (float)sourceHeight);
            nPercent = nPercentH;
            destWidth = (int)(sourceWidth * nPercent);
            destHeight = height;
            Bitmap b = new Bitmap(destWidth, destHeight);
            Graphics g = Graphics.FromImage((Image)b);
            g.InterpolationMode = InterpolationMode.HighQualityBicubic;
            g.DrawImage(imgToResize, 0, 0, destWidth, destHeight);
            g.Dispose();
            return b;
        }

    It gives me a 533x400 JPEG that is still 555 KB. Why is this photo so heavy? For a 2156x1571, 2111 KB JPEG I get a 533x400, 556 KB JPEG. Why is the first case so bad?
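
    The Bitmap the method returns is an uncompressed in-memory image; the size on disk is decided entirely by how it is re-encoded when saved, so the usual suspect is the JPEG encoder quality used at save time rather than the resize itself. A hedged sketch of saving with an explicit quality setting (the value 75 is just an example):

        using System.Drawing;
        using System.Drawing.Imaging;
        using System.Linq;

        static class JpegSaver
        {
            // Re-encodes the image as JPEG with an explicit quality (0-100).
            public static void SaveJpeg(Image img, string path, long quality = 75)
            {
                ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
                    .First(c => c.FormatID == ImageFormat.Jpeg.Guid);

                using (var parameters = new EncoderParameters(1))
                {
                    parameters.Param[0] = new EncoderParameter(Encoder.Quality, quality);
                    img.Save(path, jpegCodec, parameters);
                }
            }
        }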

    Read the article

  • Is it acceptable to design my GLSurfaceView as a main control class?

    - by Omega
    I'm trying to structure a game I'm making in Android so that I have a sound, flexible design. Right now I'm looking at where I can tie my game's rules engine and graphics engine together and what should be in between them. At a glance, I've been eyeing my implementation of GLSurfaceView, where various screen events are captured. My rationale would be to create an instance of my game engine and graphics engine here and receive events and state changes to trigger updates of either where applicable. Further to this, in the future, the GLSurfaceView implementation could also store stubs for players during a network game and implementations of computer opponents and dispatch them appropriately. Does this seem like a sensible design? Are there any kinds of improvements I can make? Thanks for any input!
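
    Not an answer so much as a sketch of the separation the question is circling: one common shape is to keep the view/surface layer as a thin controller that translates input into commands for the rules engine and hands engine state to the renderer, so neither engine knows about the other or about the view. All names and types below are invented for illustration, and the sketch is in C# for consistency with the other snippets on this page, though the idea maps directly onto a GLSurfaceView:

        // Rules engine owns all game state; the renderer only reads a snapshot of it.
        public abstract record GameCommand
        {
            public sealed record Tap(float X, float Y) : GameCommand;
        }

        public sealed record GameState(int Score);

        public interface IGameEngine
        {
            void Apply(GameCommand command);   // mutate state according to the rules
            GameState Snapshot();              // read-only view for rendering
        }

        public interface IRenderer
        {
            void Draw(GameState state);        // translate state into draw calls
        }

        // The surface/view layer reduces to a thin dispatcher.
        public sealed class GameSurfaceController
        {
            private readonly IGameEngine engine;
            private readonly IRenderer renderer;

            public GameSurfaceController(IGameEngine engine, IRenderer renderer)
            {
                this.engine = engine;
                this.renderer = renderer;
            }

            public void OnTap(float x, float y) => engine.Apply(new GameCommand.Tap(x, y));

            public void OnDrawFrame() => renderer.Draw(engine.Snapshot());
        }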

    Read the article

  • Performance implications of using a variable versus a magic number

    - by Nathan
    I'm often confused by this. I've always been taught to name numbers I use often with variables or constants, but if it reduces the efficiency of the program, should I still do it? Here's an example:

        private int CenterText(Font font, PrintPageEventArgs e, string text)
        {
            int recieptCenter = 125;
            int stringLength = Convert.ToInt32(e.Graphics.MeasureString(text, font).Width);
            return recieptCenter - stringLength / 2;
        }

    The above code uses a named variable, but runs slower than this code:

        private int CenterText(Font font, PrintPageEventArgs e, string text)
        {
            return 125 - Convert.ToInt32(e.Graphics.MeasureString(text, font).Width / 2);
        }

    In this example the difference in execution time is minimal, but what about in larger blocks of code?
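
    For what it's worth, naming the number doesn't have to cost anything: a const is a compile-time constant, so the compiler substitutes the literal 125 wherever the name is used and the JIT sees exactly the same code as the magic-number version. A minimal sketch:

        private const int ReceiptCenter = 125;   // folded into the IL as the literal 125

        private int CenterText(Font font, PrintPageEventArgs e, string text)
        {
            int stringWidth = Convert.ToInt32(e.Graphics.MeasureString(text, font).Width);
            return ReceiptCenter - stringWidth / 2;
        }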

    Read the article

  • Speed of interpolation algorithms, C# and C++ working together.

    - by Kaminari
    Hello. I need a fast implementation of popular interpolation algorithms. I figured that for such simple algorithms C# will be much slower than C++, so I'm thinking of writing some native code and using it from my C# GUI. First I ran some tests: a few operations on a 1024x1024x3 matrix took 32 ms in C# and 4 ms in C++, and that's basically what I need. "Interpolation" is not quite the right word, however, because I only need it for downscaling. The question is: will it be faster than the C# methods in Drawing2D?

        Image outputImage = new Bitmap(destWidth, destHeight, PixelFormat.Format24bppRgb);
        Graphics grPhoto = Graphics.FromImage(outputImage);
        grPhoto.InterpolationMode = InterpolationMode.*; // all of them
        grPhoto.DrawImage(bmp,
            new Rectangle(0, 0, destWidth, destHeight),
            new Rectangle(0, 0, sourceWidth, sourceHeight),
            GraphicsUnit.Pixel);
        grPhoto.Dispose();

    Some of these methods run in 20 ms and some in 80. Is there a way to do it faster?
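
    One middle ground to try before dropping to C++ is to read the pixels once with LockBits and do the downscale yourself in managed code. The sketch below is a plain nearest-neighbour reduction for 24bpp images (no filtering, so the quality is worse than HighQualityBicubic); it is only meant to show the LockBits pattern, not to be the final resampler:

        using System;
        using System.Drawing;
        using System.Drawing.Imaging;
        using System.Runtime.InteropServices;

        static class FastDownscale
        {
            public static Bitmap NearestNeighbour(Bitmap src, int destWidth, int destHeight)
            {
                var dest = new Bitmap(destWidth, destHeight, PixelFormat.Format24bppRgb);

                BitmapData sd = src.LockBits(new Rectangle(0, 0, src.Width, src.Height),
                                             ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
                BitmapData dd = dest.LockBits(new Rectangle(0, 0, destWidth, destHeight),
                                              ImageLockMode.WriteOnly, PixelFormat.Format24bppRgb);
                try
                {
                    var srcBytes = new byte[sd.Stride * sd.Height];
                    var destBytes = new byte[dd.Stride * dd.Height];
                    Marshal.Copy(sd.Scan0, srcBytes, 0, srcBytes.Length);

                    for (int y = 0; y < destHeight; y++)
                    {
                        int sy = y * src.Height / destHeight;
                        for (int x = 0; x < destWidth; x++)
                        {
                            int sx = x * src.Width / destWidth;
                            int si = sy * sd.Stride + sx * 3;   // 3 bytes per pixel (B, G, R)
                            int di = y * dd.Stride + x * 3;
                            destBytes[di] = srcBytes[si];
                            destBytes[di + 1] = srcBytes[si + 1];
                            destBytes[di + 2] = srcBytes[si + 2];
                        }
                    }

                    Marshal.Copy(destBytes, 0, dd.Scan0, destBytes.Length);
                }
                finally
                {
                    src.UnlockBits(sd);
                    dest.UnlockBits(dd);
                }
                return dest;
            }
        }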

    Read the article

  • Creating a Linux Desktop Environment

    - by Alon
    Suppose I want to create my own desktop environment for Linux, without X, like Google did with Android. Where do I start? Is it actually just a normal application that draws things and starts after the kernel boots? And how does it draw - using OpenGL, or is there something more generic? And what about graphics drivers: do you have to develop custom graphics drivers for your desktop, or do they come with the Linux kernel? Note: it's for normal PCs, not embedded devices. Thanks.

    Read the article

  • Why do GPUs overheat?

    - by JAD
    About a year ago, I added a 9800GT (1 GB version) and a Corsair CX500 PSU to an HP M8000N computer. A few weeks ago the HDD overheated and I decided to transfer the GPU and PSU to a new build, which consists of:

        i3 @ 3.3 GHz
        Gigabyte H61 micro-ATX motherboard
        4 GB RAM
        500 GB WD HDD
        DVD-RW drive
        Cooler Master Elite 430 tower

    Once I had Win7 up and running, I installed all the essential drivers that came with the Gigabyte motherboard CD. However, whenever I tried installing the Graphics Media Accelerator driver, the computer would crash and enter an endless boot sequence on the next startup. I skipped that driver and installed the CD driver for the 9800GT, which by now is a year old. Everything was working fine; WEI rated my GPU at 6.6 for graphics and Aero performance. However, after updating my Nvidia drivers to the latest version, WEI dropped my rating to 3.3 for Aero and 4.7 for graphics performance. Just to make sure everything was OK, I ran Bad Company 2 on medium settings. The first few minutes ran just fine at a smooth framerate, so I dismissed this as Windows being Windows. About 6 hours later I ran BC2 again. This time I averaged anywhere from 2-5 FPS. I checked the GPU temperature through GPU-Z, and it came back as 120°C. The problem with this is that the computer had been on for six hours up to that point. Wouldn't the card have experienced a reactor core meltdown a lot sooner than that? Granted, the computer was "sleeping" some of the time, but still... The next day I took out a temperature gun and ran some tests. I would point the laser at a very specific area on the reverse side of the card (not the fan or "front") and compare the temp reading with GPU-Z. After leaving the system on idle for a few minutes, I ran BC2 twice. Here are the results:

        GPU-Z reading / temp gun reading / time
        Null  / 22.3°C / computer off
        53°C  / 33.5°C / 1:49
        78°C  / 46°C   / 1:53  (first BC2 run; good framerate)
        102°C / 64.6°C / 2:01  (system is again on idle)
        113°C / 64.8°C / 2:10
        119°C / 71.8°C / 2:17  (second BC2 run; poor framerate)

    I should also mention that I took a temperature recording of another part of the GPU from 2:01 to 2:17. The temp in that area jumped from 75°C to 82.9°C in that time frame. This pretty much confirms that GPU-Z is reporting the temperature accurately, and the card is overheating. But I'd like to know why; the card is doing nothing and still the temperature climbs at a steady rate. I thoroughly cleaned the GPU and PSU with a can of compressed air when I salvaged them from the old HP M8000N, so dust can't be the issue. Similarly, the rest of the computer is brand new. I installed various Nvidia drivers, but no luck. It seems strange to me that a year-old card is suddenly failing on me; aren't they supposed to last at least two years? Could this be a driver issue? Is the motherboard faulty? Could the PSU be overfeeding the card on voltage? None of these seems likely, as the CPU, RAM and the rest of the computer have worked flawlessly and stayed well within respectable temperature ranges (the i3 lingers around 50°C, the HDD stays at 30°C, and so does the PSU). How can I pinpoint the issue?

    Read the article

  • How to deploy http://code.google.com/p/dyuproject/ into app engine

    - by portoalet
    Hi, I am trying to use OpenID/hybrid on App Engine, but so far no luck. No success with openid4java (because it creates sockets, etc.), and no luck with dyuproject either. How do I deploy dyuproject into my Java App Engine project? I just could not understand the structure of the code in http://dyuproject.googlecode.com/files/dyuproject.appspot.com-source-2009-10-08.zip - it is just so different from a default new Google web application. Many thanks; I have been struggling with this the whole week.

    Read the article

  • Is there a web application equivalent of Hypercard?

    - by Gabriel Cuvillier
    Recently I found an interesting Wiki/CMS/Database hybrid called Wagn, where the most important unit of information is the 'Card'. That terminology immediately made me think of Hypercard, and as expected, there is some "Hypercard-ness" in that application. Do you know of other web applications/frameworks with that "Hypercard-ness", or does its successor still need to be invented? Note: I insist on web applications because I already know the desktop ones.

    Read the article

  • NHibernate with StructureMap for a Non-Web Application

    - by Yoann. B
    Hi, what are best practices for injecting and managing the NHibernate session/transaction with StructureMap in a non-web application such as a Windows service? In a web context we use a per-request session lifecycle via StructureMap's Hybrid lifecycle, but in a Windows service I can't handle the IDisposable unit of work that way... Thanks.
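
    A hedged sketch of the usual replacement shape: instead of per-request, scope one unit of work to each piece of work the service processes (a queue message, a timer tick, and so on). The class below is plain NHibernate; how it gets registered and resolved in StructureMap (for example per nested container) is deliberately left out, since the exact registration API depends on the StructureMap version:

        using System;
        using NHibernate;

        // One unit of work per service operation, instead of per HTTP request.
        public sealed class NHibernateUnitOfWork : IDisposable
        {
            private readonly ISession session;
            private readonly ITransaction transaction;

            public NHibernateUnitOfWork(ISessionFactory factory)
            {
                session = factory.OpenSession();
                transaction = session.BeginTransaction();
            }

            public ISession Session => session;

            public void Commit() => transaction.Commit();

            public void Dispose()
            {
                if (transaction.IsActive) transaction.Rollback();   // not committed: roll back
                session.Dispose();
            }
        }

    Each piece of work the service picks up would create (or resolve) one of these, do its data access through Session, call Commit, and dispose it before moving on to the next item.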

    Read the article

  • Protection points in a survivable multicast network

    - by wantobegeek
    I am working on a project on survivable multicasting. I want to propose a hybrid (protection and restoration) scheme for that purpose. Can anyone help me with an approach to decide the protection points in a multicast tree? (The protection points will be those points up to which there is an alternate path from the multicast source (protection); from a protection point to the multicast destination the path will be dynamically restored.) Please suggest an approach to find the protection points. I found an approach called the caterpillar tree, which assigns the nodes on the spine of the caterpillar tree as protection points. Is there any other such approach?

    Read the article

  • Bracketing algorithm when root finding. Single root in "quadratic" function

    - by Ander Biguri
    I am trying to implement a root-finding algorithm. I am using the hybrid Newton-Raphson algorithm found in Numerical Recipes, which works pretty nicely. But I have a problem bracketing the root. While implementing it I realised that in several cases my functions have one real root and all the others imaginary (several of them, usually 6 or 9). The only root I am interested in is the real one, so that is not the problem. The thing is that the function approaches the root like a cubic function, just touching the y = 0 axis at that point... The Newton-Raphson method needs brackets of different sign, and all the bracketing methods I found don't work for this specific case. What can I do? It is pretty important to find that root in my program...

    EDIT: More problems: sometimes, due to really small numerical errors (say a variation of 1e-6 in some value), the "cubic" function does NOT have that real root; it is just imaginary with a negligible imaginary part (checked with MATLAB).

    EDIT 2: Much more information about the problem. OK, I need a root-finding algorithm. Info I have:

    - The root I need to find is in [0, 1]; if there are more roots outside that interval I am not interested in them.
    - The root is real. There may be imaginary roots, but I don't want them; probably all the other roots will be imaginary.
    - The root may be a double root at that point, but I think that doesn't actually matter in numerical analysis problems.
    - I need to use the root-finding algorithm several times during the overall calculation, but the function will always be a polynomial.
    - In one particular case, my polynomial will be similar to a quadratic function that just touches y = 0.

    Example of a real case: the coefficients may not be 100% precise, and that really slight imprecision may make the function not touch the y = 0 axis. I cannot solve just for this specific case, because in other cases the polynomial is perfectly normal and doesn't do anything "strange". The method I am actually using is a hybrid Newton-Raphson where, if the derivative is really small, it takes a bisection step instead of a Newton step (found in Numerical Recipes). MATLAB's answer for the function in the image:

        roots:
        0.853553390593276 + 0.353553390593278i
        0.853553390593276 - 0.353553390593278i
        0.146446609406726 + 0.353553390593273i
        0.146446609406726 - 0.353553390593273i
        0.499999999999996 + 0.000000040142134i
        0.499999999999996 - 0.000000040142134i

    The function is a real example I prepared where I know the answer I want is 0.5. Note: I still haven't completely checked some of the answers you people have given me (thank you!); I am just trying to give all the information I already have to complete the question.
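
    One workaround for the tangency case, offered as a hedged sketch rather than the Numerical Recipes method itself: at a double root f does not change sign, but f' does, so you can bracket and bisect the derivative on [0, 1] and then accept the candidate as a root of f only if |f| there is below a tolerance. The same tolerance also absorbs the 1e-6-style coefficient noise that pushes the root slightly off the real axis. Illustrative C# (for the example above it should return roughly 0.5):

        using System;

        static class TangentRoot
        {
            // Bisect the sign change of the derivative df on [a, b]; accept the result
            // as a root of f if |f| there is small enough. f and df are caller-supplied.
            public static double? FindTangentRoot(Func<double, double> f, Func<double, double> df,
                                                  double a, double b, double xTol = 1e-12, double fTol = 1e-6)
            {
                double fa = df(a);
                if (fa * df(b) > 0) return null;          // derivative does not change sign: no tangency here

                for (int i = 0; i < 200 && b - a > xTol; i++)
                {
                    double m = 0.5 * (a + b);
                    double fm = df(m);
                    if (fa * fm <= 0) { b = m; } else { a = m; fa = fm; }
                }

                double x = 0.5 * (a + b);
                return Math.Abs(f(x)) < fTol ? x : (double?)null;
            }
        }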

    Read the article

  • Has anyone produced an in-memory Git repository?

    - by Andrew Matthews
    I would like to be able to take advantage of the benefits of Git (and its workflows), but without the cost of disk access - I would like to leverage the distributed revision control capabilities of Git to produce something like a hybrid of memcached and Git (preferably in .NET). Is there such a beast out there?

    Read the article

  • Should I invest time in learning the Java language these days? (question from a greenhorn)

    - by dave-keiture
    Hi experts, assuming you've already had a chance to look through the lambda syntax proposed for Java 7 (and the other things that have happened with Java after Oracle bought Sun, plus the obvious problems in the Java Community Process), what do you think is the future of the Java language? Should I, as a Java greenhorn, invest time in learning the Java language (not talking about the core JVM, which will definitely survive anything and is worth the investment), or concentrate on Scala, Groovy, or other hybrid languages on the JVM platform? (I came into the Java world from PHP/Ruby.) Thanks in advance.

    Read the article

  • How to use OpenID+OAuth in my website?

    - by Yuan
    I want to log in to my website using a Google account. I can already log in via OpenID, but I don't know how to get the user's account and information from Google. Take this link (provided by Google): http://googlecodesamples.com/hybrid/ That sample can log in with the user's Google account and list all the documents in the user's Google Docs, so I guess that by using OAuth I can get the user's account (such as [email protected]) and related information, but I don't know how to do it. PS: I use PHP to write my website.

    Read the article

  • Mobile: HTML5 vs XHTML

    - by Sean
    I am building a mobile app (a hybrid mobile web app with a native shell), with most users on the iPhone (some on BlackBerry), and am wondering whether it should be written in HTML5 or XHTML. Any insight would be great.

    Read the article

  • What’s new in Silverlight 4 RC?

    - by pluginbaby
    I am here in Las Vegas for MIX10, where Scott Guthrie announced today the release of Silverlight 4 RC and the Visual Studio 2010 tools. You can now install VS2010 RC!!! As always, download links are here: www.silverlight.net He also said that the final version of Silverlight 4 will come next month (so April)! Four months ago I wrote a blog post on the new features of Silverlight 4 beta, so... what's new in the RC?

    Rich Text
    · RichTextArea renamed to RichTextBox
    · Text position and selection APIs
    · "Xaml" property for serializing text content
    · XAML clipboard format
    · FlowDirection support on Runs tag
    · "Format then type" support when dragging controls to the designer
    · Thai/Vietnamese/Indic support
    · UI Automation Text pattern

    Networking
    · UploadProgress support (Client stack)
    · Caching support (Client stack)
    · Sockets security restrictions removal (Elevated Trust)
    · Sockets policy file retrieval via HTTP
    · Accept-Language header

    Out of Browser (Elevated Trust)
    · XAP signing
    · Silent install and emulation mode
    · Custom window chrome
    · Better support for COM Automation
    · Cancellable shutdown event
    · Updated security dialogs

    Media
    · Pinned full-screen mode on secondary display
    · Webcam/Mic configuration preview
    · More descriptive MediaSourceStream errors
    · Content & Output protection updates
    · Updates to H.264 content protection (ClearNAL)
    · Digital Constraint Token
    · CGMS-A
    · Multicast
    · Graphics card driver validation & revocation

    Graphics and Printing
    · HW accelerated Perspective Transforms
    · Ability to query page size and printable area
    · Memory usage and perf improvements

    Data
    · Entity-level validation support of INotifyDataErrorInfo for DataGrid
    · XPath support for XML

    Parser
    · New architecture enables future innovation
    · Performance and stability improvements
    · XmlnsPrefix & XmlnsDefinition attributes
    · Support setting order-dependent properties

    Globalization & Localization
    · Support for 31 new languages
    · Arabic, Hebrew and Thai input on Mac
    · Indic support

    More...
    · Update to DeepZoom code base with HW acceleration
    · Support for Private mode browsing
    · Google Chrome support (Windows)
    · FrameworkElement.Unloaded event
    · HTML Hosting accessibility
    · IsoStore perf improvements
    · Native hosting perf improvements (e.g., Bing Toolbar)
    · Consistency with Silverlight for Mobile APIs and Tooling
    · SDK: System.Numerics.dll; Dynamic XAP support (MEF); Frame/Navigation refresh support

    That's a lot! You will find more details at the following links: http://timheuer.com/blog/archive/2010/03/15/whats-new-in-silverlight-4-rc-mix10.aspx http://www.davidpoll.com/2010/03/15/new-in-the-silverlight-4-rc-xaml-features/ Technorati Tags: Silverlight

    Read the article

  • LightDM will not start after stopping it

    - by Sweeters
    I am running Ubuntu 11.10 "Oneiric Ocelot", and while trying to install the NVIDIA CUDA developer drivers I switched to a virtual terminal (Ctrl-Alt-F5) and stopped lightdm (the installation required that no X server instance be running) with sudo service lightdm stop. Restarting lightdm with sudo service lightdm start did not work: a couple of * Starting [...] lines were displayed, but the process hung (I do not remember at which point, but I think it was * Starting System V runlevel compatibility). I manually rebooted my laptop, and ever since, booting seems to hang, usually around the * Starting anac(h)ronistic cron [OK] log line (not consistently at that point, though). From that point on, I seem to be able to interact with my system only through a tty session (Ctrl-Alt-F1). I've tried purging and reinstalling both lightdm and gdm, as well as selecting each as the default display manager (through sudo dpkg-reconfigure [lightdm / gdm] or by manually editing /etc/X11/default-display-manager), through both apt-get and aptitude (that shouldn't make a difference anyway), after updating the packages, but the problem persists. Some of the responses I'm getting are the following:

    After running sudo dpkg-reconfigure lightdm (but not ... gdm) I get the following message:

        dpkg-maintscript-helper: warning: environment variable DPKG_MAINTSCRIPT_NAME missing
        dpkg-maintscript-helper: warning: environment variable DPKG_MAINTSCRIPT_PACKAGE missing

    After trying sudo service lightdm start or sudo start lightdm I get to see the boot loading screen again, but nothing changes. If I go back to the tty shell I see lightdm start/running, process <num>, but ps -e | grep lightdm gives no output.

    After trying sudo service gdm start or sudo start gdm I get the gdm start/running, process <num> message, and gdm-binary is supposedly an active process, but all that happens is that the screen blinks a couple of times and nothing else.

    Other candidate solutions I found on the web included running startx, but when I try that I get an error output: [...] Fatal server error: no screens found [...]. Moreover, I made sure that lightdm-gtk-greeter is installed, but that did not help either.

    Please excuse my not including complete outputs/logs; I am writing this post from another computer and it's hard to copy the complete logs manually. Also, I've seen several posts about similar problems, but either there was no fix or the one suggested did not work for me. In closing: please help! I very much hope to avoid reinstalling Ubuntu from scratch! :) Alex

    @mosi I did not manage to fix the NVIDIA kernel driver as per your instructions. I should perhaps mention that I'm on a Dell XPS 15 laptop with an NVIDIA Optimus graphics card, and that I have bumblebee installed (which installs the nvidia drivers during its installation, I believe). Issuing the mentioned commands I get the following:

        ~$ uname -r
        3.0.0-12-generic
        ~$ lsmod | grep -i nvidia
        nvidia 11713772 0
        ~$ dmesg | grep -i nvidia
        [ 8.980041] nvidia: module license 'NVIDIA' taints kernel.
        [ 9.354860] nvidia 0000:01:00.0: power state changed by ACPI to D0
        [ 9.354864] nvidia 0000:01:00.0: power state changed by ACPI to D0
        [ 9.354868] nvidia 0000:01:00.0: enabling device (0006 -> 0007)
        [ 9.354873] nvidia 0000:01:00.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16
        [ 9.354879] nvidia 0000:01:00.0: setting latency timer to 64
        [ 9.355052] NVRM: loading NVIDIA UNIX x86_64 Kernel Module 280.13 Wed Jul 27 16:53:56 PDT 2011

    Also, running aptitude search nvidia gives me the following:

        p  nvidia-173                      - NVIDIA binary Xorg driver, kernel module a
        p  nvidia-173-dev                  - NVIDIA binary Xorg driver development file
        p  nvidia-173-updates              - NVIDIA binary Xorg driver, kernel module a
        p  nvidia-173-updates-dev          - NVIDIA binary Xorg driver development file
        p  nvidia-96                       - NVIDIA binary Xorg driver, kernel module a
        p  nvidia-96-dev                   - NVIDIA binary Xorg driver development file
        p  nvidia-96-updates               - NVIDIA binary Xorg driver, kernel module a
        p  nvidia-96-updates-dev           - NVIDIA binary Xorg driver development file
        p  nvidia-cg-toolkit               - Cg Toolkit - GPU Shader Authoring Language
        p  nvidia-common                   - Find obsolete NVIDIA drivers
        i  nvidia-current                  - NVIDIA binary Xorg driver, kernel module a
        p  nvidia-current-dev              - NVIDIA binary Xorg driver development file
        c  nvidia-current-updates          - NVIDIA binary Xorg driver, kernel module a
        p  nvidia-current-updates-dev      - NVIDIA binary Xorg driver development file
        i  nvidia-settings                 - Tool of configuring the NVIDIA graphics dr
        p  nvidia-settings-updates         - Tool of configuring the NVIDIA graphics dr
        v  nvidia-va-driver                -
        v  nvidia-va-driver                -

    I've tried manually installing (sudo aptitude install <package>) the packages nvidia-common and nvidia-settings-updates, but to no avail. For example, sudo aptitude install nvidia-settings-updates returns the following log:

        Reading package lists...
        Building dependency tree...
        Reading state information...
        Reading extended state information...
        Initializing package states...
        Writing extended state information...
        No packages will be installed, upgraded, or removed.
        0 packages upgraded, 0 newly installed, 0 to remove and 83 not upgraded.
        Need to get 0 B of archives. After unpacking 0 B will be used.
        Writing extended state information...
        Reading package lists...
        Building dependency tree...
        Reading state information...
        Reading extended state information...
        Initializing package states...
        Writing extended state information...

    The same happens with the Linux headers (i.e. I cannot seem to be able to install linux-headers-3.0.0-12-generic). The output of aptitude search linux-headers is as follows:

        v  linux-headers                   -
        v  linux-headers                   -
        v  linux-headers-2.6               -
        i  linux-headers-2.6.38-11         - Header files related to Linux kernel versi
        i  linux-headers-2.6.38-11-generic - Linux kernel headers for version 2.6.38 on
        i A linux-headers-2.6.38-8         - Header files related to Linux kernel versi
        i A linux-headers-2.6.38-8-generic - Linux kernel headers for version 2.6.38 on
        v  linux-headers-3                 -
        v  linux-headers-3.0               -
        v  linux-headers-3.0               -
        i A linux-headers-3.0.0-12         - Header files related to Linux kernel versi
        p  linux-headers-3.0.0-12-generic  - Linux kernel headers for version 3.0.0 on
        p  linux-headers-3.0.0-12-generic- - Linux kernel headers for version 3.0.0 on
        p  linux-headers-3.0.0-12-server   - Linux kernel headers for version 3.0.0 on
        p  linux-headers-3.0.0-12-virtual  - Linux kernel headers for version 3.0.0 on
        p  linux-headers-generic           - Generic Linux kernel headers
        p  linux-headers-generic-pae       - Generic Linux kernel headers
        v  linux-headers-lbm               -
        v  linux-headers-lbm               -
        v  linux-headers-lbm-2.6           -
        v  linux-headers-lbm-2.6           -
        p  linux-headers-lbm-3.0.0-12-gene - Header files related to linux-backports-mo
        p  linux-headers-lbm-3.0.0-12-gene - Header files related to linux-backports-mo
        p  linux-headers-lbm-3.0.0-12-serv - Header files related to linux-backports-mo
        p  linux-headers-server            - Linux kernel headers on Server Equipment.
        p  linux-headers-virtual           - Linux kernel headers for virtual machines

    @heartsmagic I did try purging and reinstalling all nvidia driver packages, but it did not seem to make a difference. My xorg.conf file contains the following:

        # nvidia-xconfig: X configuration file generated by nvidia-xconfig
        # nvidia-xconfig: version 280.13 ([email protected]) Wed Jul 27 17:15:58 PDT 2011

        Section "ServerLayout"
            Identifier     "Layout0"
            Screen      0  "Screen0" 0 0
            InputDevice    "Keyboard0" "CoreKeyboard"
            InputDevice    "Mouse0" "CorePointer"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Mouse0"
            Driver         "mouse"
            Option         "Protocol" "auto"
            Option         "Device" "/dev/psaux"
            Option         "Emulate3Buttons" "no"
            Option         "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Keyboard0"
            Driver         "kbd"
        EndSection

        Section "Monitor"
            Identifier     "Monitor0"
            VendorName     "Unknown"
            ModelName      "Unknown"
            HorizSync       28.0 - 33.0
            VertRefresh     43.0 - 72.0
            Option         "DPMS"
        EndSection

        Section "Device"
            Identifier     "Device0"
            Driver         "nvidia"
            VendorName     "NVIDIA Corporation"
        EndSection

        Section "Screen"
            Identifier     "Screen0"
            Device         "Device0"
            Monitor        "Monitor0"
            DefaultDepth    24
            SubSection     "Display"
                Depth       24
            EndSubSection
        EndSection

    Read the article
