Search Results

Search found 5873 results on 235 pages for 'raster graphics'.


  • Rendering a DOM across multiple displays

    - by meetamit
    I'm building a data-driven animation with HTML and JavaScript to run in a web browser, and I would like to display it tiled across three 1080p monitors. This essentially yields a viewport that's 5760px wide and 1080px tall. Pretty large. Does anyone have experience setting up something like this? I have several questions below, but any tip would be appreciated:
    1. Is it reasonable to expect a DOM to render into such a large viewport at close to 60fps?
    2. I might choose to use canvas instead of SVG or HTML, but that would yield a giant canvas. Can a canvas with such a high resolution be performant? Everything depends on the complexity of the graphics I want to render, of course, but I'm looking to remove that factor from this question, so assume a canvas animation that runs at 60fps at 1920x1080. Would it run roughly as fast at three times the width?
    3. Would three.js and WebGL be a more appropriate approach at that resolution?
    4. How do you actually get Chrome or Firefox to span 3 monitors at full screen? Do I need a third-party solution of any kind?
    Thanks!
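    A quick pixel-count check (simple arithmetic, assuming per-pixel work dominates the frame cost) puts the "three times the width" question in perspective:

        \[ 5760 \times 1080 = 6\,220\,800 = 3 \times (1920 \times 1080) \]

    so each frame touches three times as many pixels as a single 1080p canvas, and fill-rate-bound drawing can be expected to cost roughly three times as much even if the per-frame JavaScript overhead stays constant.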

    Read the article

  • How can I get my monitor's maximum resolution without the proprietary AMD graphics driver installed?

    - by Venki
    I am using Ubuntu 14.04 with an AMD Radeon HD 5570 graphics card. The default open-source Radeon (REDWOOD) driver isn't letting me choose my monitor's maximum resolution (1366x768); only two resolutions are offered, 1024x768 and 800x600. If I give the command
        xrandr -s 1366x768
    the output is
        Size 1366x768 not found in available modes
    So, just for the sake of getting 1366x768, I am forced to install the proprietary driver that AMD provides on its site. But if I install it (which is itself quite a problem-prone process), I run into a lot of 'inconvenience': sometimes after an OS update the driver crashes Unity, and I then have to uninstall it from a tty and google around for a solution. I also encounter occasional screen tearing, and in addition I can't see my login screen (see this question, which states this particular problem). The main problem is that AMD does not update its driver as quickly as Ubuntu updates its OS, which is quite irritating. So: I want the maximum resolution (and performance) that my graphics card and monitor can give me without installing the 'problematic' proprietary driver from AMD. Is this possible? Suggestions please. Thanks in advance.
    PS: more system specs:
      - Intel i3 2100 processor
      - AMD P8H61-M PLUS2 motherboard
      - AMD Radeon HD 5570 graphics card
      - DELL monitor
    (BTW, thank you for reading through my elaborate description!)

    Read the article

  • How to disable discrete GPU using NVIDIA drivers?

    - by penzoiders
    I have a Dell Studio XPS 13 (aka 1340). As of 12.04 most things run smoothly out of the box, but I have some power-draining and warmness issues (if not to be called terrible heat issues). The system came with an NVIDIA GeForce 9500M (which has Hybrid SLI), and it shows up in "lspci" as these 2 cards:
        02:00.0 VGA compatible controller: NVIDIA Corporation G98 [GeForce 9200M GS] (rev a1)
        03:00.0 VGA compatible controller: NVIDIA Corporation C79 [GeForce 9400M G] (rev b1)
    I had to install nvidia-current over the nouveau driver because nouveau freezes the system after suspension; installing nvidia-current and running nvidia-xconfig fixed the resume process after suspension. By the way, both with nvidia-current and nouveau the system drains a lot of battery and heats up a lot. I suppose this is because the discrete GPU is always on. I don't really need 3D graphics on this system, beyond the minimum needed to run Unity and Compiz for window management. So my question is: how do I disable, using nvidia-current, the discrete GPU 9200M and use only the integrated one 9400M?
    Notes:
      - In the BIOS I have no option to disable the discrete GPU.
      - This, I think, is not applicable because of the suspension-freeze issue (with nouveau): https://help.ubuntu.com/community/HybridGraphics
      - I've found this, but I don't know which --sli option I should choose to fit my needs: http://manpages.ubuntu.com/manpages/hardy/man1/nvidia-xconfig.1.html
      - My system has no Optimus or CUDA, but can anyone tell me whether Bumblebee would work for me?

    Read the article

  • What is wrong with my specular Phong shading

    - by Thijser
    I'm sorry if this should be placed on Stack Overflow instead, but seeing as this is graphics-related I was hoping you guys could help me. I'm attempting to write a Phong shader and am currently working on the specular term. I came across the following formula:
        base * pow(dot(V, R), shininess)
    and attempted to implement it (V is the position of the viewer and R the reflection vector). This gave the following result and code:
        Vec3Df phongSpecular(const Vec3Df & vertexPos, Vec3Df & normal, const Vec3Df & lightPos, const Vec3Df & cameraPos, unsigned int index)
        {
            Vec3Df relativeLightPos = (lightPos - vertexPos);
            relativeLightPos.normalize();
            Vec3Df relativeCameraPos = (cameraPos - vertexPos);
            relativeCameraPos.normalize();
            int DotOfNormalAndLight = Vec3Df::dotProduct(normal, relativeLightPos);
            Vec3Df reflective = (relativeLightPos - (2 * DotOfNormalAndLight * normal)) * -1;
            reflective.normalize();
            float phongyness = Vec3Df::dotProduct(reflective, relativeCameraPos);
            if (phongyness < 0) {
                phongyness = 0;
            }
            float shininess = Shininess[index];
            float speculair = powf(phongyness, shininess);
            return Ks[index] * speculair;
        }
    I'm looking for something more like this:
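    For reference, the specular term in the Phong reflection model is usually written in terms of the light direction reflected about the surface normal. Assuming L, N and V are unit vectors pointing from the surface point towards the light, along the normal, and towards the viewer respectively, the standard form is

        \[ \mathbf{R} = 2(\mathbf{N}\cdot\mathbf{L})\,\mathbf{N} - \mathbf{L}, \qquad
           I_{\mathrm{spec}} = k_s \, \max(\mathbf{V}\cdot\mathbf{R},\, 0)^{\alpha} \]

    with α the shininess exponent. Checking each intermediate value in the code above against these definitions, keeping in mind that the dot products are real-valued rather than integer, is a good way to narrow down where the highlight goes wrong.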

    Read the article

  • Scale an image with unscalable parts

    - by Uko
    Brief description of the problem: imagine having some vector picture(s) with text annotations on the sides, outside of the picture(s). The task is to scale the whole composition while preserving the aspect ratio in order to fit some view-port. The tricky part is that the text is not scalable; only the picture(s) are. The distance between the text and the image is still relative to the whole image, but the text size is always constant. Example: let's assume that our total composition is two times larger than the view-port. Then we can just scale it by 1/2. But because the text parts have a fixed font size, they will end up larger than expected and the result won't fit in the view-port. One option I can think of is an iterative process where we repeatedly scale the composition until the delta between it and the view-port satisfies some precision. But this algorithm is quite costly, as it involves working with the graphics, and the image may be composed of a lot of components, which leads to a lot of matrix computations. What's more, this solution seems hard to debug, extend, etc. Are there any other approaches to solving this scaling problem?
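    When the annotations only add a fixed amount of width and height around the picture, the scale can often be computed in closed form instead of iteratively. As a minimal sketch, assuming the text sits to the left and/or right of the picture and only its width matters: let W be the view-port width, w the unscaled width of the scalable part (picture plus the gaps, which scale with it), and t the total width taken by the fixed-size text. The horizontal scale that makes the composition fit exactly is

        \[ s_x = \frac{W - t}{w} \]

    and the analogous expression in the vertical direction gives s_y; taking s = min(s_x, s_y) keeps the aspect ratio while guaranteeing the whole composition fits the view-port. This avoids touching the graphics at all, since only the bounding boxes are needed.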

    Read the article

  • Intel HD graphics with integrated TV tuner

    - by Tamir
    Hi all! I have a new Dell laptop with Intel HD graphics and an integrated TV tuner. How can I use this TV tuner? Should I install third-party software for it, or just configure something? I tried to google it but couldn't find a thing :-( Many thanks!

    Read the article

  • Which is a better graphics card

    - by michael
    Hi, can someone please give me advice on which of the following is the better graphics card?
      - Radeon HD4650 1GB
      - NVIDIA GeForce GT240M 1GB
    Or which brand is better in general, Radeon or NVIDIA? Thank you.

    Read the article

  • AS3 Embed Image Instead of drawing graphics

    - by David
    I've got a custom AS3 scrollbar that I need to modify. The scroll thumb (the draggable part of the scrollbar) is currently just drawn as a rectangle, but I need it to look more like a real scrollbar. I'm not sure how to modify the code below to use a scroll-thumb image instead:
        scrollThumb = new Sprite();
        scrollThumb.graphics.lineStyle();
        scrollThumb.graphics.beginFill(0x0066ff);
        scrollThumb.graphics.drawRect(0, 0, 1, 1);
        addChild(scrollThumb);
    I know that I would embed an image by doing something like this:
        [Embed(source="images/image1.png")]
        private static var Image1Class:Class;
    But then how do I set scrollThumb to the image? Thanks!

    Read the article

  • How can I make the Qt Graphics View Framework support custom layers

    - by jnblue
    Qt's Graphics View Framework is very powerful, but I have not found a way to support custom layers. In Qt there is QGraphicsScene::ItemLayer, but QGraphicsScene renders all items in this single layer. I want to manage the items with several layers, just like in Illustrator and CorelDRAW: only the items in the current layer should receive events, be selectable, or get keyboard focus, and the other (non-current) layers should not receive any scene events at all. The main reason for using layers is that I could catalogue a large number of items more clearly, and without needing to deliver events to every layer's items, I think the framework would be more efficient. A last question: does QGraphicsView support rendering several stacked graphics scenes at the same time? If it does, I think the "custom layers" problem could be solved that way. Thanks very much!

    Read the article

  • Playing with Graphics in Flex

    - by Anoop
    Hi all, I was just going through some code used to draw a chart. This code is written in the updateDisplayList of the item renderer of a column chart. I am not good at the graphics part of Flex. Can anybody please explain to me what this code is doing? I can see the final output, but I am not sure how it is achieved.
        var rc:Rectangle = new Rectangle(0, 0, width, height);
        var g:Graphics = graphics;
        g.clear();
        g.moveTo(rc.left, rc.top);
        g.beginFill(fill);
        g.lineTo(rc.right, rc.top);
        g.lineTo(rc.right, rc.bottom);
        g.lineTo(rc.left, rc.bottom);
        g.lineTo(rc.left, rc.top);
        g.endFill();
    Regards, PK

    Read the article

  • AS3 this.graphics calls do nothing

    - by zzz
    class A:
        [SWF(width='800', height='600', frameRate='24')]
        public class A extends MovieClip {
            private var b:B;
            public function A() {
                super();
                b = new B();
                addChild(b);
                addEventListener(Event.ENTER_FRAME, update);
            }
            private function update(e:Event):void {
                b.draw();
            }
        }
    class B:
        public class B extends MovieClip {
            public function draw():void {
                //! following code works well if put in constructor, but not here
                this.graphics.beginFill(0xff0000);
                this.graphics.drawCircle(200, 200, 50);
            }
        }
    The this.graphics calls do nothing in the draw method, but work fine inside B's constructor. What am I doing wrong?

    Read the article

  • Why is my second monitor not working?

    - by StampedeXV
    Since I got my new computer, I have a very weird problem. Facts:
    New computer:
      - Motherboard: ASRock Z77 Pro 3
      - Graphics card: Asus 1GB D5 X EN GTX560 DCII OC/2DI R
      - CPU: Intel i5-3570
      - Windows 7 64-bit
      - 500W beQuiet special edition (92% efficiency)
      - 8GB 1333MHz DDR3 Corsair RAM (CL9)
      - Scythe Mugen 2
      - 2 magnetic HDDs + 1 SSD
      - 1 DVD-R
    Old computer:
      - Motherboard: Asus P55 something
      - Graphics card: Asus 1GB D5 X EN GTX560 DCII OC/2DI R
      - CPU: Intel i7-870
      - Windows 7 64-bit
      - 550W Corsair
      - 8GB 1333MHz DDR3 Corsair RAM (CL9)
      - Scythe Mugen 3
      - 2 magnetic HDDs + 1 SSD
      - 1 DVD-R
    On the old computer it worked fine with two monitors. Moving to the new one (I took the same graphics card with me), it only works with one. The weird thing is: it doesn't matter which one, but if I connect both, only one is available. There is no reaction at startup (where normally, at least if I remember correctly, the monitor briefly goes from "standby" to "on"). Windows does not recognize a second monitor in the Device Manager. I have the latest drivers for the motherboard and graphics card, and the latest BIOS. I am out of ideas.
    Edit: completed computer setup

    Read the article

  • If I run two monitors from two different graphic cards, can I still have Twinview?

    - by rumtscho
    I am planning to get a second 2560x1440 monitor for home. The trouble is, I only have 1x DVI and 1x VGA on my graphics card (a 250 GT). I don't want to buy a new graphics card until the prices for the 500 series have stabilized, so probably not before summer (or will it happen earlier? I don't remember how it was for other series, and I couldn't find a long-term price history for video cards). The solution I had in mind is to take the 7600 GS from my old PC, which also has 1x DVI and 1x VGA, and run each monitor on a separate card. I have never done that, and I was wondering:
    1. Will I be able to run the monitors in TwinView then, or will I be stuck with separate X sessions?
    2. Are there some other disadvantages compared to a single-head graphics card? (I am using the proprietary driver because I need Compiz.)
    As an aside, how do I find out whether the DVI port on the old graphics card is dual-link?

    Read the article

  • Unity desktop "smears" (doesn't refresh) and shows no wallpaper

    - by Cedric Reichenbach
    Since a couple of days now, my unity desktop background smears everything, just like what old Windows versions were famous for: Of course, I tried rebooting a couple of times. Also, I switched graphics driver and I tried to change wallpaper and theme, but none of them solved the problem. What could be causing that problem, and where can I search on for its source? Infomation update I'm using Ubuntu 13.04 (not updated to 13.10 yet). The following command were all run from cinnamon (on the same Ubuntu installation). sudo lsb_release -a: No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 13.04 Release: 13.04 Codename: raring sudo uname -a: Linux cedric-MacBookPro 3.8.0-32-generic #47-Ubuntu SMP Tue Oct 1 22:35:23 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux sudo dpkg -l | grep xserver-xorg-video: ii xserver-xorg-video-all 1:7.7+1ubuntu4 amd64 X.Org X server -- output driver metapackage ii xserver-xorg-video-ati 1:7.1.0-0ubuntu2 amd64 X.Org X server -- AMD/ATI display driver wrapper ii xserver-xorg-video-cirrus 1:1.5.2-0ubuntu1 amd64 X.Org X server -- Cirrus display driver ii xserver-xorg-video-fbdev 1:0.4.3-0ubuntu1 amd64 X.Org X server -- fbdev display driver ii xserver-xorg-video-intel 2:2.21.6-0ubuntu4.3 amd64 X.Org X server -- Intel i8xx, i9xx display driver ii xserver-xorg-video-mach64 6.9.3-0ubuntu1 amd64 X.Org X server -- ATI Mach64 display driver ii xserver-xorg-video-mga 1:1.6.2-0ubuntu1 amd64 X.Org X server -- MGA display driver ii xserver-xorg-video-modesetting 0.7.0-0ubuntu2 amd64 X.Org X server -- Generic modesetting driver ii xserver-xorg-video-neomagic 1:1.2.7-0ubuntu1 amd64 X.Org X server -- Neomagic display driver ii xserver-xorg-video-nouveau 1:1.0.7-0ubuntu1 amd64 X.Org X server -- Nouveau display driver ii xserver-xorg-video-openchrome 1:0.3.1-0ubuntu1.13.04.1 amd64 X.Org X server -- VIA display driver ii xserver-xorg-video-qxl 0.1.0-0ubuntu3 amd64 X.Org X server -- QXL display driver ii xserver-xorg-video-r128 6.9.1-0ubuntu1 amd64 X.Org X server -- ATI r128 display driver ii xserver-xorg-video-radeon 1:7.1.0-0ubuntu2 amd64 X.Org X server -- AMD/ATI Radeon display driver ii xserver-xorg-video-s3 1:0.6.5-0ubuntu3 amd64 X.Org X server -- legacy S3 display driver ii xserver-xorg-video-savage 1:2.3.6-0ubuntu1 amd64 X.Org X server -- Savage display driver ii xserver-xorg-video-siliconmotion 1:1.7.7-0ubuntu1 amd64 X.Org X server -- SiliconMotion display driver ii xserver-xorg-video-sis 1:0.10.7-0ubuntu1 amd64 X.Org X server -- SiS display driver ii xserver-xorg-video-sisusb 1:0.9.6-0ubuntu1 amd64 X.Org X server -- SiS USB display driver ii xserver-xorg-video-tdfx 1:1.4.5-0ubuntu1 amd64 X.Org X server -- tdfx display driver ii xserver-xorg-video-trident 1:1.3.6-0ubuntu2 amd64 X.Org X server -- Trident display driver ii xserver-xorg-video-vesa 1:2.3.2-0ubuntu1 amd64 X.Org X server -- VESA display driver ii xserver-xorg-video-vmware 1:12.0.2+git.e5ac80d8-0ubuntu1 amd64 X.Org X server -- VMware display driver sudo lspci | grep VGA: 01:00.0 VGA compatible controller: NVIDIA Corporation GT216M [GeForce GT 330M] (rev a2)

    Read the article

  • GPU hangs when switching graphics cards

    - by Lie Ryan
    I have a laptop (Dell Inspiron N4110) with switchable graphics:
        $ lspci | grep VGA
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: ATI Technologies Inc NI Whistler [AMD Radeon HD 6600M Series] (rev ff)
    Normally my laptop starts with both graphics cards enabled, which makes it run very hot and the fan very noisy, so I have been using a small script to disable the Radeon card. For some time I was quite happy with this arrangement. However, I have been having some issues with the Intel card (IGD): it often hangs at random when running OpenGL apps. So I want to give the Radeon card (DIS) another chance. I have never been able to switch to the Radeon card, but recently I found out that if I do a "delayed switch" (DDIS):
        # echo "DDIS" > /sys/kernel/debug/vgaswitcheroo/switch
        root@lieryan-dell-ubuntu:/sys/kernel/debug/vgaswitcheroo# cat switch
        0:IGD:+:Pwr:0000:00:02.0
        1:DIS: :Pwr:0000:01:00.0
    and then log off (i.e. restart X), the screen switches to a pseudo-tty and then freezes there. In that situation the mouse and keyboard stop working, so I can't switch to another ptty. I tried ssh-ing in from another computer to salvage logs (dmesg at that point) and whatnot; I found out that when it freezes, the active graphics card is the AMD card:
        -- this is from ssh --
        # cat switch
        0:IGD: :Off:0000:00:02.0
        1:DIS:+:Pwr:0000:01:00.0
    but the GPU is apparently hung; looking at dmesg gives:
        ...
        [ 1411.649974] vga_switcheroo: client 0 refused switch
        [ 1411.649985] vga_switcheroo: setting delayed switch to client 1
        [ 1423.911759] vga_switcheroo: processing delayed switch to 1
        [ 1424.006564] fbcon: Remapping primary device, fb1, to tty 1-63
        [ 1424.006799] i915: switched off
        [ 1424.840351] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
        [ 1425.718088] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
        [ 1426.622377] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
        [ 1427.355683] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
        [ 1428.193549] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id
        ...
    (the invalid framebuffer id error is repeated many times over). I was able to recover successfully by switching back to the Intel card and restarting X from ssh, which indicates that only the Radeon card has problems switching. System info:
        $ uname -a
        Linux lieryan-dell-ubuntu 3.0.0-14-generic #23-Ubuntu SMP Mon Nov 21 20:28:43 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux
        $ lsb_release -a
        No LSB modules are available.
        Distributor ID: Ubuntu
        Description: Ubuntu 11.10
        Release: 11.10
        Codename: oneiric
    The laptop's BIOS has no option to select the graphics card, and the proprietary driver, fglrx, has never worked: when I installed it through Jockey ("Additional Drivers"), glxinfo showed rendering was still done by Mesa, the /sys/kernel/debug/vgaswitcheroo directory went missing, and the driver crashes with a traceback if I use xorg.conf to tell X to use fglrx. Does anyone have any idea whether it is possible to use this AMD card with either the radeon or the fglrx driver? logs: dmesg

    Read the article

  • Cannot get 3D OpenGL support in Vmware guests, how can I fix this?

    - by jjapol
    I have been working at this problem for 2 days now: I cannot for the life of me enable 3D support in VMware 9 guests. My specifications are:
      - Hardware: Dell Latitude E5520 laptop
      - Processor: Intel i7-2620M CPU @ 2.70GHz × 4
      - Memory: 8GB
      - Video: Intel Sandybridge Mobile x86/MMX/SSE2
      - OS: Ubuntu 12.04.1 LTS, 32 bit
      - VMware Workstation: 9.0.1 build-894247
      - VMware guest: Windows 7
    glxgears functions fine; the frame rate is ~60fps. Starting the Windows 7 guest in VMware throws the following errors:
        No 3D support is available from the host.
        Hardware graphics acceleration is not available.
    I've read through this VMware forum thread, but the hardware in that post is different (nVidia). I've followed the instructions at this Ask Ubuntu post as closely as possible, as the question is nearly the same as mine although my hardware is different. Answer 1, regarding setting mks.gl.allowBlacklistedDrivers = TRUE; in my vmx configuration file, causes the VM to crash when it starts. The second answer I followed as closely as possible:
      - I uninstalled VMware,
      - did sudo apt-get install build-essential linux-headers-$(uname -r) at a terminal,
      - added the PPA https://launchpad.net/~glasen/+archive/intel-driver,
      - then at a terminal did sudo apt-get update && sudo apt-get upgrade -y
    I reinstalled VMware and have the same results: no 3D in guests. I'm getting the feeling that something is awry with the Sandy Bridge driver, but I can't seem to come up with any solutions. Has anyone out there run across this problem? By the way, the operation of the likes of SolidWorks and AutoCAD within a Windows 7 guest does appear to be improved in VMware 9 vs VMware 8, in spite of the fact that 3D support is lacking in the guest. I'd also add that my glxinfo output was nearly identical to the glxinfo file posted at askubuntu.com/questions/181829/…; I had a total of seven minor differences per a comparison using Meld.

    Read the article

  • vga_switcheroo and Intel HD 3000 on Ubuntu 12.04

    - by Ikalou
    I'm trying to get vga_switcheroo to enable my integrated Intel HD 3000 instead of my ATI card. My problem is that there is no vgaswitcheroo directory in /sys/kernel/debug/ on my system. > grep -i switcheroo /boot/config-3.2.0-26-generic CONFIG_VGA_SWITCHEROO=y And yet: > sudo ls /sys/kernel/debug/ acpi bdi bluetooth dri extfrag gpio ieee80211 kprobes mce mmc0 regmap regulator sched_features suspend_stats tracing usb wakeup_sources x86 I am NOT using the fglrx driver. Here is the output of lspci; glxinfo | grep renderer: 00:00.0 Host bridge: Intel Corporation 2nd Generation Core Processor Family DRAM Controller (rev 09) 00:01.0 PCI bridge: Intel Corporation Xeon E3-1200/2nd Generation Core Processor Family PCI Express Root Port (rev 09) 00:16.0 Communication controller: Intel Corporation 6 Series/C200 Series Chipset Family MEI Controller #1 (rev 04) 00:19.0 Ethernet controller: Intel Corporation 82579LM Gigabit Network Connection (rev 04) 00:1a.0 USB controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #2 (rev 04) 00:1b.0 Audio device: Intel Corporation 6 Series/C200 Series Chipset Family High Definition Audio Controller (rev 04) 00:1c.0 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 1 (rev b4) 00:1c.1 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 2 (rev b4) 00:1c.2 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 3 (rev b4) 00:1c.3 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 4 (rev b4) 00:1c.7 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 8 (rev b4) 00:1d.0 USB controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #1 (rev 04) 00:1f.0 ISA bridge: Intel Corporation QM67 Express Chipset Family LPC Controller (rev 04) 00:1f.2 SATA controller: Intel Corporation 6 Series/C200 Series Chipset Family 6 port SATA AHCI Controller (rev 04) 01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Seymour [Radeon HD 6400M Series] 01:00.1 Audio device: Advanced Micro Devices [AMD] nee ATI Caicos HDMI Audio [Radeon HD 6400 Series] 24:00.0 FireWire (IEEE 1394): JMicron Technology Corp. IEEE 1394 Host Controller (rev 30) 24:00.1 System peripheral: JMicron Technology Corp. SD/MMC Host Controller (rev 30) 24:00.2 SD Host controller: JMicron Technology Corp. Standard SD Host Controller (rev 30) 25:00.0 Network controller: Intel Corporation Centrino Advanced-N 6205 (rev 34) 26:00.0 USB controller: NEC Corporation uPD720200 USB 3.0 Host Controller (rev 04) OpenGL renderer string: Gallium 0.4 on AMD CAICOS Both xserver-xorg-video-intel and xserver-xorg-video-radeon packages are installed. I know there are tons of posts about hybrid-graphics already but I couldn't quite find a solution to my problem. Does anyone know why is /sys/kernel/debug/vgaswitcheroo not showing?

    Read the article

  • Can't remove JPanel from JFrame while adding new class into it

    - by A.K.
    Basically, I have my Frame class, which instantiates all the properties for the JFrame and draws a JLabel with an image (my title screen). Then I made a separate JPanel with a start button on it, and a mouse listener that allows me to remove these objects while adding in a new Board() class (which paints the main game). Note: the JLabel is SEPARATE from the JPanel, but it still gets moved to the side by it. Problem: whenever I click the button, it only shows a little square of what I presume is my Board class trying to run. Code below for the Frame class:
        package OurPackage; //Made By A.K. 5/24/12
        //Contains Frame.

        import java.awt.BorderLayout;
        import java.awt.Color;
        import java.awt.Container;
        import java.awt.Dimension;
        import java.awt.Graphics;
        import java.awt.Graphics2D;
        import java.awt.GridBagLayout;
        import java.awt.GridLayout;
        import java.awt.Image;
        import java.awt.Rectangle;
        import java.awt.event.ActionEvent;
        import java.awt.event.ActionListener;
        import java.awt.event.KeyEvent;
        import java.awt.event.MouseAdapter;
        import java.awt.event.MouseEvent;
        import java.awt.event.MouseListener;
        import javax.swing.*;
        import javax.swing.plaf.basic.BasicOptionPaneUI.ButtonActionListener;

        public class Frame implements MouseListener {
            public static boolean StartGame = false;
            ImageIcon img = new ImageIcon(getClass().getResource("/Images/ActionJackTitle.png"));
            ImageIcon StartImg = new ImageIcon(getClass().getResource("/Images/JackStart.png"));
            public Image Title;
            JLabel TitleL = new JLabel(img);
            public JPanel panel = new JPanel();
            JButton StartB = new JButton(StartImg);
            JFrame frm = new JFrame("Action-Packed Jack");

            public Frame() {
                TitleL.setPreferredSize(new Dimension(1200, 420));
                frm.add(TitleL);
                frm.setLayout(new GridBagLayout());
                frm.add(panel);
                panel.setSize(new Dimension(220, 45));
                panel.setLayout(new GridBagLayout());
                panel.add(StartB);
                StartB.addMouseListener(this);
                StartB.setPreferredSize(new Dimension(220, 45));
                frm.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frm.setSize(1200, 420);
                frm.setVisible(true);
                frm.setResizable(false);
                frm.setLocationRelativeTo(null);
            }

            public static void main(String[] args) {
                new Frame();
            }

            public void mouseClicked(MouseEvent e) {
                StartB.setContentAreaFilled(false);
                panel.remove(StartB);
                frm.remove(panel);
                frm.remove(TitleL);
                //frm.setLayout(null);
                frm.add(new Board()); //Add Game "Tiles" Or Content. x = 1200
                frm.validate();
                System.out.println("Hit!");
            }

            @Override
            public void mouseEntered(MouseEvent arg0) {
                // TODO Auto-generated method stub
            }

            @Override
            public void mouseExited(MouseEvent arg0) {
                // TODO Auto-generated method stub
            }

            @Override
            public void mousePressed(MouseEvent arg0) {
                // TODO Auto-generated method stub
            }

            @Override
            public void mouseReleased(MouseEvent arg0) {
                // TODO Auto-generated method stub
            }
        }
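    The usual Swing idiom for swapping a title screen for a game board is to keep both components in one container managed by a CardLayout, rather than removing and re-adding children of the JFrame by hand. A minimal sketch of that approach follows (Java 8+; the class name and the placeholder panels here are illustrative, not taken from the code above):
        import java.awt.CardLayout;
        import java.awt.Dimension;
        import javax.swing.JButton;
        import javax.swing.JFrame;
        import javax.swing.JPanel;
        import javax.swing.SwingUtilities;

        public class GameWindow {
            public static void main(String[] args) {
                SwingUtilities.invokeLater(GameWindow::createAndShow);
            }

            private static void createAndShow() {
                JFrame frame = new JFrame("Card layout demo");
                CardLayout cards = new CardLayout();
                JPanel deck = new JPanel(cards);       // holds both "screens"

                JPanel titleScreen = new JPanel();     // placeholder for the title label and start button
                JButton start = new JButton("Start");
                titleScreen.add(start);

                JPanel board = new JPanel();           // placeholder for the real game board component
                board.setPreferredSize(new Dimension(1200, 420));

                deck.add(titleScreen, "title");
                deck.add(board, "board");

                // Switching screens is a single call; no remove()/validate() juggling needed.
                start.addActionListener(e -> cards.show(deck, "board"));

                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.add(deck);
                frame.pack();
                frame.setLocationRelativeTo(null);
                frame.setVisible(true);
            }
        }
    Wiring the button with an ActionListener rather than a raw MouseListener also means keyboard activation works for free.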

    Read the article

  • nvidia driver problems after upgrading to 3.2.0-26 on Ubuntu 12.04 64bit

    - by Lev Levitsky
    After installing latest updates I can't set screen resolution higher than 1024x768; every time after the boot I get a message Could not apply the stored configuration for the monitors (Note: removing ~/.config/monitors.xml stopped the message, but not the problem) I can boot with 3.2.0-25 and the graphics look normal. Here's what I have in /var/log/apt/term.log (excerpt): Setting up linux-image-3.2.0-26-generic (3.2.0-26.41) ... Running depmod. update-initramfs: deferring update (hook will be called later) Examining /etc/kernel/postinst.d. run-parts: executing /etc/kernel/postinst.d/dkms 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic Error! Problems with depmod detected. Automatically uninstalling this module. DKMS: Install Failed (depmod problems). Module rolled back to built state. run-parts: executing /etc/kernel/postinst.d/initramfs-tools 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic update-initramfs: Generating /boot/initrd.img-3.2.0-26-generic run-parts: executing /etc/kernel/postinst.d/pm-utils 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic run-parts: executing /etc/kernel/postinst.d/update-notifier 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic run-parts: executing /etc/kernel/postinst.d/zz-update-grub 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic Generating grub.cfg ... Found linux image: /boot/vmlinuz-3.2.0-26-generic Found initrd image: /boot/initrd.img-3.2.0-26-generic Found linux image: /boot/vmlinuz-3.2.0-25-generic Found initrd image: /boot/initrd.img-3.2.0-25-generic Found linux image: /boot/vmlinuz-3.2.0-24-generic Found initrd image: /boot/initrd.img-3.2.0-24-generic Found linux image: /boot/vmlinuz-3.2.0-23-generic Found initrd image: /boot/initrd.img-3.2.0-23-generic Found linux image: /boot/vmlinuz-3.0.0-17-generic Found initrd image: /boot/initrd.img-3.0.0-17-generic Found memtest86+ image: /boot/memtest86+.bin I went to "additional drivers" and saw some updates available there, but an attempt to install them failed, leaving the following in /var/log/jockey.log (long log, pasted here). 
The full log won't fit in the question, so I'm showing $ fgrep 'ERROR' /var/log/jockey.log 2012-06-30 17:29:57,897 WARNING: modinfo for module vmxnet failed: ERROR: modinfo: could not find module vmxnet 2012-06-30 17:29:57,937 WARNING: modinfo for module wl failed: ERROR: modinfo: could not find module wl 2012-06-30 17:29:58,072 WARNING: modinfo for module nvidia_96 failed: ERROR: modinfo: could not find module nvidia_96 2012-06-30 17:29:58,240 WARNING: modinfo for module nvidia_current failed: ERROR: modinfo: could not find module nvidia_current 2012-06-30 17:29:58,293 WARNING: modinfo for module nvidia_current_updates failed: ERROR: modinfo: could not find module nvidia_current_updates 2012-06-30 17:29:58,351 WARNING: modinfo for module nvidia_173_updates failed: ERROR: modinfo: could not find module nvidia_173_updates 2012-06-30 17:29:58,385 WARNING: modinfo for module nvidia_173 failed: ERROR: modinfo: could not find module nvidia_173 2012-06-30 17:29:58,420 WARNING: modinfo for module nvidia_96_updates failed: ERROR: modinfo: could not find module nvidia_96_updates 2012-06-30 17:29:58,455 WARNING: modinfo for module ath_pci failed: ERROR: modinfo: could not find module ath_pci 2012-06-30 17:29:58,478 WARNING: modinfo for module fglrx_updates failed: ERROR: modinfo: could not find module fglrx_updates 2012-06-30 17:29:58,531 WARNING: modinfo for module fglrx failed: ERROR: modinfo: could not find module fglrx 2012-06-30 17:29:58,588 WARNING: modinfo for module omapdrm_pvr failed: ERROR: modinfo: could not find module omapdrm_pvr 2012-06-30 17:29:59,537 WARNING: modinfo for module nvidia_current failed: ERROR: modinfo: could not find module nvidia_current 2012-06-30 17:29:59,613 WARNING: modinfo for module nvidia_173_updates failed: ERROR: modinfo: could not find module nvidia_173_updates 2012-06-30 17:29:59,686 WARNING: modinfo for module nvidia_173 failed: ERROR: modinfo: could not find module nvidia_173 2012-06-30 17:29:59,764 WARNING: modinfo for module nvidia_current_updates failed: ERROR: modinfo: could not find module nvidia_current_updates 2012-06-30 17:30:29,544 WARNING: modinfo for module nvidia_current_updates failed: ERROR: modinfo: could not find module nvidia_current_updates 2012-06-30 17:30:29,545 ERROR: XorgDriverHandler.enable(): package or module not installed, aborting I'm not sure if it's a bug, as the first log shows some errors. What can I try?

    Read the article

  • Cannot establish maximum resolution on ASUS PB278Q

    - by dentuzhik
    I've recently bought brand new ASUS PB278Q monitor. When trying to connect to my laptop, everything works great, except that I can't get the native resolution of my monitor (2560x1440) working. The automatic is 1920x1080. My graphic card is Nvidia GeForce 320m. Here's output from lspci for it: ~$ lspci | grep VGA 02:00.0 VGA compatible controller: NVIDIA Corporation GT216M [GeForce GT 320M] (rev a2) and also xrandr: ~$ xrandr Screen 0: minimum 8 x 8, current 3286 x 1437, maximum 8192 x 8192 VGA-0 disconnected (normal left inverted right x axis y axis) LVDS-0 connected primary 1366x768+0+669 (normal left inverted right x axis y axis) 344mm x 193mm 1366x768 60.0*+ HDMI-0 connected 1920x1080+1366+0 (normal left inverted right x axis y axis) 600mm x 340mm 1920x1080 60.0*+ 59.9 50.0 30.0 25.0 24.0 60.0 50.0 1680x1050 60.0 1440x900 59.9 1280x1024 75.0 60.0 1280x960 60.0 1280x800 59.8 1280x720 60.0 59.9 50.0 1152x864 75.0 1024x768 75.0 70.1 60.0 800x600 75.0 72.2 60.3 56.2 720x576 50.0 720x480 59.9 640x480 75.0 59.9 59.9 480x576 50.0 480x480 59.9 I have proprietary drivers installed on my machine, here's the info about the monitor from nvidia-settings (Actually I don't have enough reputation to post images, so here's the text): Chip Location: Internal Signal: TDMS Connection link: Single Native resolution: 2560x1440 Refresh rate: 60.00 Hz The monitor is connected to laptop via HDMI cable, and honestly I have no idea what version it is, and what version is my HDMI output of my graphics card. I tried to find how I can figure it out on the web, but had no luck. Also my video card has only VGA and HDMI outs so I can't test neither DVI-D cable nor DisplayPort. So apparently, there's some problem over there. At least I want to know exactly what's going on. I've tried to see if it a linux-specific problem, but windows also gave me the same resolution by default. What I've already tried: Connect through VGA (stupid one, of course it gave me 1920x1080). Checked two HDMI cables (not sure if they're the same or not, as mentioned above). Played around with xrandr and adding custom modes. Didn't help. Surfed for the info a lot on the web, but couldn't get appropriate results. 
Actually xrandr gives me the following: ~$ cvt 2560 1440 60 # 2560x1440 59.96 Hz (CVT 3.69M9) hsync: 89.52 kHz; pclk: 312.25 MHz Modeline "2560x1440_60.00" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync ~$ xrandr --newmode "2560x1440_60.00" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync ~$ xrandr Screen 0: minimum 8 x 8, current 3286 x 1437, maximum 8192 x 8192 VGA-0 disconnected (normal left inverted right x axis y axis) LVDS-0 connected 1366x768+0+669 (normal left inverted right x axis y axis) 344mm x 193mm 1366x768 60.0*+ HDMI-0 connected primary 1920x1080+1366+0 (normal left inverted right x axis y axis) 600mm x 340mm 1920x1080 60.0*+ 59.9 50.0 30.0 25.0 24.0 60.0 50.0 1680x1050 60.0 1440x900 59.9 1280x1024 75.0 60.0 1280x960 60.0 1280x800 59.8 1280x720 60.0 59.9 50.0 1152x864 75.0 1024x768 75.0 70.1 60.0 800x600 75.0 72.2 60.3 56.2 720x576 50.0 720x480 59.9 640x480 75.0 59.9 59.9 480x576 50.0 480x480 59.9 2560x1440_60.00 (0x34f) 312.2MHz h: width 2560 start 2752 end 3024 total 3488 skew 0 clock 89.5KHz v: height 1440 start 1443 end 1448 total 1493 clock 60.0Hz ~$ xrandr --addmode HDMI-0 2560x1440_60.00 X Error of failed request: BadMatch (invalid parameter attributes) Major opcode of failed request: 140 (RANDR) Minor opcode of failed request: 18 (RRAddOutputMode) Serial number of failed request: 29 Current serial number in output stream: 30 What I intend to do next: Try another HDMI cable? Try HDMI to DVI-D cable? Try HDMI to DisplayPort cable? Another type of adapters? VGA to DVI-D? Buy another laptop with another graphic card. Damn. My ideas pretty much end here. Any ideas? Any explanations why it isn't working are appreciated.

    Read the article

  • How do I capture a WinForm window to a bitmap without the caret

    - by Steve Dunn
    I've got a window on a WinForm that I want to get a bitmap representation of. For this, I use the following code (where codeEditor is the control I want a bitmap representation of):
        public Bitmap GetBitmap( )
        {
            IntPtr srcDC = NativeMethods.GetDC( codeEditor.Handle ) ;
            var bitmap = new Bitmap( codeEditor.Width, codeEditor.Height ) ;
            Graphics graphics = Graphics.FromImage( bitmap ) ;
            var deviceContext = graphics.GetHdc( ) ;
            bool blitted = NativeMethods.BitBlt( deviceContext, 0, 0, bitmap.Width, bitmap.Height, srcDC, 0, 0, 0x00CC0020 /*SRCCOPY*/ ) ;
            if ( !blitted )
            {
                throw new InvalidOperationException( @"The bitmap could not be generated." ) ;
            }
            int result = NativeMethods.ReleaseDC( codeEditor.Handle, srcDC ) ;
            if ( result == 0 )
            {
                throw new InvalidOperationException( @"Cannot release bitmap resources." ) ;
            }
            graphics.ReleaseHdc( deviceContext ) ;
            graphics.Dispose( ) ;
            return bitmap ;
        }
    The trouble is, this captures the caret if it happens to be flashing in the window at the time of capture. I tried calling the Win32 method HideCaret before capturing, but it didn't seem to have any effect.

    Read the article

  • Ubuntu 9.10 screen flickering problem on Thinkpad R31 with Intel 82830 graphics

    - by PA
    I am trying to revitalize an old Thinkpad R31 that has the Intel 82830 graphics and only 256 MB of RAM. I have tried a Xubuntu 9.10 Live CD. After booting the screen blinks so much that it is practically unreadable. I have searched for updated IBM Thinkpad drivers but I only found drivers for Windows on the Thinkpad support web site. EDIT: I have changed the title and the description. It is not a problem of the drivers. It seems to be a problem with the kernel. See my own answer below.

    Read the article

  • GeForce 9600GT support for dual monitor setup

    - by Theo
    I have a NVIDIA GeForce 9600 GT (point of view, link) on an Asus M2V-MX motherboard. The card has one DVI, HDMI and S-Video output. The graphics card does not seem to support dual monitors: the driver control panel for the graphics card does not detect a second monitor that is attached via S-Video (which leads me to believe that it doesn't support dual monitors with DVI and the other connections). Would it be possible to use a VGA/DVI splitter to attach two monitors to the single DVI output on the graphics card? Would this allow for a dual monitor setup, or only mirroring? How do I know if the graphic card supports this? With this particular motherboard, would it be possible to use the onboard video for another monitor?

    Read the article

  • Selection between two laptops for casual gaming [closed]

    - by Prabhpreet
    I have selected two laptops that meet my budget. Here are the differences:
    Laptop #1:
      - 4GB RAM
      - Intel Core i5-2450M 2nd Gen processor @ 2.5 GHz
      - NVIDIA GeForce GT 520MX 1GB DDR3 dedicated graphics
      - 750 GB SATA II hard disk
      - USB 2.0 ports
      - 6 hrs battery life
    Laptop #2:
      - 4GB RAM
      - Intel Core i5-3210M 3rd Gen processor @ 2.5 GHz
      - Integrated Intel HD Graphics 4000
      - 500 GB SATA hard disk
      - USB 3.0 ports
      - 3 hrs battery life (this concerns me)
    First and foremost, does dedicated graphics matter for a casual gamer like me? Secondly, does the generation of the processors make a difference despite the same clock speed? Thirdly, do USB 3.0 ports make a difference? And lastly, which laptop is more future-proof? Please help me out. Thanks!

    Read the article
