Search Results

Search found 6703 results on 269 pages for 'amd graphics'.

Page 57/269

  • Identifying the version of ATI Catalyst drivers

    - by Snark
    I bought a new video card based on the ATI Radeon HD 5670 chipset. I couldn't make it work with the latest ATI Catalyst drivers found on ATI's website; only the drivers on the CD delivered with the card worked. How can I tell which version of Catalyst is installed on my PC (running Windows 7 64-bit)? The ATI Catalyst Control Center reports the following:

        Driver Packaging Version          8.673-091110a-092263C
        Provider                          ATI Technologies Inc.
        2D Driver Version                 8.01.01.973
        2D Driver File Path               /REGISTRY/MACHINE/SYSTEM/ControlSet001/Control/CLASS/{4D36E968-E325-11CE-BFC1-08002BE10318}/0000
        Direct3D Version                  8.14.10.0708
        OpenGL Version                    6.14.10.9120
        Catalyst™ Control Center Version  2009.1110.2225.40230

    I don't see anything pointing to a "marketing version". The website says the current version of Catalyst is 10.2.
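
    As a quick cross-check, the raw driver version can also be read from the command line on Windows 7 (a generic WMI query, not something from the original thread) and then matched against the Catalyst release notes on AMD's site, which list the driver build shipped with each Catalyst release:

        wmic path Win32_VideoController get Name,DriverVersion,DriverDate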

    Read the article

  • Third Monitor (Mini Display Port, Display Port)

    - by muhsing
    I have an HD 6950 graphics card (2x Mini DisplayPort, 2x DVI, 1x HDMI) and three monitors, and I want to drive all of them at once. I've learned that I need an active Mini DisplayPort-to-VGA or Mini DisplayPort-to-DVI adapter. Here's my problem: I want to buy a Mini DisplayPort-to-DVI adapter, but none is available in my country. I can, however, find an active DisplayPort-to-DVI cable, and I already have a Mini DisplayPort-to-DisplayPort cable. If I buy the Sapphire ACTIVE DP TO SL-DVI CABLE, will my third monitor work? That is, I would plug in the Mini DisplayPort-to-DisplayPort cable first, and then plug the active DP-to-DVI cable into it. Will it work? I have this, and I will buy this. After this setup, will my third monitor work with Eyefinity? I would be very grateful for any help. Sorry for my English; I hope you will understand me. Take care.

    Read the article

  • My 3D Games crashing in these scenarios

    - by desaivv
    I have a situation here which I am unable to solve. I bought a PC last March; here are my specs:

    - Intel Core i3 550 @ 3 GHz
    - 4 GB RAM @ 400 MHz
    - XFX GeForce 9500 GT graphics card, 1 GB @ 550 MHz
    - 500 GB HDD

    Lately, as soon as I load my Skyrim save game, it crashes (I have been playing Skyrim since I joined the Gaming.SE site). Crashes, as in the entire scene gets red lines. I cannot Alt+Tab back, and Ctrl+Alt+Del doesn't work either; my only recourse is a hard reset via the power button. I can't take a screenshot either. I have the latest ForceWare 296.10 drivers. This has been happening for the last two weeks. I always use Driver Sweeper to clean out my old drivers, since that is what XFX recommends before installing new ones.

    I recently installed MSI Afterburner to watch my GPU temperature. My GPU is at stock; I never overclocked it. In Afterburner I cannot adjust the fan speed: the control is greyed out, and there is no fan tab in the settings. During normal Internet browsing the GPU stays at 51 °C. I ran Memtest86 overnight at level 11; it took about 13 hours but found no errors in my RAM. I even reinstalled my OS with just the 296 drivers. The GPU fan does come on. I can play Diablo 2, but I cannot get past Warcraft 3's menu screen. There WAS some dust in my machine, but I always try to keep everything clean, since dust is an issue in my home town, and I keep the whole PC cabinet cool.

    My friend came over with his working graphics card (we bought our PCs at the same time with the exact same specifications). His card did not work either: same problem, the scene freezing with red lines.

    I did do my research before posting here; that is how I learned about MSI Afterburner, Driver Sweeper, SpeedFan, etc. I also followed posts on Tom's Hardware from people with similar problems. One suggestion, which reportedly worked for the person who tried it, was to "bake the card in an oven". Since I bought the machine I have played Diablo 2 for months, the Starcraft 2 campaign for months, and recently Skyrim for months; I bought ME3 as well. I am at my wits' end and do not know what else to do. I could go out and buy a new card, but my friend's card did not work either. I can use the machine for Eclipse or VS2010 development just fine, just not for 3D gaming.

    I originally posted this question on Gaming.SE but was directed here. I have browsed the Super User database for my problem and found this, this, and this, but none of those cover my question. My machine is only one year old. Can some experienced superuser(s) shed some light on this scenario? Is it a 3D graphics card problem? Will a brand-new card work? What else can I try to pinpoint the problem? Could it be the motherboard? Thanks.

    Read the article

  • Should I bother upgrading my Opteron 270 Server?

    - by MousePad
    I have an Opteron server machine (in a large workstation-class case) running on a Tyan 2895 motherboard. It's a dual-CPU-socket board, but I only have one Opteron 270 in it. I have 4 GB of RAM, but less than 3 GB is addressable, even in 64-bit mode, due to the way the board is designed. Is it worth spending a few hundred on an additional CPU and maybe some more RAM? The other problem is that one of the two SATA ports on the board had its wire socket break off, so only one drive can be run as of now. I could have it repaired, but at what cost? Add in the fact that the power supply is gunked up with dust and it's a bit of a nightmare; I actually worry about it getting too hot. It seems that for the money I could buy a new rack server from Dell, but it also seems a shame to waste an otherwise working machine that is, for my needs, still very fast.

    Read the article

  • Does the Acer Ferrari One support hardware assisted virtualization?

    - by cmeerw
    Does anyone know if the Acer Ferrari One supports hardware-assisted virtualization? (It should be easy to find out by running Microsoft's Hardware-Assisted Virtualization Detection Tool on the machine.) There is lots of speculation around the web, but I haven't found a definitive answer yet, and I would like to confirm that this feature works before buying one.
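
    If you can boot the machine from a Linux live CD/USB (for example, in the shop), a quick check, generic rather than specific to this laptop, is to look for AMD's hardware-virtualization CPU flag:

        grep -c svm /proc/cpuinfo

    A non-zero count means the CPU advertises AMD-V; the BIOS must still have the feature enabled for Microsoft's detection tool to report it as usable.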

    Read the article

  • How do I get my XFX Radeon HD 7870 to work with 3 monitors using HDMI, DVI, and Mini DisplayPort?

    - by user88792
    I bought an XFX Double D FX-787A-CDFC Radeon HD 7870 and set it up using the HDMI, DVI, and Mini DisplayPort outputs (the last via an Apple mDP-to-VGA adapter). I hooked everything up, installed the drivers, and rebooted. The image came up on the third monitor at a strange resolution. Why did this happen, and how can I fix it? I am using Windows 7 Ultimate and the system is fully updated. Side note: when I disconnect the Mini DisplayPort adapter everything works fine; however, for my work I need 3-4 monitors.

    Read the article

  • How to get Nvidia graphics working on Sony Z laptop?

    - by projectshave
    I have an older Sony VAIO Z 590 laptop with switchable graphics between Intel and an Nvidia GeForce 9300M; it is NOT Optimus. I did a clean install of Ubuntu 12.04. Everything works, but it's using Unity 2D with the Intel drivers. I've tried loading the Nvidia drivers from "Additional Drivers", but it says "this driver is activated but not currently in use". When I run nvidia-settings, an error window pops up saying "You do not appear to be using the NVIDIA X drivers." lspci shows both graphics cards. Let me know if I should add more info. How do I get the Nvidia graphics and Unity 3D working? More info:

        $ lshw -short -class display
        H/W path       Device  Class    Description
        ==============================================
        /0/100/1/0             display  G98 [GeForce 9300M GS]
        /0/100/2               display  Mobile 4 Series Chipset Integrated Graphics C

        $ glxinfo
        name of display: :0
        Xlib: extension "GLX" missing on display ":0".
        ("GLX missing" message repeats a dozen times)
        Error: couldn't find RGB GLX visual or fbconfig

    Excerpts from Xorg.0.log:

        [ 16.373] (II) LoadModule: "glx"
        [ 16.373] (II) Loading /usr/lib/x86_64-linux-gnu/xorg/extra-modules/libglx.so
        [ 16.386] (II) Module glx: vendor="NVIDIA Corporation"
        [ 16.386]     compiled for 4.0.2, module version = 1.0.0
        [ 16.386]     Module class: X.Org Server Extension
        [ 16.386] (II) NVIDIA GLX Module 295.49 Tue May 1 00:09:10 PDT 2012
        [ 16.608] (II) NVIDIA dlloader X Driver 295.49 Mon Apr 30 23:48:24 PDT 2012
        [ 16.608] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
        [ 17.693] (EE) Failed to initialize GLX extension (Compatible NVIDIA X driver not found)
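
    Since the log shows the Nvidia GLX module loading but no Nvidia X driver actually bound to the card, one thing worth trying is an explicit Device section in /etc/X11/xorg.conf. This is a sketch, not from the original question; the BusID assumes the GeForce 9300M sits at PCI 01:00.0, which the /0/100/1/0 lshw path suggests (verify with lspci):

        Section "Device"
            Identifier  "nvidia-gpu"
            Driver      "nvidia"
            BusID       "PCI:1:0:0"
        EndSection

    On a non-Optimus switchable-graphics machine, the hardware graphics switch must also be set to the discrete GPU before X starts; if the Intel chip owns the panel, the nvidia driver alone won't help.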

    Read the article

  • Nvidia GTS 360M in Asus laptop shuts off when gaming on external display

    - by mr odus
    I have this problem on my Asus G51j (Intel i5, Nvidia GTS 360M graphics card); I just updated to the latest drivers from Nvidia's site. With most games I play, if I hook the laptop up to an external monitor (this happens whether it's HDMI or VGA), the laptop hard-shuts-down after about 20 minutes, give or take. It tends to happen in the more graphically intense games, like the Call of Duty titles and BioShock. I'm running Windows 7 with the latest Nvidia drivers. Games work fine on the laptop's own screen, and movies and general computing work fine on the external displays. The laptop always sits on a cooling pad, and the most recent time it was in front of my AC unit; I ran "heatfan" or whatever the heat-tracking software was called, and my temperatures were normal right through a shutdown. This has been happening for the life of the laptop. I don't play very many games, and even fewer on an external monitor, so the issue doesn't come up much. Is it possible I have a faulty graphics card? Is there anything else I can try?

    Read the article

  • Computer restarts without warning; code bcc116

    - by Robert C.
    My specs:

    - Processor: Intel i5 4430, quad-core @ 3 GHz
    - Motherboard: MSI H87-G41
    - Graphics card: Nvidia GTX 760
    - Power supply: EPS-750 CM
    - RAM: 8 GB

    I bought a newly assembled gaming PC, which worked fine for a few days. Then it started rebooting without warning. After it restarts, Windows 7 gives me a BCCode 116 error. Apparently it has something to do with my video card, either overheating or wrong drivers. I've installed the latest Nvidia driver for my graphics card. Since the machine is brand new it can't be dust, and I'm running it with the lid open to see if the problem persists. I'm also running Prime95 now to see if it tells me anything else. Core Temp shows my CPU reaching up to 95 °C during Prime95's Blend stress test. And it just peaked at 100 °C. Of course it doesn't reach these temperatures at all while idle or gaming. I'm going to let Prime95 run overnight and see what happens. Until then, does anyone know what I should do next?

    Read the article

  • Calling Graphics.Draw and new Bitmap from memory concurrently in a thread takes a long time

    - by Abdul jalil
    Example 1:

        public partial class Form1 : Form
        {
            public Form1()
            {
                InitializeComponent();
                pro = new Thread(new ThreadStart(Producer));
                con = new Thread(new ThreadStart(Consumer));
            }

            private AutoResetEvent m_DataAvailableEvent = new AutoResetEvent(false);
            Queue<Bitmap> queue = new Queue<Bitmap>();
            Thread pro;
            Thread con;

            public void Producer()
            {
                MemoryStream[] ms = new MemoryStream[3];
                for (int y = 0; y < 3; y++)
                {
                    StreamReader reader = new StreamReader("image" + (y + 1) + ".JPG");
                    BinaryReader breader = new BinaryReader(reader.BaseStream);
                    byte[] buffer = new byte[reader.BaseStream.Length];
                    breader.Read(buffer, 0, buffer.Length);
                    ms[y] = new MemoryStream(buffer);
                }
                while (true)
                {
                    for (int x = 0; x < 3; x++)
                    {
                        Bitmap bmp = new Bitmap(ms[x]);
                        queue.Enqueue(bmp);
                        m_DataAvailableEvent.Set();
                        Thread.Sleep(6);
                    }
                }
            }

            public void Consumer()
            {
                Graphics g = pictureBox1.CreateGraphics();
                while (true)
                {
                    m_DataAvailableEvent.WaitOne();
                    Bitmap bmp = queue.Dequeue();
                    if (bmp != null)
                    {
                        // Bitmap bmp = new Bitmap(ms);
                        g.DrawImage(bmp, new Point(0, 0));
                        bmp.Dispose();
                    }
                }
            }

            private void pictureBox1_Click(object sender, EventArgs e)
            {
                con.Start();
                pro.Start();
            }
        }

    When creating the bitmap and drawing to the picture box run on separate threads, Bitmap bmp = new Bitmap(ms[x]) takes 45.591 ms and g.DrawImage(bmp, new Point(0, 0)) takes 41.430 ms. When I make the bitmap from the MemoryStream and draw it to the picture box in one thread (Example 2 below), Bitmap bmp = new Bitmap(ms) takes 29.619 ms and g.DrawImage(bmp, new Point(0, 0)) takes 35.540 ms. Why does creating the bitmap and drawing take more time in separate threads, and how can I reduce the time when processing in a separate thread? I am using the ANTS performance profiler 4.3.

    Example 2:

        public Form1()
        {
            InitializeComponent();
            pro = new Thread(new ThreadStart(Producer));
            con = new Thread(new ThreadStart(Consumer));
        }

        private AutoResetEvent m_DataAvailableEvent = new AutoResetEvent(false);
        Queue<MemoryStream> queue = new Queue<MemoryStream>();
        Thread pro;
        Thread con;

        public void Producer()
        {
            MemoryStream[] ms = new MemoryStream[3];
            for (int y = 0; y < 3; y++)
            {
                StreamReader reader = new StreamReader("image" + (y + 1) + ".JPG");
                BinaryReader breader = new BinaryReader(reader.BaseStream);
                byte[] buffer = new byte[reader.BaseStream.Length];
                breader.Read(buffer, 0, buffer.Length);
                ms[y] = new MemoryStream(buffer);
            }
            while (true)
            {
                for (int x = 0; x < 3; x++)
                {
                    // Bitmap bmp = new Bitmap(ms[x]);
                    queue.Enqueue(ms[x]);
                    m_DataAvailableEvent.Set();
                    Thread.Sleep(6);
                }
            }
        }

        public void Consumer()
        {
            Graphics g = pictureBox1.CreateGraphics();
            while (true)
            {
                m_DataAvailableEvent.WaitOne();
                // Bitmap bmp = queue.Dequeue();
                MemoryStream ms = queue.Dequeue();
                if (ms != null)
                {
                    Bitmap bmp = new Bitmap(ms);
                    g.DrawImage(bmp, new Point(0, 0));
                    bmp.Dispose();
                }
            }
        }

        private void pictureBox1_Click(object sender, EventArgs e)
        {
            con.Start();
            pro.Start();
        }
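
    One caveat about both examples, not raised in the original question: Queue<T> is not thread-safe, so Producer's Enqueue and Consumer's Dequeue can race and corrupt the queue, which also makes per-call timings unreliable. A minimal sketch of a guarded queue, keeping the names used in the code above:

        // shared field, declared next to the queue
        private readonly object queueLock = new object();

        // in Producer
        lock (queueLock) { queue.Enqueue(bmp); }
        m_DataAvailableEvent.Set();

        // in Consumer
        Bitmap bmp = null;
        lock (queueLock) { if (queue.Count > 0) bmp = queue.Dequeue(); }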

    Read the article

  • Reducing video mode switching during Linux boot

    - by Zack
    When I boot up my desktop computer, which only has Linux on it, the video mode and/or console font gets switched four times:

    1. When GRUB starts, it switches from 80x25 text to a graphical mode so it can draw a pretty background behind its menu;
    2. GRUB then goes back to 80x25 text after I pick something from the menu;
    3. When the KMS driver for my video card loads, it switches to a much higher-resolution text mode (I don't know if this is a hardware text mode or not);
    4. Finally X starts, and it goes graphical and stays that way. I think this last switch does not change the resolution of the video mode, only the graphicalness.

    I'd like to get rid of as many of these mode switches as possible. Ideally, when GRUB takes over from the BIOS it would go directly to the same high-resolution text mode that the KMS driver selects, and the display would stay in that mode till X starts and brings up graphics. I am under the impression that this is possible by mucking with the kernel command line and/or the GRUB console module load parameters, but I don't know the details. GRUB 1.98+20100706, kernel 2.6.32.15 using Nouveau video drivers. Distro is Debian unstable. Please no answers that involve recompiling anything or cobbling together bleeding-edge kernel/driver combinations; I don't care enough about this to go to that much trouble.

    EDIT: Tobu suggests setting GRUB_GFXMODE to the full pixel resolution of the monitor, and GRUB_GFXPAYLOAD_LINUX=keep to avoid the mode switch after the menu goes away. This does part of what I want, but winds up being worse overall. There's no mode switch after the menu, but there's still a painfully slow screen repaint (I should probably just give up on GRUB's gfxmode, it's waaaay too slow at 1920x1200). More seriously, there's now a double mode switch when nouveaufb loads, along with fun-looking error messages in dmesg:

        [    5.923798] [drm] nouveau 0000:02:00.0: allocated 1920x1200 fb: 0x40250000, bo ffff8801ba5f4600
        [    5.923802] fb: conflicting fb hw usage nouveaufb vs EFI VGA - removing generic driver
        [    5.923821] [drm] nouveau 0000:02:00.0: PFIFO_INTR 0x00000010 - Ch 1
        ("PFIFO_INTR" message repeats 400+ times)
        [    5.925609] Console: switching to colour dummy device 80x25
        [    5.925802] Console: switching to colour frame buffer device 240x75
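
    For reference, Tobu's suggestion amounts to the following two lines in /etc/default/grub, followed by running update-grub (the resolution value is whatever matches the monitor):

        GRUB_GFXMODE=1920x1200
        GRUB_GFXPAYLOAD_LINUX=keep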

    Read the article

  • How can I fix the "Display driver has stopped responding and has recovered" error?

    - by Vitor Rangel
    I'm using a GeForce GTX 580 with Windows 7 64-bit. The driver version is 301.42. The problem happens a few minutes into specific games. It doesn't happen in all games, and I have no idea why these particular games fail. Games that don't work: Battlefield 3, Civilization V, Sniper Elite V2. Games that work: Mass Effect 3, Crysis 2, Team Fortress 2, Left 4 Dead 2, Skyrim, L.A. Noire. As you can see, it's not a case of "the most demanding games stop working". I've tried updating the graphics card driver and the motherboard BIOS, and I even formatted my computer (it needed it) and installed the latest version of every driver. This problem has been happening since I bought the graphics card six months ago. After 10 to 20 minutes, the pixels on the monitor become corrupted, with random colors and artifacts, as if the card were broken. Then everything goes black, and the message appears: "Display driver has stopped responding and has recovered". After that I need to close the game and start it again. I am not overclocking, and my temperature never goes above 70 °C.
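
    For what it's worth, that message is Windows' Timeout Detection and Recovery (TDR) resetting a GPU that has stopped responding for about two seconds. A commonly suggested diagnostic, not from the original thread and a workaround rather than a fix, is to lengthen the timeout and see whether the corruption still appears; as a .reg file:

        Windows Registry Editor Version 5.00

        [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
        "TdrDelay"=dword:00000008

    If the artifacts show up even with the longer timeout, the card itself, its power delivery, or its VRAM becomes the more likely suspect.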

    Read the article

  • Viewing Postscript (or PDF) on OS X: Aliasing issues

    - by mankoff
    I am generating PostScript graphics and am trying to find a balance between no antialiasing and over-aliasing. If I use the raw Ghostscript viewer gs on the PostScript, it looks good: the text appears antialiased, but the image remains nice and blocky. Unfortunately, gs has no real user interface and loses all of the nice things that Preview.app has. I could install gv, but the dependency bloat is huge: it requires all of GNOME. And even that isn't a great viewer compared to Preview.app or Skim.app. Here is an image viewed with gs:

    From a user-interaction and Mac-ish perspective, Preview.app (or Skim.app) is a much nicer program to use. They have the option to turn antialiasing on or off, but neither option looks very good. With antialiasing on, the image is blurry. With it off, the graphic matches what gs shows, but there are two issues. Minor issue: the font is ugly, uglier than with gs. Major issue: every PDF is un-antialiased, making it hard to read regular PDFs full of text. So, in summary:

    - Is there a way to manually generate the PDF from the PS that overcomes these issues?
    - Is there a way to find a middle ground of aliased/unaliased with Preview.app?
    - Is there another app that displays with quality like gs, but has a decent UI like Skim.app or Preview.app?
    - Is there a way to have Preview.app turn off antialiasing for only one file (containing graphics) but leave it enabled in general, so that text PDFs are still readable?
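
    For the first question, one approach, sketched with standard Ghostscript switches rather than taken from the thread, is to let gs rasterize the pages itself with text antialiased (TextAlphaBits=4) but graphics left blocky (GraphicsAlphaBits=1), then open the resulting PNGs in Preview.app:

        gs -dSAFER -dBATCH -dNOPAUSE -sDEVICE=png16m -r150 \
           -dTextAlphaBits=4 -dGraphicsAlphaBits=1 \
           -sOutputFile=page-%03d.png input.ps

    The trade-off is that the output is raster rather than vector, so this suits final viewing, not further editing.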

    Read the article

  • Windows 7 x64 support for Intel GMA 3650 (or GMA 3600)

    - by Loom
    I recently purchased an Intel D2700MUD motherboard, and I cannot find Windows 7 x64 drivers for the integrated graphics (Intel GMA 3650, aka PowerVR SGX545). The accompanying CD contains the Win7 x86 version only; when I run it, I get an error: "This computer does not meet the minimum requirements for installing the software." I tried the online Intel Driver Update Utility for graphics, using Chrome, Firefox, and Internet Explorer, without success: first a UAC prompt appears, and then a progress bar spins endlessly with the text "Analyzing computer...". The UAC prompt says: Program file name: System Requirements Lab; Verified publisher: Husdawg, LLC. I downloaded this utility (intel_srldetect_4.5.5.0) and started it from my hard disk, and got the error: "A network error occurred while attempting to read from the file: C:\Users\Loom\Downloads\SystemRequirementsLab_intel_4.5.5.0.msi". The standard VGA driver works for this video card, but without hardware acceleration: "Hardware acceleration is either disabled or not supported by your video card driver, which could slow game performance. Make sure you have the latest video card driver installed and that hardware acceleration is turned on." Where can I get an appropriate driver?

    Read the article

  • Should I switch my graphics mode in the BIOS to avoid using Bumblebee?

    - by Fawkes5
    I have just purchased an Acer 3830TG from the TimelineX series. To my surprise, I found out that there is no first-party Linux support for Nvidia Optimus. Bumblebee works great, but the battery life, with the graphics card always running, is not so great. I don't use Linux for games, so I don't really need the graphics card on; I have Windows for that. In my BIOS, I have the ability to change the graphics mode from switchable to integrated. If I do this and reinstall Ubuntu, what will happen? Will my Nvidia card just turn off? Will everything work properly, as if I weren't running an Optimus laptop? Is this recommended, as opposed to dealing with Bumblebee? What is the best thing I could do?

    Read the article

  • Why am I stuck at 640x480 on an Optimus hybrid-graphics system?

    - by exilada
    I have an Intel HD 3000 onboard graphics card and an Nvidia 520MX card with Optimus technology. I tried to install the Nvidia driver, but it failed. Now I can't use anything: I have a single 640x480 resolution, every other output is reported as disconnected, and I can't connect anything.

        $ xrandr
        Screen 0: minimum 320 x 200, current 640 x 480, maximum 8192 x 8192
        LVDS1 connected 640x480+0+0 (normal left inverted right x axis y axis) 344mm x 194mm
           640x480       59.9*
        VGA1 disconnected (normal left inverted right x axis y axis)
        HDMI1 disconnected (normal left inverted right x axis y axis)
        DP1 disconnected (normal left inverted right x axis y axis)

        $ lspci | grep VGA
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)

        $ glxinfo | grep vendor
        server glx vendor string: SGI
        client glx vendor string: Mesa Project and SGI
        OpenGL vendor string: Tungsten Graphics, Inc

    Something is wrong here, I guess. I tried some solutions, but they didn't work; nvidia-xconfig can't even generate an xorg.conf file after these attempts. By the way, the system sometimes reports errors about Xorg. Sorry for my English, and thanks for your help.
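
    A common way back to a working Intel-only setup on 12.04-era Ubuntu (standard package commands; a sketch, not advice from the original thread) is to purge the Nvidia packages and remove any leftover xorg.conf so X falls back to the intel driver, then reboot:

        sudo apt-get purge 'nvidia*'
        sudo rm -f /etc/X11/xorg.conf
        sudo reboot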

    Read the article

  • How To Draw More Precise Lines using Core Graphics and CALayer

    - by user308444
    Hello, I am having a hard time making this UI element look the way I want (see screenshot). Notice the image on the right: the line width and darkness look inconsistent compared to the image on the left (which happens to be a screen grab from Safari), where the border width is more consistent. How does Apple make their lines so perfect? I'm using a CALayer and the Core Graphics API to draw the image on the right. Is it possible to draw such perfect lines with the standard APIs?
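
    A likely culprit, though not confirmed by this thread, is stroke placement rather than the API itself: Core Graphics centres a stroke on the path, so a 1-point line placed on an integer coordinate straddles two pixel rows and renders as a soft 2-pixel line, while pixel-centre-aligned strokes come out crisp. A minimal sketch of the fix (written in Swift for brevity; the HairlineView class and colour are placeholder choices):

        import UIKit

        class HairlineView: UIView {
            override func draw(_ rect: CGRect) {
                guard let ctx = UIGraphicsGetCurrentContext() else { return }
                ctx.setStrokeColor(UIColor.darkGray.cgColor)
                ctx.setLineWidth(1.0)
                // offset by half a point so the 1pt stroke fills exactly one pixel row
                ctx.move(to: CGPoint(x: 0, y: 0.5))
                ctx.addLine(to: CGPoint(x: rect.width, y: 0.5))
                ctx.strokePath()
            }
        }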

    Read the article

  • Qt Graphics Scene mouse event propagation

    - by Olorin
    Hello, I'm learning Qt, and I'm doing the following to add some widgets to a graphics scene:

        void MainWindow::addWidgets(QList<QWidget *> &list, int code)
        {
            if (code == CODE_INFO) {
                QWidget *layoutWidget = new QWidget();
                QVBoxLayout *layout = new QVBoxLayout();
                foreach (QWidget *w, list) {
                    layout->addWidget(w);
                    this->connect(((ProductInfo*)w), SIGNAL(productClicked()),
                                  this, SLOT(getProductDetails()));
                }
                layoutWidget->setLayout(layout);
                this->scene->addWidget(layoutWidget);
            }
        }

    My ProductInfo class processes the mouse release event and emits a signal:

        void ProductInfo::mouseReleaseEvent(QMouseEvent *e)
        {
            QWidget::mouseReleaseEvent(e);
            emit productClicked();
        }

    The problem is that after adding the widgets to the scene, they no longer get the mouse release event and don't emit the productClicked signal; but if I add them to the main window (not to the scene), they work as expected. What am I doing wrong?

    Read the article

  • Flex: Why is line obscured by canvas' background

    - by mauvo
    I want MyCanvas to draw lines on itself, but they seem to be drawn behind the background. What's going on, and how should I do this?

    Main.mxml:

        <?xml version="1.0" encoding="utf-8"?>
        <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" xmlns:my="*">
            <my:MyCanvas width="300" height="300" id="myCanvas"></my:MyCanvas>
            <mx:Button label="Draw" click="myCanvas.Draw();"></mx:Button>
        </mx:Application>

    MyCanvas.as:

        package
        {
            import mx.containers.Canvas;

            public class MyCanvas extends Canvas
            {
                public function MyCanvas()
                {
                    this.setStyle("backgroundColor", "white");
                }

                public function Draw():void
                {
                    graphics.lineStyle(1);
                    graphics.moveTo(-10, -10);
                    graphics.lineTo(width + 10, height + 10);
                }
            }
        }

    Thanks.
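
    A plausible explanation, based on how Flex containers paint rather than on a confirmed answer from the thread: Canvas redraws its backgroundColor fill into the same graphics object during updateDisplayList(), which runs on the next render pass after Draw() and buries the lines. A sketch of the usual workaround is to draw in an updateDisplayList() override, after the background has been painted:

        // hypothetical rework of the Draw() logic inside MyCanvas
        override protected function updateDisplayList(unscaledWidth:Number,
                                                      unscaledHeight:Number):void
        {
            super.updateDisplayList(unscaledWidth, unscaledHeight);
            graphics.lineStyle(1);
            graphics.moveTo(0, 0);
            graphics.lineTo(unscaledWidth, unscaledHeight);
        }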

    Read the article

  • Actionscript: Why is drawRoundRectComplex() not documented?

    - by Chunk1978
    In studying ActionScript 3's Graphics class, I've come across the undocumented drawRoundRectComplex() method. It's a variant of drawRoundRect(), but with eight parameters, the final four being the diameter of each corner (x, y, width, height, top-left, top-right, bottom-left, bottom-right):

        // example
        var sp:Sprite = new Sprite();
        sp.graphics.lineStyle(1, 0x000000);
        sp.graphics.drawRoundRectComplex(0, 0, 100, 50, 10, 20, 0, 10);
        addChild(sp);

    This seems to be a pretty useful method, so I'm just curious whether anyone knows of any reason why Adobe chose not to document it.

    Read the article

  • Sex appeal of computer graphics: movie-like UI systems

    - by anon
    It's well known that (1) the way computers actually work and (2) the way computers are portrayed in movies are not the same. In particular, (2) looks much, much cooler than (1). Where can I learn more about making flashy, superficially useful but deep-down useless fancy graphical UIs like that? It's almost in the realm of "Hollywood special effects" like fire and smoke, except that I don't want natural phenomena; I want user interfaces. Concrete question: where can I learn about creating flashy, cool-looking (though not necessarily useful) user interfaces? [Preferably in OpenGL]

    Read the article

  • Loop colours from variables for graphics.py [Python 3.2]

    - by user1056548
    I am creating a graphics program that draws 100 x 100 squares next to each other, depending on the user-specified grid size. The user also inputs four colours for the squares (e.g. if they enter red,green,blue,yellow, the squares will be coloured in that order, with the colours repeating). Is it possible to loop over the colours from the variables the user has given? Here is what I have so far:

        def main():
            print("Please enter four comma separated colours e.g.: 'red,green,blue,yellow'\n"
                  "Allowed colours are: red, green, blue, yellow and cyan")
            col1, col2, col3, col4 = input("Enter your four colours: ").split(',')
            win = GraphWin("Squares", 500, 500)
            colours = [col1, col2, col3, col4]
            drawSquare(win, col1, col2, col3, col4, colours)
            win.getMouse()
            win.close()

        def drawSquare(win, col1, col2, col3, col4, colours):
            for i in range(4):
                for j in range(len(colours)):
                    colour = colours[j]
                    x = 50 + (i * 50)
                    circle = Circle(Point(x, 50), 20)
                    circle.setFill(colour)
                    circle.draw(win)

    I think I should be using a list in some way, but can't work out exactly how to do it. Can anybody help?
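
    The usual trick is to index the palette modulo its length, so any number of shapes cycles through the four colours. A sketch against the same graphics.py API the question uses (drawSquares, the 100-pixel size, and the count n are illustrative names, not from the original post):

        from graphics import GraphWin, Point, Rectangle

        def drawSquares(win, colours, n):
            # colours[i % len(colours)] repeats the palette across n squares
            for i in range(n):
                x = i * 100
                square = Rectangle(Point(x, 0), Point(x + 100, 100))
                square.setFill(colours[i % len(colours)])
                square.draw(win)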

    Read the article

  • Which C++ graphics library should I use?

    - by mspoerr
    Hello, I found the following graph libraries, but I am not sure which one I should use. Maybe there are some more...

    - Graphviz (http://www.graphviz.org/)
    - Boost Graph Library (http://www.boost.org/doc/libs/1_42_0/libs/graph/doc/index.html)
    - Lemon (http://lemon.cs.elte.hu/trac/lemon)
    - igraph (http://igraph.sourceforge.net/introduction.html)

    What it should do:

    - draw an undirected network map
    - come as a header-only or static lib for Windows
    - produce output in a user-editable format

    Graphviz is the only one I have tried so far, but I found no static lib for it, I failed to build it myself, and the documentation could be better. So I looked around and found these other three libs. I would be glad to get some recommendations on which lib to choose. Thanks, /mspoerr

    Read the article
