Search Results

Search found 5806 results on 233 pages for 'graphics'.

Page 86 of 233

  • Why don't more games use vector art?

    - by Parris
    It would seem to me that vector art is more efficient in terms of resources and scalability; however, in most cases I have seen artists using bitmap/rasterized art. Is this a limitation put on the artists by the game programmers/designers? As a programmer I think vector art would be ideal, since it allows scaling up the resolution without having to recreate the art, ship really large images, or end up with blurry graphics. The questions: why aren't more people using SVG/AI to create 2D game art? Would it actually be preferred (and who prefers it)? Are bitmap graphics a standard or a limitation (or maybe neither)? Background: I am working on an engine, and I had some kinda cool ideas for vector-based graphics; however, I don't want to piss off artists in the future. I guess this is more a question centered on pragmatism and developing games.
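
    One common middle ground is to author assets as vectors and pre-rasterize them per target resolution at build time. A minimal sketch of that workflow (assuming the third-party cairosvg package and a hypothetical player.svg asset, neither of which comes from the question):

    import cairosvg  # third-party: pip install cairosvg

    # Pre-rasterize one vector master at several display densities,
    # so artists keep a single scalable source file.
    SOURCE = "player.svg"          # hypothetical asset path
    for scale in (1.0, 2.0, 4.0):  # 1x, 2x, 4x target resolutions
        cairosvg.svg2png(url=SOURCE,
                         write_to=f"player_{scale:g}x.png",
                         scale=scale)

    The engine then ships crisp bitmaps per resolution while the art pipeline stays vector-based.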

    Read the article

  • Ubuntu 12.10 gets stuck in a login loop

    - by Calvin Wahlers
    My problem: as you can guess, my Ubuntu 12.10 gets stuck in a login loop when I try to enter my desktop. That means the screen goes black and soon after the login screen comes back. I'm an Ubuntu newbie, so if there's an answer please explain it in simply understandable language :) I've already read that the problem might be caused by a graphics error, so I'm posting my graphics hardware too: ATI Radeon 7670M. Hope you can help me, thank you ;)

    Read the article

  • Game component causes game to freeze

    - by ChocoMan
    I'm trying to add my camera component in the Game1 class' constructor like so:

    Camera camera; // from class Camera : GameComponent
    ...
    public Game1()
    {
        graphics = new GraphicsDeviceManager(this);
        this.graphics.PreferredBackBufferWidth = screenWidth;
        this.graphics.PreferredBackBufferHeight = screenHieght;
        this.graphics.IsFullScreen = true;
        Content.RootDirectory = "Content";
        camera = new Camera(this);
        Components.Add(camera);
    }

    Just from adding the last two lines, when I run the game the screen freezes and then gives me this message:

    An unhandled exception of type 'System.ComponentModel.Win32Exception' occurred in System.Drawing.dll
    Additional information: The operation completed successfully

    Read the article

  • Can't install "cedar trail drm driver in DKMS format" on Ubuntu 12.04

    - by Mychal Phillip Segala Sajulga
    Ubuntu 12.04 32-bit ... Toshiba NB520. (Side note: this computer is so slow even with 2 GB of RAM; still far better than my eMachines and Neo laptops.) I think the driver is the answer. From /var/log/jockey.log:

    2013-09-19 05:29:36,773 DEBUG: Comparing 3.8.0-29 with
    2013-09-19 05:32:45,094 DEBUG: updating <jockey.detection.LocalKernelModulesDriverDB instance at 0x8427a0c>
    2013-09-19 05:32:50,861 DEBUG: reading modalias file /lib/modules/3.8.0-29-generic/modules.alias
    2013-09-19 05:32:56,240 DEBUG: reading modalias file /usr/share/jockey/modaliases/b43
    2013-09-19 05:32:56,265 DEBUG: reading modalias file /usr/share/jockey/modaliases/disable-upstream-nvidia
    2013-09-19 05:32:56,474 DEBUG: loading custom handler /usr/share/jockey/handlers/dvb_usb_firmware.py
    2013-09-19 05:32:56,791 DEBUG: Instantiated Handler subclass __builtin__.DvbUsbFirmwareHandler from name DvbUsbFirmwareHandler
    2013-09-19 05:32:56,792 DEBUG: Firmware for DVB cards not available
    2013-09-19 05:32:56,793 DEBUG: loading custom handler /usr/share/jockey/handlers/cdv.py
    2013-09-19 05:32:56,927 WARNING: modinfo for module cedarview_gfx failed: ERROR: modinfo: could not find module cedarview_gfx
    2013-09-19 05:32:58,213 DEBUG: linux-lts-raring installed: True linux-lts-saucy installed: False linux minor version: 8 xserver ABI: 13 xserver-lts-quantal: False
    2013-09-19 05:32:58,214 DEBUG: Instantiated Handler subclass __builtin__.CdvDriver from name CdvDriver
    2013-09-19 05:32:58,214 DEBUG: cdv.available: falling back to default
    2013-09-19 05:32:58,685 DEBUG: XorgDriverHandler(cedarview_gfx, cedarview-graphics-drivers, None): Disabling as package video ABI(s) xorg-video-abi-11 not compatible with X.org video ABI xorg-video-abi-13
    2013-09-19 05:32:58,686 DEBUG: Intel Cedarview graphics driver not available
    2013-09-19 05:32:58,687 DEBUG: loading custom handler /usr/share/jockey/handlers/vmware-client.py
    2013-09-19 05:32:58,716 WARNING: modinfo for module vmxnet failed: ERROR: modinfo: could not find module vmxnet
    2013-09-19 05:32:58,717 DEBUG: Instantiated Handler subclass __builtin__.VmwareClientHandler from name VmwareClientHandler
    2013-09-19 05:32:58,758 DEBUG: VMWare Client Tools availability undetermined, adding to pool
    2013-09-19 05:32:58,758 DEBUG: loading custom handler /usr/share/jockey/handlers/nvidia.py
    2013-09-19 05:32:58,826 WARNING: modinfo for module nvidia_304 failed: ERROR: modinfo: could not find module nvidia_304
    2013-09-19 05:32:58,836 DEBUG: Instantiated Handler subclass __builtin__.NvidiaDriver304 from name NvidiaDriver304
    2013-09-19 05:32:58,837 DEBUG: nvidia.available: falling back to default
    2013-09-19 05:33:11,682 DEBUG: NVIDIA accelerated graphics driver availability undetermined, adding to pool
    2013-09-19 05:33:11,688 WARNING: modinfo for module nvidia_304_updates failed: ERROR: modinfo: could not find module nvidia_304_updates
    2013-09-19 05:33:11,696 DEBUG: Instantiated Handler subclass __builtin__.NvidiaDriver304Updates from name NvidiaDriver304Updates
    2013-09-19 05:33:11,696 DEBUG: nvidia.available: falling back to default
    2013-09-19 05:33:24,326 DEBUG: NVIDIA accelerated graphics driver (post-release updates) availability undetermined, adding to pool
    2013-09-19 05:33:24,332 WARNING: modinfo for module nvidia_current_updates failed: ERROR: modinfo: could not find module nvidia_current_updates
    2013-09-19 05:33:24,339 DEBUG: Instantiated Handler subclass __builtin__.NvidiaDriverCurrentUpdates from name NvidiaDriverCurrentUpdates
    2013-09-19 05:33:24,340 DEBUG: nvidia.available: falling back to default
    2013-09-19 05:33:24,381 DEBUG: NVIDIA accelerated graphics driver (post-release updates) not available
    2013-09-19 05:33:24,387 WARNING: modinfo for module nvidia_experimental_304 failed: ERROR: modinfo: could not find module nvidia_experimental_304
    2013-09-19 05:33:24,427 DEBUG: Instantiated Handler subclass __builtin__.NvidiaDriverExperimental304 from name NvidiaDriverExperimental304
    2013-09-19 05:33:24,427 DEBUG: nvidia.available: falling back to default
    2013-09-19 05:33:24,461 DEBUG: NVIDIA accelerated graphics driver (**experimental** beta) not available
    2013-09-19 05:33:24,467 WARNING: modinfo for module nvidia_current failed: ERROR: modinfo: could not find module nvidia_current

    Read the article

  • How do I set the current point in a CG graphics context?

    - by Joe
    When running the code below in the iPhone simulator I get the error: CGContextClosePath: no current point. Why is the current point not being set? Or is the context not set to the correct state?

    CGContextBeginPath(ctx);
    CGMutablePathRef pathHolder;
    pathHolder = CGPathCreateMutable();
    // move to point for the initial point
    NSLog(@"Drawing a state point %f, %f", [[holder.points objectAtIndex:0] floatValue], [[holder.points objectAtIndex:1] floatValue]);
    CGPathMoveToPoint(pathHolder, NULL, [[holder.points objectAtIndex:0] floatValue], [[holder.points objectAtIndex:1] floatValue]);
    for (int x = 2; x < [holder.points count] - 1; x += 2)
    {
        NSLog(@"Drawing a state point %f, %f", [[holder.points objectAtIndex:x] floatValue], [[holder.points objectAtIndex:(x+1)] floatValue]);
        CGPathAddLineToPoint(pathHolder, NULL, [[holder.points objectAtIndex:x] floatValue], [[holder.points objectAtIndex:(x+1)] floatValue]);
    }
    CGContextClosePath(ctx);
    CGContextFillPath(ctx);
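
    A minimal illustration of the "current point" idea, sketched in Python with pycairo rather than Core Graphics (my choice, not the asker's): close_path and fill only have a current point to work with if the preceding move_to/line_to calls were issued on the same object that is later closed and filled.

    import cairo  # pycairo

    surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, 100, 100)
    ctx = cairo.Context(surface)

    # Build the path on the context itself, so close_path() and fill()
    # see the current point established by move_to().
    ctx.move_to(10, 10)      # establishes the current point
    ctx.line_to(90, 10)
    ctx.line_to(50, 90)
    ctx.close_path()         # works: a current point exists on ctx
    ctx.set_source_rgb(0, 0, 0)
    ctx.fill()
    surface.write_to_png("triangle.png")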

    Read the article

  • NVidia with Optimus conflicting in Ubuntu 12.04

    - by Humannoise
    I have recently installed Ubuntu 12.04 on an Intel Ivy Bridge machine with integrated graphics and an NVIDIA GPU with Optimus technology, however I can't manage to get it working properly. I have already tried the Bumblebee project's solution, however I get the following message when I try to run anything with the NVIDIA card (e.g. with optirun firefox):

    [ERROR]The Bumblebee daemon has not been started yet or the socket path /var/run/bumblebee.socket was incorrect.
    [ERROR]Could not connect to bumblebee daemon - is it running?

    Since the NVIDIA card is not working properly, some software like Scilab, which uses the X11 system for graphics handling and plotting, won't work either. My BIOS has no option concerning the graphics card, and the daemon log returned:

    Jul 5 16:10:51 humannoise-W251ESQ-W270ESQ bumblebeed[980]: Module 'nvidia' is not found.
    Jul 5 16:10:51 humannoise-W251ESQ-W270ESQ kernel: [ 17.943272] init: bumblebeed main process (980) terminated with status 1
    Jul 5 16:10:51 humannoise-W251ESQ-W270ESQ kernel: [ 17.943288] init: bumblebeed main process ended, respawning
    Jul 5 16:10:51 humannoise-W251ESQ-W270ESQ bumblebeed[1026]: Module 'nvidia' is not found.

    lspci -nn | grep '\[030[02]\]:' returned:

    00:02.0 VGA compatible controller [0300]: Intel Corporation Ivy Bridge Graphics Controller [8086:0166] (rev 09)
    01:00.0 VGA compatible controller [0300]: NVIDIA Corporation Device [10de:0de9] (rev a1)

    OK, for the command dpkg -l | grep '^ii' | grep nvidia I got:

    ii bumblebee-nvidia 3.0-2~preciseppa1 nVidia Optimus support using the proprietary NVIDIA driver
    ii nvidia-current 302.17-0ubuntu1~precise~xup1 NVIDIA binary Xorg driver, kernel module and VDPAU library
    ii nvidia-current-updates 295.49-0ubuntu0.1 NVIDIA binary Xorg driver, kernel module and VDPAU library
    ii nvidia-settings 302.17-0ubuntu1~precise~xup3 Tool of configuring the NVIDIA graphics driver
    ii nvidia-settings-updates 295.33-0ubuntu1 Tool of configuring the NVIDIA graphics driver

    After a full reinstallation, including removing any previous NVIDIA driver, lsmod | grep -E 'nvidia|nouveau' returned:

    nvidia 10888310 46

    dmesg | grep -C3 -E 'nouveau|NVRM' returned things like:

    [ 1875.607283] nvidia 0000:01:00.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16
    [ 1875.607289] nvidia 0000:01:00.0: setting latency timer to 64
    [ 1875.607293] vgaarb: device changed decodes: PCI:0000:01:00.0,olddecodes=io+mem,decodes=none:owns=none
    [ 1875.607363] NVRM: loading NVIDIA UNIX x86_64 Kernel Module 302.17 Tue Jun 12 16:03:22 PDT 2012
    [ 1884.830035] nvidia 0000:01:00.0: PCI INT A disabled
    [ 1884.832058] bbswitch: disabling discrete graphics
    [ 1884.832960] bbswitch: Result of Optimus _DSM call: 09000019

    Some programs, like Scilab, are now working fine under optirun (e.g. optirun scilab). Thank you.

    Read the article

  • effect and model vertex declaration compatibility

    - by Vodácek
    I have normal model drawing code. When I try to draw a model without UV coordinates I get this exception:

    System.InvalidOperationException: The current vertex declaration does not include all the elements required by the current vertex shader. TextureCoordinate0 is missing.
       at Microsoft.Xna.Framework.Graphics.GraphicsDevice.VerifyCanDraw(Boolean bUserPrimitives, Boolean bIndexedPrimitives)
       at Microsoft.Xna.Framework.Graphics.GraphicsDevice.DrawIndexedPrimitives(PrimitiveType primitiveType, Int32 baseVertex, Int32 minVertexIndex, Int32 numVertices, Int32 startIndex, Int32 primitiveCount)
       at Microsoft.Xna.Framework.Graphics.ModelMeshPart.Draw()
       at Microsoft.Xna.Framework.Graphics.ModelMesh.Draw()
       ...

    I know what causes the exception, but is it possible to avoid it? Is it possible to check a model, before drawing it with the current shader, for vertex declaration compatibility?

    Read the article

  • GameplayScreen does not contain a definition for GraphicsDevice

    - by Dave Voyles
    Long story short: I'm trying to integrate my game with Microsoft's Game State Management sample. In doing so I've run into some errors, and the latest one is in the title. I'm not able to display my HUD for the reason listed above. Previously I had much of my code in my Game.cs class, but the GSM puts a bit of it in Game1 and most of what you draw for the main screen in your GameplayScreen class, and that is what is causing confusion on my part. I've created an instance of the GameplayScreen class to be used in the HUD class (as you can see below). Before integrating with the GSM, I created an instance of my Game class and all worked fine. It seems that I need to define my graphics device somewhere, but I am not sure of where exactly. I've left some code below to help you understand.

    public class GameStateManagementGame : Microsoft.Xna.Framework.Game
    {
        #region Fields

        GraphicsDeviceManager graphics;
        ScreenManager screenManager;

        // Creates a new instance, which is used in the HUD class
        public static Game Instance;

        // By preloading any assets used by UI rendering, we avoid framerate glitches
        // when they suddenly need to be loaded in the middle of a menu transition.
        static readonly string[] preloadAssets =
        {
            "gradient",
        };

        #endregion

        #region Initialization

        /// <summary>
        /// The main game constructor.
        /// </summary>
        public GameStateManagementGame()
        {
            Content.RootDirectory = "Content";

            graphics = new GraphicsDeviceManager(this);
            graphics.PreferredBackBufferWidth = 1280;
            graphics.PreferredBackBufferHeight = 720;
            graphics.IsFullScreen = false;
            graphics.ApplyChanges();

            // Create the screen manager component.
            screenManager = new ScreenManager(this);
            Components.Add(screenManager);

            // Activate the first screens.
            screenManager.AddScreen(new BackgroundScreen(), null);
            //screenManager.AddScreen(new MainMenuScreen(), null);
            screenManager.AddScreen(new PressStartScreen(), null);
        }

    namespace Pong
    {
        public class HUD
        {
            public void Update(GameTime gameTime)
            {
                // Used in the Draw method
                titleSafeRectangle = new Rectangle(
                    GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.X,
                    GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.Y,
                    GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.Width,
                    GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.Height);
            }
        }
    }

    class GameplayScreen : GameScreen
    {
        #region Fields

        ContentManager content;
        public static GameStates gamestate;
        private GraphicsDeviceManager graphics;
        public int screenWidth;
        public int screenHeight;
        private Texture2D backgroundTexture;
        private SpriteBatch spriteBatch;
        private Menu menu;
        private SpriteFont arial;
        private HUD hud;
        Animation player;

        // Creates a new instance, which is used in the HUD class
        public static GameplayScreen Instance;

        public GameplayScreen()
        {
            TransitionOnTime = TimeSpan.FromSeconds(1.5);
            TransitionOffTime = TimeSpan.FromSeconds(0.5);
        }

        protected void Initialize()
        {
            lastScored = false;
            menu = new Menu();
            resetTimer = 0;
            resetTimerInUse = true;
            ball = new Ball(content, new Vector2(screenWidth, screenHeight));
            SetUpMulti();
            input = new Input();
            hud = new HUD();

            // Places the powerup animation inside of the surrounding box
            // Needs to be cleaned up, instead of using hard pixel values
            player = new Animation(content.Load<Texture2D>(@"gfx/powerupSpriteSheet"), new Vector2(103, 44), 64, 64, 4, 5);

            // Used by the Powerups
            random = new Random();
            vec = new Vector2(100, 50);
            vec2 = new Vector2(100, 100);
            promptVec = new Vector2(50, 25);
            timer = 10000.0f; // Starting value for the cooldown for the powerup timer
            timerVector = new Vector2(10, 10);

            //JEP - one time creation of powerup objects
            playerOnePowerup = new Powerup();
            playerOnePowerup.Activated += PowerupActivated;
            playerOnePowerup.Deactivated += PowerupDeactivated;

            playerTwoPowerup = new Powerup();
            playerTwoPowerup.Activated += PowerupActivated;
            playerTwoPowerup.Deactivated += PowerupDeactivated;

            //JEP - moved from events since these only need set once
            activatedVec = new Vector2(100, 125);
            deactivatedVec = new Vector2(100, 150);
            powerupReady = false;
        }

    Read the article

  • Why does my new GTX 660m's clock drop drastically after running for a few seconds?

    - by trVoldemort
    I bought a Lenovo Y580 laptop a few days ago; this model is equipped with a GTX 660M graphics card. However, game performance has been unbelievably poor since it came out of the box, so I realized there is something wrong with the graphics card. I downloaded GPU-Z and did a simple test, and I was shocked to find that my GTX 660M is running at a 135.0 MHz core clock. (It should be 835 MHz at least!) Even the integrated "Intel HD Graphics 4000" can run at 650 MHz. Further examination showed that for the first few seconds the GTX 660M was actually running at 835 MHz, but the core temperature quickly reached 90+°C and the clock (maybe) automatically dropped to 135.0 MHz. This is very strange. Does anyone have any idea what's going on here?

    Read the article

  • Why am I not getting an sRGB default framebuffer?

    - by Aaron Rotenberg
    I'm trying to make my OpenGL Haskell program gamma correct by making appropriate use of sRGB framebuffers and textures, but I'm running into issues making the default framebuffer sRGB. Consider the following Haskell program, compiled for 32-bit Windows using GHC and linked against 32-bit freeglut:

    import Foreign.Marshal.Alloc(alloca)
    import Foreign.Ptr(Ptr)
    import Foreign.Storable(Storable, peek)
    import Graphics.Rendering.OpenGL.Raw
    import qualified Graphics.UI.GLUT as GLUT
    import Graphics.UI.GLUT(($=))

    main :: IO ()
    main = do
        (_progName, _args) <- GLUT.getArgsAndInitialize
        GLUT.initialDisplayMode $= [GLUT.SRGBMode]
        _window <- GLUT.createWindow "sRGB Test"
        -- To prove that I actually have freeglut working correctly.
        -- This will fail at runtime under classic GLUT.
        GLUT.closeCallback $= Just (return ())
        glEnable gl_FRAMEBUFFER_SRGB
        colorEncoding <- allocaOut $ glGetFramebufferAttachmentParameteriv gl_FRAMEBUFFER gl_FRONT_LEFT gl_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING
        print colorEncoding

    allocaOut :: Storable a => (Ptr a -> IO b) -> IO a
    allocaOut f = alloca $ \ptr -> do
        f ptr
        peek ptr

    On my desktop (Windows 8 64-bit with a GeForce GTX 760 graphics card) this program outputs 9729, a.k.a. gl_LINEAR, indicating that the default framebuffer is using linear color space, even though I explicitly requested an sRGB window. This is reflected in the rendering results of the actual program I'm trying to write - everything looks washed out because my linear color values aren't being converted to sRGB before being written to the framebuffer. On the other hand, on my laptop (Windows 7 64-bit with an Intel graphics chip), the program prints 0 (huh?) and I get an sRGB default framebuffer by default whether I request one or not! And on both machines, if I manually create a non-default framebuffer bound to an sRGB texture, the program correctly prints 35904, a.k.a. gl_SRGB. Why am I getting different results on different hardware? Am I doing something wrong? How can I get an sRGB framebuffer consistently on all hardware and target OSes?
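
    For reference, the conversion at stake is the standard sRGB transfer function that an sRGB framebuffer applies to linear values on write. A minimal Python sketch of the encode/decode formulas (illustration only, not part of the question):

    def linear_to_srgb(c: float) -> float:
        """Encode one linear-light channel value (0..1) to sRGB (0..1)."""
        if c <= 0.0031308:
            return 12.92 * c
        return 1.055 * c ** (1.0 / 2.4) - 0.055

    def srgb_to_linear(s: float) -> float:
        """Decode one sRGB channel value (0..1) back to linear light."""
        if s <= 0.04045:
            return s / 12.92
        return ((s + 0.055) / 1.055) ** 2.4

    # A mid-grey in linear light is stored noticeably brighter in sRGB:
    print(round(linear_to_srgb(0.5), 4))   # ~0.7354
    print(round(srgb_to_linear(0.5), 4))   # ~0.2140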

    Read the article

  • How can a computer render a CLI/console along with a GUI?

    - by Nathaniel Bennett
    I'm confused when looking into graphics, specifically with operating systems. I mean, how can a computer render a CLI/console along with a GUI? GUIs are completely different from text. And how can we have GUI windows that display text interfaces, i.e. how can we have a CLI in a modern graphical operating system - that's what I'm mainly trying to get a grip on. How do graphics get rendered to the display? Is there some sort of memory address that the GPU accesses which holds all the pixel data, and are there systems within the OS that gather the pixel positions of windows and widgets, along with their Z index, and rasterize them to that memory address, which the GPU then loads to the screen? What about CLIs integrated with graphics? How does the OS tell the GPU that a certain part of the screen wants to display text while the rest wants to display pixel data?
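
    To make the "it is all just pixels" idea concrete, here is a toy Python sketch (the glyph bitmap, buffer size, and helpers are invented for illustration): a console character and a GUI rectangle are both rasterized into the same framebuffer array.

    # A framebuffer is conceptually just a 2D array of pixel values.
    WIDTH, HEIGHT = 16, 8
    framebuffer = [[0 for _ in range(WIDTH)] for _ in range(HEIGHT)]

    # A "GUI" element: fill a rectangle with a solid color (here, 1).
    def fill_rect(x, y, w, h, color):
        for row in range(y, y + h):
            for col in range(x, x + w):
                framebuffer[row][col] = color

    # "Text" is rasterized the same way: a glyph is a small bitmap
    # blitted into the framebuffer. Made-up 3x5 bitmap for the letter H.
    GLYPH_H = ["1.1",
               "1.1",
               "111",
               "1.1",
               "1.1"]

    def draw_glyph(glyph, x, y, color):
        for dy, row in enumerate(glyph):
            for dx, bit in enumerate(row):
                if bit == "1":
                    framebuffer[y + dy][x + dx] = color

    fill_rect(0, 0, WIDTH, HEIGHT, 1)   # window background
    draw_glyph(GLYPH_H, 2, 1, 2)        # terminal text inside the window

    for row in framebuffer:             # crude "scanout" to the console
        print("".join(" .#"[p] for p in row))

    By the time the GPU scans the buffer out, it has no notion of "text" versus "graphics" - only pixel values.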

    Read the article

  • Monitor not detected after booting without monitor attached (12.04)

    - by cawkie
    I had a stable 12.04 machine running perfectly. The machine was booted without the monitor connected - since then the system always boots into low-graphics mode. Onboard graphics (from lspci): VGA compatible controller: Intel Corporation 4 Series Chipset Integrated Graphics Controller (rev 03). Monitor: AOC e2450Swh. The Displays widget shows the monitor as a laptop (!?) and System Details shows the graphics as Gallium 0.4 on llvmpipe (LLVM 0x300). The X server log appears to show the correct monitor detected. When I boot from a live CD I get full 3D graphics. I've tried the monitor on a different machine - all OK. I've tried a different monitor on this machine - same problem. Between having a working system and a broken one there have been no updates and I have made no configuration changes... EDIT: I have come to the conclusion that the problem is caused by a known issue with LightDM hanging on a battery check. I've managed to get 3D graphics working by switching to GDM - not a solution but an acceptable workaround. I would still like to know what is causing the problem and how I managed to get my system into this state!

    Read the article

  • Graphics glitch when drawing to a Cairo context obtained from a gtk.DrawingArea inside a gtk.Viewport.

    - by user410023
    I am trying to redraw the part of the DrawingArea that is visible in the Viewport in the expose-event handler. However, it seems that I am doing something wrong with the coordinates that are passed to the event handler because there is garbage at the edge of the Viewport when scrolling. Can anyone tell what I am doing wrong? Here is a small example:

    import pygtk
    pygtk.require("2.0")
    import gtk
    from numpy import array
    from math import pi

    class Circle(object):
        def __init__(self, position = [0., 0.], radius = 0., edge = (0., 0., 0.), fill = None):
            self.position = position
            self.radius = radius
            self.edge = edge
            self.fill = fill

        def draw(self, ctx):
            rect = array(ctx.clip_extents())
            rect[2] -= rect[0]
            rect[3] -= rect[1]
            center = rect[2:4] / 2
            ctx.arc(center[0], center[1], self.radius, 0., 2. * pi)
            if self.fill != None:
                ctx.set_source_rgb(*self.fill)
                ctx.fill_preserve()
            ctx.set_source_rgb(*self.edge)
            ctx.stroke()

    class Scene(object):
        class Proxy(object):
            directory = {}

            def __init__(self, target, layers = set()):
                self.target = target
                self.layers = layers
                Scene.Proxy.directory[target] = self

        def __init__(self, viewport):
            self.objects = {}
            self.layers = [set()]
            self.viewport = viewport
            self.signals = {}

        def draw(self, ctx):
            x = self.viewport.get_hadjustment().value
            y = self.viewport.get_vadjustment().value
            ctx.set_source_rgb(1., 1., 1.)
            ctx.paint()
            ctx.translate(x, y)
            for obj in self:
                obj.draw(ctx)

        def add(self, item, layer = 0):
            item = Scene.Proxy(item, layers = set((layer,)))
            assert(hasattr(item.target, "draw"))
            assert(isinstance(layer, int))
            item.layers.add(layer)
            while not layer < len(self.layers):
                self.layers.append(set())
            self.layers[layer].add(item)
            if not item in self.objects:
                self.objects[item] = set()
            self.objects[item].add(layer)

        def remove(self, item, layers = None):
            item = Scene.Proxy.directory[item]
            if layers == None:
                layers = self.objects[item]
            for layer in layers:
                layer.remove(item)
                item.layers.remove(layer)
            if len(item.layers) == 0:
                self.objects.remove(item)

        def __iter__(self):
            for layer in self.layers:
                for item in layer:
                    yield item.target

    class App(object):
        def __init__(self):
            signals = { "canvas_exposed": self.update_canvas,
                        "gtk_main_quit": gtk.main_quit }
            self.builder = gtk.Builder()
            self.builder.add_from_file("graphics_glitch.glade")
            self.window = self.builder.get_object("window")
            self.viewport = self.builder.get_object("viewport")
            self.canvas = self.builder.get_object("canvas")
            self.scene = Scene(self.viewport)
            signals.update(self.scene.signals)
            self.builder.connect_signals(signals)
            self.window.show()

        def update_canvas(self, widget, event):
            ctx = self.canvas.window.cairo_create()
            self.scene.draw(ctx)
            ctx.clip()

    if __name__ == "__main__":
        app = App()
        scene = app.scene
        scene.add(Circle((0., 0.), 10.))
        gtk.main()

    And the Glade file "graphics_glitch.glade":

    <?xml version="1.0"?>
    <interface>
      <requires lib="gtk+" version="2.16"/>
      <!-- interface-naming-policy project-wide -->
      <object class="GtkWindow" id="window">
        <property name="width_request">200</property>
        <property name="height_request">200</property>
        <property name="visible">True</property>
        <signal name="destroy" handler="gtk_main_quit"/>
        <child>
          <object class="GtkScrolledWindow" id="scrolledwindow1">
            <property name="visible">True</property>
            <property name="can_focus">True</property>
            <property name="hadjustment">h_adjust</property>
            <property name="vadjustment">v_adjust</property>
            <property name="hscrollbar_policy">automatic</property>
            <property name="vscrollbar_policy">automatic</property>
            <child>
              <object class="GtkViewport" id="viewport">
                <property name="visible">True</property>
                <property name="resize_mode">queue</property>
                <child>
                  <object class="GtkDrawingArea" id="canvas">
                    <property name="width_request">640</property>
                    <property name="height_request">480</property>
                    <property name="visible">True</property>
                    <signal name="expose_event" handler="canvas_exposed"/>
                  </object>
                </child>
              </object>
            </child>
          </object>
        </child>
      </object>
      <object class="GtkAdjustment" id="h_adjust">
        <property name="lower">-1000</property>
        <property name="upper">1000</property>
        <property name="step_increment">1</property>
        <property name="page_increment">25</property>
        <property name="page_size">25</property>
      </object>
      <object class="GtkAdjustment" id="v_adjust">
        <property name="lower">-1000</property>
        <property name="upper">1000</property>
        <property name="step_increment">1</property>
        <property name="page_increment">25</property>
        <property name="page_size">25</property>
      </object>
    </interface>

    Thanks! --Dan

    Read the article

  • Python Turtle Graphics, how to plot functions over an interval?

    - by TheDragonAce
    I need to plot a function over a specified interval. The function is f1, which is shown below in the code, and the interval is [-7, -3]; [-1, 1]; [3, 7] with a step of .01. When I execute the program, nothing is drawn. Any ideas?

    import turtle
    from math import sqrt

    wn = turtle.Screen()
    wn.bgcolor("white")
    wn.title("Plotting")
    mypen = turtle.Turtle()
    mypen.shape("classic")
    mypen.color("black")
    mypen.speed(10)

    while True:
        try:
            def f1(x):
                return 2 * sqrt((-abs(abs(x)-1)) * abs(3 - abs(x))/((abs(x)-1)*(3-abs(x)))) * \
                    (1 + abs(abs(x)-3)/(abs(x)-3))*sqrt(1-(x/7)**2)+(5+0.97*(abs(x-0.5)+abs(x+0.5))-\
                    3*(abs(x-0.75)+abs(x+0.75)))*(1+abs(1-abs(x))/(1-abs(x)))

            mypen.penup()
            step=.01
            startf11=-7
            stopf11=-3
            startf12=-1
            stopf12=1
            startf13=3
            stopf13=7

            def f11 (startf11,stopf11,step):
                rc=[]
                y = f1(startf11)
                while y<=stopf11:
                    rc.append(startf11)
                    #y+=step
                    mypen.setpos(f1(startf11)*25,y*25)
                    mypen.dot()

            def f12 (startf12,stopf12,step):
                rc=[]
                y = f1(startf12)
                while y<=stopf12:
                    rc.append(startf12)
                    #y+=step
                    mypen.setpos(f1(startf12)*25, y*25)
                    mypen.dot()

            def f13 (startf13,stopf13,step):
                rc=[]
                y = f1(startf13)
                while y<=stopf13:
                    rc.append(startf13)
                    #y+=step
                    mypen.setpos(f1(startf13)*25, y*25)
                    mypen.dot()

            f11(startf11,stopf11,step)
            f12(startf12,stopf12,step)
            f13(startf13,stopf13,step)
        except ZeroDivisionError:
            continue
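
    Separately from the code above, here is a minimal sketch of the general technique for plotting a function over intervals with turtle (my own example function and scale factor, not the asker's f1): step x across each interval, skip points where the function is undefined, and draw the scaled (x, y) positions.

    import turtle
    from math import sqrt

    def plot(pen, f, start, stop, step=0.01, scale=25):
        """Trace f over [start, stop], skipping undefined points."""
        pen.penup()
        x = start
        while x <= stop:
            try:
                y = f(x)
                pen.goto(x * scale, y * scale)
                pen.pendown()          # start drawing once we have a valid point
            except (ValueError, ZeroDivisionError):
                pen.penup()            # lift the pen across gaps in the domain
            x += step

    screen = turtle.Screen()
    pen = turtle.Turtle()
    pen.speed(0)
    # Example function and intervals (stand-ins for the asker's f1):
    f = lambda x: sqrt(49 - x * x) / 2
    for a, b in [(-7, -3), (-1, 1), (3, 7)]:
        plot(pen, f, a, b)
    turtle.done()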

    Read the article

  • Ubuntu 12.04 Overheating HP Pavilion dm4 3011tx

    - by gevvek
    I have tried installing Ubuntu 12.04 on my HP Pavilion dm4 3011tx, and after a few minutes the fans start to run very fast and my laptop starts to heat up; the CPU temperature got up to 70 degrees and was still rising before I turned the computer off. I installed the drivers for my AMD Radeon graphics and tried switching to the integrated graphics, but that didn't make a difference. I have also tried Fedora and Linux Mint and they do the same thing. Can anyone help?

    Read the article

  • blacklist VGA compatible controller: Intel Corporation 82845G/GL[Brookdale-G]/GE

    - by Thomas Labensi
    I have an HP Pavilion a310n and I have installed an NVIDIA PCI GeForce card. I want to blacklist the Intel 82845G/GL [Brookdale-G]/GE Chipset Integrated Graphics Device (rev 03) integrated graphics - what do I need to do?

    tom@tom-DM167A-ABA-a310n:~$ lspci | grep VGA
    00:02.0 VGA compatible controller: Intel Corporation 82845G/GL[Brookdale-G]/GE Chipset Integrated Graphics Device (rev 03)
    02:09.0 VGA compatible controller: NVIDIA Corporation NV11 [GeForce2 MX/MX 400] (rev b2)
    tom@tom-DM167A-ABA-a310n:~$

    I'm using the NVIDIA card via nouveau, and I want to really make sure I'm using the NVIDIA card.

    Read the article

  • Thinkpad T530 with Optimus and Docking Station

    - by Vic Boudolf
    I have a Lenovo Thinkpad T530 with Optimus video, which is not supported on 12.04.1. I don't normally need the discrete (nVidia) graphics, so I turn it off in the BIOS settings to achieve longer battery life (and so that the screen dimmer will work), but when placed in the docking station, the integrated (Intel) graphics don't power the HDMI ports. (The VGA port does work, but I want to focus on the HDMI.) This means I have to change the BIOS settings constantly. Is there any way to have the system detect the docking station and power up/enable the discrete graphics accordingly? I don't need to do it on the fly. Just at startup. This post suggests that bumblebee can turn the discrete graphics on and off for specific applications, but I just want to turn it on or off. [2] suggests that vga_switcheroo will not work with nVidia Optimus.

    Read the article

  • OpenGL ES 2.0: Mixing 2D with 3D

    - by Bunkai.Satori
    Is it possible to mix 2D and 3D graphics in a single OpenGL ES 2.0 game, please? I have plenty of 2D graphics in my game. The 2D graphics are represented by two triangular polygons (making up a rectangle) with a texture on them. I use an orthographic matrix to render the whole scene. However, I need to add some 3D effects into my game; therefore, I wish to use a perspective camera to render the meshes. Is it possible to mix an orthographic and a perspective camera in one scene? If yes, is there going to be a large performance cost? Is there any recommended approach to do this effectively? I will have 90% 2D graphics and only 10% 3D. The target platform is OpenGL ES 2.0 (iOS, Android). I use C++ to develop. Thank you.
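
    Mixing the two cameras amounts to uploading a different projection matrix for each pass. A minimal sketch of the two matrices, written here in Python/NumPy purely for illustration (in the game they would be the uniforms fed to the ES 2.0 vertex shader):

    import numpy as np

    def ortho(left, right, bottom, top, near, far):
        """Orthographic projection, as used for the 2D sprite pass."""
        return np.array([
            [2/(right-left), 0, 0, -(right+left)/(right-left)],
            [0, 2/(top-bottom), 0, -(top+bottom)/(top-bottom)],
            [0, 0, -2/(far-near), -(far+near)/(far-near)],
            [0, 0, 0, 1]], dtype=np.float32)

    def perspective(fovy_deg, aspect, near, far):
        """Perspective projection, as used for the 3D mesh pass."""
        f = 1.0 / np.tan(np.radians(fovy_deg) / 2.0)
        return np.array([
            [f/aspect, 0, 0, 0],
            [0, f, 0, 0],
            [0, 0, (far+near)/(near-far), 2*far*near/(near-far)],
            [0, 0, -1, 0]], dtype=np.float32)

    # Per frame, conceptually:
    #   1. draw 3D meshes with the perspective matrix (depth test on)
    #   2. draw 2D sprites with the orthographic matrix (depth test off)
    proj_3d = perspective(60.0, 16/9, 0.1, 100.0)
    proj_2d = ortho(0, 1280, 720, 0, -1, 1)
    print(proj_3d.shape, proj_2d.shape)

    The switch itself is just a uniform upload per pass, so the cost is dominated by the usual state changes and draw calls, not by the choice of projection.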

    Read the article

  • Ignore carriage returns in scanf before data.... to keep layout of console based graphics with conio

    - by volting
    I have the misfortune of having to use conio.h in VC++ 6 for a college assignment. My problem is that my graphics setup is in the center of the screen, e.g.:

    gotoxy( getcols()/2, getrows()/2 );
    printf("Enter something");
    scanf( "%d", &something );

    Now if someone accidentally hits Enter before they enter the "something", the cursor gets reset to the left of the screen on the next line. I've tried flushing the keyboard and BIOS buffers with fflush(stdin) and getchar(), which, as I expected, didn't work! Any help/ideas would be appreciated. Thanks, V

    Read the article

  • Fallback Mode on Intel HD 4000 on Ubuntu 12.04.1?

    - by caragh
    Just built a system with an Ivy Bridge CPU (Xeon E3-1245 v2) with Intel HD 4000 onboard graphics; the board is an Asrock H77 ProM. I had loaded Ubuntu Server 12.04.1 onto it, but wanted to fool around with GNOME 3. I installed gnome-shell, which didn't work, then gnome, which did, but it only loads in fallback mode - the video is recognized as "VESA: sandy/ivy bridge graphics". I tried installing the whole ubuntu-desktop shebang but it's still in fallback graphics. Any way to get the full eye candy?

    Read the article
