Search Results

Search found 5873 results on 235 pages for 'raster graphics'.


  • effect and model vertex declaration compatibility

    - by Vodácek
    I have normal model drawing code. When I try to draw a model without UV coordinates, I get this exception: System.InvalidOperationException: The current vertex declaration does not include all the elements required by the current vertex shader. TextureCoordinate0 is missing. at Microsoft.Xna.Framework.Graphics.GraphicsDevice.VerifyCanDraw(Boolean bUserPrimitives, Boolean bIndexedPrimitives) at Microsoft.Xna.Framework.Graphics.GraphicsDevice.DrawIndexedPrimitives(PrimitiveType primitiveType, Int32 baseVertex, Int32 minVertexIndex, Int32 numVertices, Int32 startIndex, Int32 primitiveCount) at Microsoft.Xna.Framework.Graphics.ModelMeshPart.Draw() at Microsoft.Xna.Framework.Graphics.ModelMesh.Draw() ... I know what causes the exception, but is it possible to avoid it? Is it possible to check a model against the current shader for vertex declaration compatibility before drawing it?
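
    One way to do that check (a minimal sketch against the XNA 4.0 API, assuming a loaded Model in a variable named model; HasTextureCoordinates is an illustrative helper, not part of the framework) is to inspect each mesh part's vertex declaration before drawing:

        // Returns true if a mesh part's vertex buffer carries texture coordinates.
        static bool HasTextureCoordinates(ModelMeshPart part)
        {
            foreach (VertexElement element in part.VertexBuffer.VertexDeclaration.GetVertexElements())
            {
                if (element.VertexElementUsage == VertexElementUsage.TextureCoordinate)
                    return true;
            }
            return false;
        }

        // In the draw loop: only draw meshes whose parts satisfy the texturing shader;
        // route the rest to a simpler effect (or skip them) instead of letting Draw() throw.
        foreach (ModelMesh mesh in model.Meshes)
        {
            bool compatible = true;
            foreach (ModelMeshPart part in mesh.MeshParts)
                if (!HasTextureCoordinates(part))
                    compatible = false;
            if (compatible)
                mesh.Draw();
        }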

    Read the article

  • GameplayScreen does not contain a definition for GraphicsDevice

    - by Dave Voyles
    Long story short: I'm trying to integrate my game with Microsoft's Game State Management sample. In doing so I've run into some errors, and the latest one is in the title. I'm not able to display my HUD for the reason listed above. Previously, I had much of my code in my Game.cs class, but the GSM keeps a bit of it in Game1 and puts most of what gets drawn for the main screen into the GameplayScreen class, and that is what is causing confusion on my part. I've created an instance of the GameplayScreen class to be used in the HUD class (as you can see below). Before integrating with the GSM, however, I created an instance of my Game class and all worked fine. It seems that I need to define my graphics device somewhere, but I am not sure where exactly. I've left some code below to help you understand. public class GameStateManagementGame : Microsoft.Xna.Framework.Game { #region Fields GraphicsDeviceManager graphics; ScreenManager screenManager; // Creates a new intance, which is used in the HUD class public static Game Instance; // By preloading any assets used by UI rendering, we avoid framerate glitches // when they suddenly need to be loaded in the middle of a menu transition. static readonly string[] preloadAssets = { "gradient", }; #endregion #region Initialization /// <summary> /// The main game constructor. /// </summary> public GameStateManagementGame() { Content.RootDirectory = "Content"; graphics = new GraphicsDeviceManager(this); graphics.PreferredBackBufferWidth = 1280; graphics.PreferredBackBufferHeight = 720; graphics.IsFullScreen = false; graphics.ApplyChanges(); // Create the screen manager component. screenManager = new ScreenManager(this); Components.Add(screenManager); // Activate the first screens. screenManager.AddScreen(new BackgroundScreen(), null); //screenManager.AddScreen(new MainMenuScreen(), null); screenManager.AddScreen(new PressStartScreen(), null); } namespace Pong { public class HUD { public void Update(GameTime gameTime) { // Used in the Draw method titleSafeRectangle = new Rectangle (GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.X, GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.Y, GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.Width, GameplayScreen.Instance.GraphicsDevice.Viewport.TitleSafeArea.Height); } } } class GameplayScreen : GameScreen { #region Fields ContentManager content; public static GameStates gamestate; private GraphicsDeviceManager graphics; public int screenWidth; public int screenHeight; private Texture2D backgroundTexture; private SpriteBatch spriteBatch; private Menu menu; private SpriteFont arial; private HUD hud; Animation player; // Creates a new intance, which is used in the HUD class public static GameplayScreen Instance; public GameplayScreen() { TransitionOnTime = TimeSpan.FromSeconds(1.5); TransitionOffTime = TimeSpan.FromSeconds(0.5); } protected void Initialize() { lastScored = false; menu = new Menu(); resetTimer = 0; resetTimerInUse = true; ball = new Ball(content, new Vector2(screenWidth, screenHeight)); SetUpMulti(); input = new Input(); hud = new HUD(); // Places the powerup animation inside of the surrounding box // Needs to be cleaned up, instead of using hard pixel values player = new Animation(content.Load<Texture2D>(@"gfx/powerupSpriteSheet"), new Vector2(103, 44), 64, 64, 4, 5); // Used by for the Powerups random = new Random(); vec = new Vector2(100, 50); vec2 = new Vector2(100, 100); promptVec = new Vector2(50, 25); timer = 10000.0f; // Starting value for the cooldown
for the powerup timer timerVector = new Vector2(10, 10); //JEP - one time creation of powerup objects playerOnePowerup = new Powerup(); playerOnePowerup.Activated += PowerupActivated; playerOnePowerup.Deactivated += PowerupDeactivated; playerTwoPowerup = new Powerup(); playerTwoPowerup.Activated += PowerupActivated; playerTwoPowerup.Deactivated += PowerupDeactivated; //JEP - moved from events since these only need set once activatedVec = new Vector2(100, 125); deactivatedVec = new Vector2(100, 150); powerupReady = false; }
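
    In the Game State Management sample, each GameScreen already carries a ScreenManager property, and the ScreenManager (a DrawableGameComponent) exposes the GraphicsDevice, so the HUD shouldn't need a static GameplayScreen.Instance at all. A rough sketch under that assumption, passing the viewport down from the screen that owns the HUD:

        // Hedged sketch: hand the HUD a Viewport instead of reaching for a static instance.
        public class HUD
        {
            private Rectangle titleSafeRectangle;

            public void Update(GameTime gameTime, Viewport viewport)
            {
                // Viewport.TitleSafeArea is already a Rectangle.
                titleSafeRectangle = viewport.TitleSafeArea;
            }
        }

        // Inside GameplayScreen, which gets its ScreenManager property from GameScreen:
        // hud.Update(gameTime, ScreenManager.GraphicsDevice.Viewport);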

    Read the article

  • Why my new GTX 660m's clock drops drastically after running few seconds

    - by trVoldemort
    I bought a Lenovo Y580 laptop a few days ago; this model is equipped with a GTX 660m graphics card. However, game performance has been unbelievably poor out of the box, so I realized there is something wrong with this graphics card. I downloaded GPU-Z and did a simple test, and I was shocked by the fact that my GTX 660m is running at a 135.0 MHz core clock. (It should be 835 MHz at least!) Even the integrated "Intel HD Graphics 4000" can run at 650 MHz. Further examination showed that for the first few seconds the GTX 660m was actually running at 835 MHz; however, the core temperature quickly reached 90+°C and the clock (apparently) automatically dropped to 135.0 MHz. This is very strange. Does anyone have any idea what's going on here?

    Read the article

  • Why am I not getting an sRGB default framebuffer?

    - by Aaron Rotenberg
    I'm trying to make my OpenGL Haskell program gamma correct by making appropriate use of sRGB framebuffers and textures, but I'm running into issues making the default framebuffer sRGB. Consider the following Haskell program, compiled for 32-bit Windows using GHC and linked against 32-bit freeglut: import Foreign.Marshal.Alloc(alloca) import Foreign.Ptr(Ptr) import Foreign.Storable(Storable, peek) import Graphics.Rendering.OpenGL.Raw import qualified Graphics.UI.GLUT as GLUT import Graphics.UI.GLUT(($=)) main :: IO () main = do (_progName, _args) <- GLUT.getArgsAndInitialize GLUT.initialDisplayMode $= [GLUT.SRGBMode] _window <- GLUT.createWindow "sRGB Test" -- To prove that I actually have freeglut working correctly. -- This will fail at runtime under classic GLUT. GLUT.closeCallback $= Just (return ()) glEnable gl_FRAMEBUFFER_SRGB colorEncoding <- allocaOut $ glGetFramebufferAttachmentParameteriv gl_FRAMEBUFFER gl_FRONT_LEFT gl_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING print colorEncoding allocaOut :: Storable a => (Ptr a -> IO b) -> IO a allocaOut f = alloca $ \ptr -> do f ptr peek ptr On my desktop (Windows 8 64-bit with a GeForce GTX 760 graphics card) this program outputs 9729, a.k.a. gl_LINEAR, indicating that the default framebuffer is using linear color space, even though I explicitly requested an sRGB window. This is reflected in the rendering results of the actual program I'm trying to write - everything looks washed out because my linear color values aren't being converted to sRGB before being written to the framebuffer. On the other hand, on my laptop (Windows 7 64-bit with an Intel graphics chip), the program prints 0 (huh?) and I get an sRGB default framebuffer by default whether I request one or not! And on both machines, if I manually create a non-default framebuffer bound to an sRGB texture, the program correctly prints 35904, a.k.a. gl_SRGB. Why am I getting different results on different hardware? Am I doing something wrong? How can I get an sRGB framebuffer consistently on all hardware and target OSes?

    Read the article

  • How can a computer render a CLI/console along with a GUI?

    - by Nathaniel Bennett
    I'm confused when looking into graphics - specifically with operating systems. I mean, how can a computer render a CLI/console along with a GUI? GUIs are completely different from text. And how can we have GUI windows that display text interfaces, i.e. how can we have a CLI in a modern graphical operating system? That's what I'm mainly trying to get a grip on. How do graphics get rendered to the display? Is there some sort of memory address that the GPU accesses which holds all the pixel data, and are there systems within the OS that gather the pixel positions of windows and widgets, along with their Z index, and rasterize them to that memory address, which the GPU then loads to the screen? How about CLIs integrated with graphics? How does the OS tell the GPU that a certain part of the screen wants to display text while the rest wants to display pixel data?
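
    For what it's worth, a terminal window is rasterized just like any other widget: each character is looked up in a font and its pixels are written into the same kind of pixel buffer as everything else. A toy illustration of that idea (plain C, with a made-up 8x8 glyph and a tiny in-memory buffer standing in for a window's backing store; no real OS interface is being shown):

        /* Text in a GUI is just pixels: look up a glyph bitmap and write its set
           bits into a pixel buffer, which the windowing system later composites. */
        #include <stdint.h>
        #include <stdio.h>

        #define W 16
        #define H 8

        static uint32_t framebuffer[W * H];               /* one ARGB pixel per point */

        static void draw_glyph(const uint8_t rows[8], int px, int py, uint32_t color)
        {
            for (int y = 0; y < 8; y++)
                for (int x = 0; x < 8; x++)
                    if (rows[y] & (0x80 >> x))
                        framebuffer[(py + y) * W + (px + x)] = color;
        }

        int main(void)
        {
            const uint8_t glyph[8] = { 0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00 };
            draw_glyph(glyph, 0, 0, 0xFFFFFFFFu);         /* a white "A"-like shape */
            for (int y = 0; y < H; y++) {                 /* dump the buffer as text */
                for (int x = 0; x < W; x++)
                    putchar(framebuffer[y * W + x] ? '#' : '.');
                putchar('\n');
            }
            return 0;
        }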

    Read the article

  • Monitor not detected after booting without monitor attached (12.04)

    - by cawkie
    I had a stable 12.04 machine running perfectly. The machine was booted without the monitor connected - since then the system always boots into low graphics mode. Onboard graphics (from lspci): VGA compatible controller: Intel Corporation 4 Series Chipset Integrated Graphics Controller (rev 03) Monitor: AOC e2450Swh The display widget shows the monitor as a laptop (!?) and system details shows the graphics as Gallium 0.4 on llvmpipe (LLVM 0x300). The X server log appears to show the correct monitor detected. When I boot from a live CD I get full 3D graphics. I've tried the monitor on a different machine - all OK. I've tried a different monitor on this machine - same problem. Between having a working system and a broken one there have been no updates and I have made no configuration changes... EDIT: I have come to the conclusion that the problem is caused by a known issue with LightDM hanging on the battery check. I've managed to get 3D graphics working by switching to GDM - not a solution but an acceptable workaround. I would still like to know what is causing the problem and how I managed to get my system into this state!

    Read the article

  • Graphics glitch when drawing to a Cairo context obtained from a gtk.DrawingArea inside a gtk.Viewport.

    - by user410023
    I am trying to redraw the part of the DrawingArea that is visible in the Viewport in the expose-event handler. However, it seems that I am doing something wrong with the coordinates that are passed to the event handler because there is garbage at the edge of the Viewport when scrolling. Can anyone tell what I am doing wrong? Here is a small example: import pygtk pygtk.require("2.0") import gtk from numpy import array from math import pi class Circle(object): def init(self, position = [0., 0.], radius = 0., edge = (0., 0., 0.), fill = None): self.position = position self.radius = radius self.edge = edge self.fill = fill def draw(self, ctx): rect = array(ctx.clip_extents()) rect[2] -= rect[0] rect[3] -= rect[1] center = rect[2:4] / 2 ctx.arc(center[0], center[1], self.radius, 0., 2. * pi) if self.fill != None: ctx.set_source_rgb(*self.fill) ctx.fill_preserve() ctx.set_source_rgb(*self.edge) ctx.stroke() class Scene(object): class Proxy(object): directory = {} def init(self, target, layers = set()): self.target = target self.layers = layers Scene.Proxy.directory[target] = self def __init__(self, viewport): self.objects = {} self.layers = [set()] self.viewport = viewport self.signals = {} def draw(self, ctx): x = self.viewport.get_hadjustment().value y = self.viewport.get_vadjustment().value ctx.set_source_rgb(1., 1., 1.) ctx.paint() ctx.translate(x, y) for obj in self: obj.draw(ctx) def add(self, item, layer = 0): item = Scene.Proxy(item, layers = set((layer,))) assert(hasattr(item.target, "draw")) assert(isinstance(layer, int)) item.layers.add(layer) while not layer < len(self.layers): self.layers.append(set()) self.layers[layer].add(item) if not item in self.objects: self.objects[item] = set() self.objects[item].add(layer) def remove(self, item, layers = None): item = Scene.Proxy.directory[item] if layers == None: layers = self.objects[item] for layer in layers: layer.remove(item) item.layers.remove(layer) if len(item.layers) == 0: self.objects.remove(item) def __iter__(self): for layer in self.layers: for item in layer: yield item.target class App(object): def init(self): signals = { "canvas_exposed": self.update_canvas, "gtk_main_quit": gtk.main_quit } self.builder = gtk.Builder() self.builder.add_from_file("graphics_glitch.glade") self.window = self.builder.get_object("window") self.viewport = self.builder.get_object("viewport") self.canvas = self.builder.get_object("canvas") self.scene = Scene(self.viewport) signals.update(self.scene.signals) self.builder.connect_signals(signals) self.window.show() def update_canvas(self, widget, event): ctx = self.canvas.window.cairo_create() self.scene.draw(ctx) ctx.clip() if name == "main": app = App() scene = app.scene scene.add(Circle((0., 0.), 10.)) gtk.main() And the Glade file "graphics_glitch.glade": <?xml version="1.0"?> <interface> <requires lib="gtk+" version="2.16"/> <!-- interface-naming-policy project-wide --> <object class="GtkWindow" id="window"> <property name="width_request">200</property> <property name="height_request">200</property> <property name="visible">True</property> <signal name="destroy" handler="gtk_main_quit"/> <child> <object class="GtkScrolledWindow" id="scrolledwindow1"> <property name="visible">True</property> <property name="can_focus">True</property> <property name="hadjustment">h_adjust</property> <property name="vadjustment">v_adjust</property> <property name="hscrollbar_policy">automatic</property> <property name="vscrollbar_policy">automatic</property> <child> <object class="GtkViewport" id="viewport"> 
<property name="visible">True</property> <property name="resize_mode">queue</property> <child> <object class="GtkDrawingArea" id="canvas"> <property name="width_request">640</property> <property name="height_request">480</property> <property name="visible">True</property> <signal name="expose_event" handler="canvas_exposed"/> </object> </child> </object> </child> </object> </child> </object> <object class="GtkAdjustment" id="h_adjust"> <property name="lower">-1000</property> <property name="upper">1000</property> <property name="step_increment">1</property> <property name="page_increment">25</property> <property name="page_size">25</property> </object> <object class="GtkAdjustment" id="v_adjust"> <property name="lower">-1000</property> <property name="upper">1000</property> <property name="step_increment">1</property> <property name="page_increment">25</property> <property name="page_size">25</property> </object> </interface> Thanks! --Dan

    Read the article

  • Python Turtle Graphics, how to plot functions over an interval?

    - by TheDragonAce
    I need to plot a function over a specified interval. The function is f1, which is shown below in the code, and the interval is [-7, -3]; [-1, 1]; [3, 7] with a step of .01. When I execute the program, nothing is drawn. Any ideas? import turtle from math import sqrt wn = turtle.Screen() wn.bgcolor("white") wn.title("Plotting") mypen = turtle.Turtle() mypen.shape("classic") mypen.color("black") mypen.speed(10) while True: try: def f1(x): return 2 * sqrt((-abs(abs(x)-1)) * abs(3 - abs(x))/((abs(x)-1)*(3-abs(x)))) * \ (1 + abs(abs(x)-3)/(abs(x)-3))*sqrt(1-(x/7)**2)+(5+0.97*(abs(x-0.5)+abs(x+0.5))-\ 3*(abs(x-0.75)+abs(x+0.75)))*(1+abs(1-abs(x))/(1-abs(x))) mypen.penup() step=.01 startf11=-7 stopf11=-3 startf12=-1 stopf12=1 startf13=3 stopf13=7 def f11 (startf11,stopf11,step): rc=[] y = f1(startf11) while y<=stopf11: rc.append(startf11) #y+=step mypen.setpos(f1(startf11)*25,y*25) mypen.dot() def f12 (startf12,stopf12,step): rc=[] y = f1(startf12) while y<=stopf12: rc.append(startf12) #y+=step mypen.setpos(f1(startf12)*25, y*25) mypen.dot() def f13 (startf13,stopf13,step): rc=[] y = f1(startf13) while y<=stopf13: rc.append(startf13) #y+=step mypen.setpos(f1(startf13)*25, y*25) mypen.dot() f11(startf11,stopf11,step) f12(startf12,stopf12,step) f13(startf13,stopf13,step) except ZeroDivisionError: continue
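
    A minimal sketch of plotting over those intervals with turtle, assuming the f1 and mypen from the question; plot_interval is an illustrative name. The main differences from the code above are that x (not y) is stepped through each interval, points where f1 is undefined are skipped, and the function value goes on the y axis:

        def plot_interval(pen, f, start, stop, step=0.01, scale=25):
            pen.penup()                 # dots only, no connecting lines
            x = start
            while x <= stop:
                try:
                    pen.setpos(x * scale, f(x) * scale)
                    pen.dot()
                except (ValueError, ZeroDivisionError):
                    pass                # x is outside the domain of f; skip the point
                x += step

        for a, b in [(-7, -3), (-1, 1), (3, 7)]:
            plot_interval(mypen, f1, a, b)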

    Read the article

  • Ubuntu 12.04 Overheating HP Pavillion dm4 3011tx

    - by gevvek
    I have tried installing Ubuntu 12.04 on my HP Pavilion dm4 3011tx, and after a few minutes the fans start to work very fast and my laptop starts to heat up; the CPU temperature got up to 70 degrees and was still rising before I turned the computer off. I installed the graphics drivers for my AMD Radeon graphics and tried switching to the integrated graphics, but that didn't make a difference. I have also tried Fedora and Linux Mint and they do the same thing. Can anyone help?

    Read the article

  • blacklist VGA compatible controller: Intel Corporation 82845G/GL[Brookdale-G]/GE

    - by Thomas Labensi
    I have an HP Pavilion a310n and I have installed an NVIDIA PCI GeForce card. I want to blacklist the onboard VGA compatible controller (the Intel Corporation 82845G/GL[Brookdale-G]/GE Chipset Integrated Graphics Device (rev 03) integrated graphics). What do I need to do? tom@tom-DM167A-ABA-a310n:~$ lspci | grep VGA 00:02.0 VGA compatible controller: Intel Corporation 82845G/GL[Brookdale-G]/GE Chipset Integrated Graphics Device (rev 03) 02:09.0 VGA compatible controller: NVIDIA Corporation NV11 [GeForce2 MX/MX 400] (rev b2) tom@tom-DM167A-ABA-a310n:~$ I'm driving the NVIDIA card via nouveau, and I want to make really sure I'm using the NVIDIA card.

    Read the article

  • Thinkpad T530 with Optimus and Docking Station

    - by Vic Boudolf
    I have a Lenovo Thinkpad T530 with Optimus video, which is not supported on 12.04.1. I don't normally need the discrete (NVIDIA) graphics, so I turn them off in the BIOS settings to achieve longer battery life (and so that the screen dimmer will work), but when the laptop is placed in the docking station, the integrated (Intel) graphics don't power the HDMI ports. (The VGA port does work, but I want to focus on the HDMI.) This means I have to change the BIOS settings constantly. Is there any way to have the system detect the docking station and power up/enable the discrete graphics accordingly? I don't need to do it on the fly, just at startup. This post suggests that Bumblebee can turn the discrete graphics on and off for specific applications, but I just want to turn it on or off. [2] suggests that vga_switcheroo will not work with NVIDIA Optimus.

    Read the article

  • OpenGL ES 2.0: Mixing 2D with 3D

    - by Bunkai.Satori
    Is it possible to mix 2D and 3D graphics in a single OpenGL ES 2.0 game, please? I have plenty of 2D graphics in my game. The 2D graphics are represented by two triangular polygons (making up a rectangle) with a texture on them. I use an orthographic matrix to render the whole scene. However, I need to add some 3D effects into my game. Therefore, I wish to use a perspective camera to render the meshes. Is it possible to mix orthographic and perspective cameras in one scene? If yes, is there going to be a large performance cost for this? Is there any recommended approach to do this effectively? I will have 90% 2D graphics and only 10% 3D. The target platform is OpenGL ES 2.0 (iOS, Android). I use C++ to develop. Thank you.
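
    In ES 2.0 the projection is just a uniform you hand to your shaders, so nothing stops one frame from using both: draw the 3D pass with a perspective matrix and depth testing, then the 2D pass with an orthographic matrix on top. A hedged C++ sketch of the idea (uProjection is assumed to be a uniform location from glGetUniformLocation; perspectiveMatrix, orthoMatrix, drawMeshes and drawSprites are illustrative names, not a real API):

        // 3D pass: perspective projection, depth-tested geometry.
        glEnable(GL_DEPTH_TEST);
        glUniformMatrix4fv(uProjection, 1, GL_FALSE, perspectiveMatrix);
        drawMeshes();

        // 2D pass: orthographic projection drawn on top, with the depth test
        // disabled so the UI is never occluded by the 3D scene.
        glDisable(GL_DEPTH_TEST);
        glUniformMatrix4fv(uProjection, 1, GL_FALSE, orthoMatrix);
        drawSprites();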

    Read the article

  • Ignore carriage returns in scanf before data.... to keep layout of console based graphics with conio

    - by volting
    I have the misfortune of having to use conio.h in VC++ 6 for a college assignment. My problem is that my graphics are set up in the center of the screen... e.g. gotoxy( getcols()/2, getrows()/2); printf("Enter something"); scanf( "%d", &something ); Now if someone accidentally hits enter before they enter the "something", then the cursor gets reset to the left of the screen on the next line. I've tried flushing the keyboard and BIOS buffers with fflush(stdin) and getchar(), which, as I expected, didn't work! Any help/ideas would be appreciated, Thanks, V
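
    One hedged workaround (a sketch only, using getch()/putch() from conio plus the gotoxy()/getcols()/getrows() helpers the assignment already relies on; read_int_at is an illustrative name and it only handles non-negative integers) is to read the number one keystroke at a time without echo, so a stray Enter never reaches the console and never moves the cursor:

        #include <conio.h>
        #include <ctype.h>

        int read_int_at(int x, int y)
        {
            int value = 0, ch;
            gotoxy(x, y);                       /* park the cursor where the prompt is */
            while (!isdigit(ch = getch()))
                ;                               /* swallow Enter and any other stray keys */
            do {
                value = value * 10 + (ch - '0');
                putch(ch);                      /* echo only the characters we accept */
            } while (isdigit(ch = getch()));
            return value;
        }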

    Read the article

  • Fallback Mode on Intel HD 4000 on Ubuntu 12.04.1?

    - by caragh
    Just built a system with an Ivy Bridge CPU (Xeon E3-1245 v2) with Intel HD 4000 onboard graphics; the board is an ASRock H77 ProM. I had loaded Ubuntu Server 12.04.1 onto it, but wanted to fool around with GNOME 3. I installed gnome-shell, which didn't work, then GNOME, which did, but it only loads in fallback mode - the video is recognized as "VESA: sandy/ivy bridge graphics". I tried installing the whole ubuntu-desktop shebang but it's still in fallback graphics. Any way to get the full eye candy?

    Read the article

  • Ubuntu make symbolic link between new folder in Home to existing folder

    - by Fath
    Hello, to the point: I have Ubuntu Maverick running on my Lenovo G450. Before, it was Windows 7. All my data is inside another partition; it's NTFS. The fstab line to mount that partition: /dev/sda5 /data ntfs auto,users,uid=1000,gid=1000,utf8,dmask=027,fmask=137 0 0 Inside /data there are the folders Musics, Graphics, Tools, Cores, etc. If I want to create a new folder, let's say GFX at /home/apronouva/GFX, and make it a link pointing to /data/Graphics, how do I do that? So when I open /home/apronouva/GFX the content will be the same as inside /data/Graphics, and whatever changes I make inside GFX will also affect /data/Graphics. I tried: $ ln -s /data/Graphics /home/apronouva/GFX It resulted in: error, cannot make symbolic link between folder. Thanks in advance, Fath

    Read the article

  • Dual Monitor 'How To' for 12.04

    - by Kim Prince
    I recently built my own PC and was delighted with the result, except for a problem with dual monitors. After having tried a few different combinations of hardware, I think what I really need is a 'how to' explanation. My motherboard is an MSI Z77MA-G45, which has an analogue, a DVI, and a HDMI port. Initially I hooked monitors up to the DVI and analogue port and it seemed to work fine. Both screens worked independently of each other, it was great. After a few days I started turning my PC off at night, and when I tried to turn it back on it would boot into the terminal mode. I would have to turn one of the monitors off and after rebooting a few times, it would eventually boot into an X Window session. Occasionally I would see an error relating to Xorg. I upgraded the motherboard BIOS but that made no difference. Eventually, I installed a graphics card - an NVIDIA GeForce GT 520. Now it seems that my on board graphics have been disabled completely, and I am reliant on the graphics card. Furthermore, the graphics card seems to only recognise one screen at a time. (The first time I rebooted with both plugged in, it flashed up a message saying that it was auto-selecting DVI). Anyhow, I think I need some 'how to' (or perhaps 'where to'), from here. For example, is X Windows configuration the next place to look? And how do I go about configuring X Windows? (Note that in Systems Settings it says my graphics driver is 'unknown', and when I ask it to detect monitors, it sees only the one!)

    Read the article

  • Problems when rendering code on Nvidia GPU

    - by 2am
    I am following the OpenGL GLSL Cookbook 4.0. I have rendered a tessellated quad, as you can see in the screenshot below, and I am moving the Y coordinate of every vertex using a time-based sin function, as given in the code in the book. This program, as you can see from the text in the image, runs perfectly on my processor's built-in Intel HD graphics, but I have NVIDIA GT 555m graphics in my laptop (which, by the way, has switchable graphics), and when I run the program on the graphics card, the OpenGL shader compilation fails. It fails on the following instruction: pos.y = sin.waveAmp * sin(u); giving the error Error C1105: Cannot call a non-function I know this error is coming from the sin(u) call which you see in the instruction. I am not able to understand why. When I removed sin(u) from the code, the program ran fine on the NVIDIA card. It runs fine with sin(u) on Intel HD 3000 graphics. Also, if you notice, the program is almost unusable with Intel HD 3000 graphics; I am getting only 9 FPS, which is not enough - it's too much load for the Intel HD 3000. So, is the sin(x) function not defined in the OpenGL implementation shipped by the NVIDIA drivers, or is it something else?

    Read the article

  • Which nvidia driver from additional drivers option should I choose?

    - by sud_the_devil
    I have an NVIDIA GeForce 7025 / nForce 630a integrated graphics card on my Ubuntu 12.04, and I have 4 options in Additional Drivers:
    1. NVIDIA accelerated graphics driver (version 173)
    2. NVIDIA accelerated graphics driver (post-release updates) (version current updates)
    3. NVIDIA accelerated graphics driver (version current) [Recommended]
    4. NVIDIA accelerated graphics driver (post-release updates) (version 173 updates)
    I am currently using the first option. My problem is that I can't update from version 173 to the latest version of the drivers using the x-swat PPA. Whenever I do it and reboot, the NVIDIA X Server Settings always shows me the same version. Which option would allow me to update my drivers to the current version?

    Read the article

  • ActionScript Gradient Banding Problem

    - by TheDarkIn1978
    i'm having a strange issue with banding between certain colors of a gradient. to create the gradient, i'm drawing evenly spaced circle wedges from the center to the border, and filling each circle wedge from a bitmap line gradient pixel in a loop. public class ColorWheel extends Sprite { private static const DEFAULT_RADIUS:Number = 100; private static const DEFAULT_BANDING_QUALITY:int = 3600; public function ColorWheel(nRadius:Number = DEFAULT_RADIUS) { init(nRadius); } public function init(nRadius:Number = DEFAULT_RADIUS):void { var nRadians : Number; var nColor : Number; var objMatrix : Matrix = new Matrix(); var nX : Number; var nY : Number; var previousX : Number = nRadius; var previousY : Number = 0; var leftToRightColors:Array = new Array(0xFF0000, 0xFFFF00, 0x00FF00, 0x00FFFF, 0x0000FF, 0xFF00FF); leftToRightColors.push(leftToRightColors[0]); var leftToRightAlphas:Array = new Array(); var leftToRightRatios:Array = new Array(); var leftToRightPartition:Number = 255 / (leftToRightColors.length - 1); //Push arrays for (var j:int = 0; j < leftToRightColors.length; j++) { leftToRightAlphas.push(1); leftToRightRatios.push(j * leftToRightPartition); } var leftToRightColorsMatrix:Matrix = new Matrix(); leftToRightColorsMatrix.createGradientBox(DEFAULT_BANDING_QUALITY, 1); //Produce a horizontal leftToRightLine sprite var leftToRightLine:Sprite = new Sprite(); leftToRightLine.graphics.lineStyle(1, 0, 1, false, LineScaleMode.NONE, CapsStyle.NONE); leftToRightLine.graphics.lineGradientStyle(GradientType.LINEAR, leftToRightColors, leftToRightAlphas, leftToRightRatios, leftToRightColorsMatrix); leftToRightLine.graphics.moveTo(0, 0); leftToRightLine.graphics.lineTo(DEFAULT_BANDING_QUALITY, 0); //Assign bitmapData to the leftToRightLine var leftToRightLineBitmapData:BitmapData = new BitmapData(leftToRightLine.width, leftToRightLine.height); leftToRightLineBitmapData.draw(leftToRightLine); for(var i:int = 1; i < (DEFAULT_BANDING_QUALITY + 1); i++) { // Convert the degree to radians. nRadians = i * (Math.PI / (DEFAULT_BANDING_QUALITY / 2)); // OR the individual color channels together. nColor = leftToRightLineBitmapData.getPixel(i-1, 0); // Calculate the coordinate in which the line should be drawn to. nX = nRadius * Math.cos(nRadians); nY = nRadius * Math.sin(nRadians); // Create a matrix for the wedges gradient color. objMatrix.createGradientBox(nRadius * 2, nRadius * 2, nRadians, -nRadius, -nRadius); graphics.beginGradientFill(GradientType.LINEAR, [nColor, nColor], [1, 1], [127, 255], objMatrix); graphics.moveTo( 0, 0 ); graphics.lineTo( previousX, previousY ); graphics.lineTo( nX, nY ); graphics.lineTo( 0, 0 ); graphics.endFill(); previousX = nX; previousY = nY; } } } i'm creating a circle with 3600 wedges, although it doesn't look like it based on the screen shot within the orange color that is produced from gradating from red to yellow numbers. adding a orange number between red and yellow doesn't help. but if i create the circle with only 360 wedges, the gradient banding is much more obvious. 3600 is probably overkill, and doesn't really add more detail over, say, making the circle of 1440 wedges, but i don't know any other way to slightly elevate this banding issue. any ideas how i can fix this, or what i'm doing wrong? could it be caused by the circleMatrix rotation?

    Read the article

  • fast java2d translucency

    - by mdriesen
    I'm trying to draw a bunch of translucent circles on a Swing JComponent. This isn't exactly fast, and I was wondering if there is a way to speed it up. My custom JComponent has the following paintComponent method: public void paintComponent(Graphics g) { Rectangle view = g.getClipBounds(); VolatileImage image = createVolatileImage(view.width, view.height); Graphics2D buffer = image.createGraphics(); // translate to camera location buffer.translate(-cx, -cy); // renderables contains all currently visible objects for(Renderable r : renderables) { r.paint(buffer); } g.drawImage(image.getSnapshot(), view.x, view.y, this); } The paint method of my circles is as follows: public void paint(Graphics2D graphics) { graphics.setPaint(paint); graphics.fillOval(x, y, radius, radius); } The paint is just an rgba color with a < 255: Color(int r, int g, int b, int a) It works fast enough for opaque objects, but is there a simple way to speed this up for translucent ones?
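
    One thing likely costing a lot here, independent of the translucency, is allocating a new VolatileImage and copying it back with getSnapshot() on every repaint. A hedged sketch of the simpler route, drawing the translucent fills straight into the component's own Graphics2D and letting Swing's built-in double buffering do the rest (assumes the cx, cy and renderables fields from the code above):

        @Override
        public void paintComponent(Graphics g) {
            super.paintComponent(g);              // clears the background
            Graphics2D g2 = (Graphics2D) g;
            g2.translate(-cx, -cy);               // camera offset
            for (Renderable r : renderables) {
                r.paint(g2);                      // alpha colors composite directly onto Swing's buffer
            }
        }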

    Read the article

  • Will the MacBook Pro Early 2011 work better with Ubuntu than the Air Late 2010?

    - by AllanCaeg
    I got this late 2010 11" MacBook Air. I'm having issues with NVIDIA graphics, especially with GNOME Shell. I'm thinking about selling this to switch to the new MacBook Pro, particularly the entry level 13" (see specs here), because of the Intel HD Graphics 3000. I assume that it will be more FOSS-friendly. I just want to point out that there are non-negotiable reasons why I'm keeping an Apple hardware at the moment, so let's keep this on topic. Will the MacBook Pro Early 2011 work better with Ubuntu than the Air Late 2010? Any other factors than the graphics hardware? Should we expect better NVIDIA graphics anytime soon?

    Read the article
