Search Results

Search found 5806 results on 233 pages for 'graphics'.

Page 41 of 233

  • Why would video stutter on HDMI but not on DVI?

    - by CorvT
    I've got a system running Ubuntu 12.04 with an i3 2120T CPU/GPU. When I play video through mplayer while hooked up to a screen via HDMI, there is a small stutter (1-2 frames) every few seconds. I don't see this happening when I connect to the same screen via DVI. Resolution and refresh rate are the same for both HDMI and DVI, so I'm not sure where else the problem could be coming from. I've also tried two different screens and different cables. I see the stutter with either an HDMI-HDMI cable, or a DVI-HDMI cable with DVI out of the PC and HDMI into the screen. I don't see the stutter with a DVI-DVI cable, or with an HDMI-DVI cable with HDMI out of the PC and DVI into the screen. I've also tried an AMD 5xxx series card with the open source radeon driver and saw the same problem. I then tried an nVidia GeForce 210 card with the closed source driver, and the stutter went away. To me this smells like a driver/Mesa/GLX issue (since the problem went away with the nVidia card and driver), but I have no idea how to track it down.

    Read the article

  • Shader inputs in a general purpose engine

    - by dreta
    I'm not that familiar with SDKs like Unity or UDK, so I can't check this offhand. Do general purpose engines allow users to create custom uniform variables? The way I see it, and the way I have implemented it in an engine I'm writing to learn 3D, is that there is a "set" of uniforms provided by the engine, and if you want to write a custom shader you use whichever of those uniforms you need to create the effect you want. Now, the thing is, first of all I'm not an artist, and second of all, I haven't had a chance to create complex scenes yet. So my question is: is it common practice to define the variables the engine provides and only allow the user to work with what they're given? Allowing users to add custom programs and use them where they want is not hard, but I have trouble imagining how you'd go about doing the same for uniforms.
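
    One common pattern (a minimal sketch of my own in XNA-style C#, not taken from any particular engine; the class, field, and uniform names are assumptions) is to keep the engine-provided uniforms as fixed fields that the renderer fills for every draw call, and to let a material carry extra user-defined uniforms in a name-to-value map that is applied only if the shader actually declares them:

        using System.Collections.Generic;
        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        public class ShaderParameters
        {
            // Engine-provided uniforms; the renderer sets these for every draw call,
            // and the engine's shader contract guarantees every shader declares them.
            public Matrix World, View, Projection;

            // User-defined uniforms, keyed by the name used in the shader source.
            private readonly Dictionary<string, object> custom = new Dictionary<string, object>();

            public void Set(string name, object value) { custom[name] = value; }

            public void Apply(Effect effect)
            {
                effect.Parameters["World"].SetValue(World);
                effect.Parameters["View"].SetValue(View);
                effect.Parameters["Projection"].SetValue(Projection);

                foreach (KeyValuePair<string, object> kv in custom)
                {
                    EffectParameter p = effect.Parameters[kv.Key];
                    if (p == null) continue;                        // this shader doesn't declare the uniform
                    if (kv.Value is float) p.SetValue((float)kv.Value);
                    else if (kv.Value is Vector3) p.SetValue((Vector3)kv.Value);
                    else if (kv.Value is Texture2D) p.SetValue((Texture2D)kv.Value);
                }
            }
        }

    The engine keeps full control of its built-in set, while users can still invent uniforms for their own shaders without the engine having to know about them in advance.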

    Read the article

  • How do I make an image bigger than the screen slideable in MonoGame for Windows Phone 8?

    - by Moses Aprico
    (I don't know if my title is correct, because when I google it I find no related results.) I am not sure how to explain it correctly, but I am making a plain 2D, tile-based tactics game for Windows Phone 8 using MonoGame, and I want to make my map "slideable". By "slideable" I mean I can draw images that are larger (in total) than my screen and then slide them around so I can view any part of the drawn area. Example: my screen is 1280x720. I have a 1500x1500px map built from 100x100px tiles (15 tiles per row), and each tile is redrawn every time Draw is called. Since the image is larger than the screen, the displayed area is trimmed, leaving a 220px-wide and 780px-tall strip that can never be seen. The only way to see all of it is to "slide" the view around. My question is: how do I make that happen? By default the view cannot be slid and the image remains trimmed. Sorry if my question and explanation are not clear enough; clarify them as much as you like. Thank you.
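
    The usual approach (a hedged sketch in MonoGame/XNA-style C#; the field names, the map size constants, and the standard Game1 skeleton with a spriteBatch field are my assumptions) is to keep drawing every tile at its fixed world position and pass a translation matrix to SpriteBatch.Begin, updating that translation from touch drags:

        // Inside the Game subclass. Uses Microsoft.Xna.Framework,
        // Microsoft.Xna.Framework.Graphics and Microsoft.Xna.Framework.Input.Touch.
        Vector2 cameraPos = Vector2.Zero;                  // top-left of the visible window, in world pixels
        const int MapSize = 1500, ScreenW = 1280, ScreenH = 720;

        protected override void Update(GameTime gameTime)
        {
            TouchCollection touches = TouchPanel.GetState();
            foreach (TouchLocation t in touches)
            {
                TouchLocation prev;
                if (t.State == TouchLocationState.Moved && t.TryGetPreviousLocation(out prev))
                    cameraPos -= t.Position - prev.Position;       // drag the map under the finger
            }
            // Never scroll past the edges of the 1500x1500 map.
            cameraPos.X = MathHelper.Clamp(cameraPos.X, 0, MapSize - ScreenW);
            cameraPos.Y = MathHelper.Clamp(cameraPos.Y, 0, MapSize - ScreenH);
            base.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            GraphicsDevice.Clear(Color.Black);
            Matrix camera = Matrix.CreateTranslation(-cameraPos.X, -cameraPos.Y, 0f);
            spriteBatch.Begin(SpriteSortMode.Deferred, null, null, null, null, null, camera);
            // ... draw every tile at its world position; tiles outside the window are simply clipped ...
            spriteBatch.End();
            base.Draw(gameTime);
        }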

    Read the article

  • About floating-point precision, and why we still use it

    - by system_is_b0rken
    Floating point has always been troublesome for precision in large worlds. This article explains what goes on behind the scenes and offers the obvious alternative: fixed-point numbers. Some facts are really impressive, like: "Well 64 bits of precision gets you to the furthest distance of Pluto from the Sun (7.4 billion km) with sub-micrometer precision." Sub-micrometer precision is more than any FPS needs (for positions and even velocities), and it would let you build really big worlds. My question is: why do we still use floating point if fixed point has such advantages? Most rendering APIs and physics libraries use floating point (and suffer its disadvantages, so developers have to work around them). Is fixed point so much slower? Also, how do you think scalable planetary engines like Outerra or Infinity handle the large scale? Do they use fixed point for positions, or do they have some space-partitioning scheme?
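
    A quick self-contained illustration of the gap the question is about (my own numbers, not from the linked article), in C#: at roughly Pluto's distance a 32-bit float cannot even resolve kilometre-sized steps and a double bottoms out around a millimetre, while a 64-bit integer counting micrometres still resolves every step exactly:

        using System;

        class FixedVsFloat
        {
            static void Main()
            {
                // Roughly Pluto's furthest distance from the Sun, expressed in metres.
                float  posF     = 7.4e12f;
                double posD     = 7.4e12;
                long   posFixed = 7400000000000L * 1000000L;   // same distance in micrometre units

                Console.WriteLine(posF + 1000f  == posF);      // True:  a 1 km step vanishes in float
                Console.WriteLine(posD + 0.0001 == posD);      // True:  a 0.1 mm step vanishes in double
                Console.WriteLine(posFixed + 1  == posFixed);  // False: 1 um steps are still exact
            }
        }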

    Read the article

  • Why doesn't my graphics card support 1280x1024?

    - by Allwar
    Hi, I have an external 20" 1280x1024 monitor. In Windows 7 it works fine at that resolution, but in Ubuntu it doesn't. Example: in Windows I connect it, activate it, done. In Ubuntu I connect it and the only resolutions available are the ones my 12" 1366x768 laptop screen supports. My laptop is an Asus 1201N. If I force it to use 1280x1024, both screens crash and I have to force a reboot. What should I do?
    alvar@alvars-laptop:~$ disper -l
    display DFP-0: HSD121PHW1
    resolutions: 320x175, 320x200, 360x200, 320x240, 400x300, 416x312, 512x384, 640x350, 576x432, 640x400, 680x384, 720x400, 640x480, 720x450, 640x512, 700x525, 800x512, 840x525, 800x600, 960x540, 832x624, 1024x768, 1366x768
    display CRT-0: CRT-0
    resolutions: 320x240, 400x300, 512x384, 680x384, 640x480, 800x600, 1024x768, 1152x864, 1360x768

    Read the article

  • Finding graphical help? [duplicate]

    - by Jim Hurley
    (This question is a duplicate of "Where can I go to find a game graphic artist?") If I am putting together an amateur video game team to design and produce a project, where would I go to find someone to make 3D models? So far I have my own story design and programming, someone doing engine programming, someone doing the sound, and someone possibly doing texture design. However, no one is creating meshes. What would be the best course of action to find help?

    Read the article

  • How can I access bitmaps created in another activity?

    - by user22241
    I am currently loading my game bitmaps when the user presses 'start' in my animated splash screen activity (the first/launch activity), as the app progresses from that activity to the main game activity. This causes choppy animation in the splash screen while it loads/creates the bitmaps for the new activity. I've been told that I should load all my bitmaps in one go at the very beginning, but I can't work out how to do this - could anyone please point me in the right direction? I have two activities, a splash screen and the main game. Each consists of a class that extends Activity and a class that extends SurfaceView (with an inner class for the rendering/logic updating). So, for example, at the moment I am creating my bitmaps in the constructor of my SurfaceView class like so:
    public class OptionsScreen extends SurfaceView implements SurfaceHolder.Callback {
        // Create variables here

        public OptionsScreen(Context context) {
            // Create bitmaps here
        }

        public void initialise() {
            // This method is called from onCreate() of the corresponding activity.
            // Create scaled bitmaps here (from the bitmaps previously created).
        }
    }

    Read the article

  • Pointer problem on external monitor

    - by Herby Pepper
    The pointer looks and works fine on the laptop, but when I connect an external monitor it appears there as a shaking, square shape. Videos are not showing either. My laptop is an HP 2133; graphics card: VIA Technologies, Inc. CN896/VN896/P4M900, Chrome 9 HC; system: Lubuntu 12.04. I think it is a graphics problem, but I can't find drivers for my card and system. I do have xserver-xorg-video-openchrome and disper installed. I did not have this problem with Lubuntu 11.10. My problem is a bit like "Mouse pointer strange problem", but that one was not solved, so I decided to post my own question.

    Read the article

  • Enabling a multi display desktop completely broke Gnome Shell. Help?

    - by Chintan Parikh
    I've been trying to get my dual displays working on Ubuntu for a while. I previously had them set up as one large desktop, but that was incredibly slow for some reason, so I tried switching to a multi-display desktop in the AMD Catalyst Control Center. Here's what I get after restarting and logging in: http://i.imgur.com/SEjgU.png I'm running a quad-core AMD A6 with an AMD Radeon 6540G2 GPU and 16 GB of RAM, on Ubuntu 12.04. Any ideas?

    Read the article

  • How well does the Intel HD 3000 work on Ubuntu?

    - by Simon
    Right now I have a notebook with an Nvidia 8400M GS (I know, it's not a good card) and it's impossible to work normally when I plug in an external monitor (1920x1080). Windows 7 can deal with it without problems (1440x900 on the notebook plus 1920x1080 external). On Ubuntu I have to choose one screen and turn off the second one. Even with only one screen, Ubuntu (Unity or even GNOME 3) sometimes hangs for a while; I've not found a solution for this yet, but never mind, it's probably because of my card and/or Nvidia's drivers. I'm going to buy a new PC, but for now only with integrated Intel HD 3000 graphics, and my question is: should I expect similar problems with this card? I've found a link to Intel's web page about the drivers - "only the community develops them" - and I'm a bit concerned. I'll then use only one monitor (the bigger one), but how well do those drivers work? Are there any performance tests?

    Read the article

  • Nvidia Driver versions?

    - by Patrick Krenz
    I've looked all over and can't find any explanation of why or how Nvidia names their drivers. For example, they have the current 330.xxx/340.xxx series but also a 300.xxx series, and I've found that they aren't always released in numerical order. Here's an example from their site, with version and release date: 331.38 - January 13; 334.16 - February 7; 331.49 - February 18. I'm really confused about which driver to actually go with; a few different series versions seem to work adequately, and I just want to understand the scheme and what the best option would be. I really appreciate any information.

    Read the article

  • Ubuntu desktop and dash problems

    - by user170163
    I am not sure if I have posted this to the right place, so if not, please don't judge me - I am a newbie. I installed Ubuntu 13.04 and I am happy with it. I have two OSes in dual boot, Ubuntu and Windows 7. In Ubuntu I have two problems. The first one is that when I suspend my system I cannot resume it again: it sometimes shows me the login screen (user names and so on - sorry, I don't know the exact name for it) and sometimes just a blank screen. What could the problem be? The second one is this: when I hit Alt+F2 it looks like this (but not always). When I restart the system everything is OK for about 30-40 minutes, and then it looks like this again. Please help; I really like Ubuntu, but these two problems ruin my plans for it. Thanks.

    Read the article

  • Can't Run Assault Cube

    - by Debashis Pradhan
    I installed AssaultCube from the Software Centre and it just opens for half a second and closes. When I run it from the terminal, this is what I get:
    d@d-platform:~$ assaultcube
    Using home directory: /home/d/.assaultcube_v1.104
    current locale: en_IN
    init: sdl
    init: net
    init: world
    init: video: sdl
    init: video: mode
    X Error of failed request: BadValue (integer parameter out of range for operation)
    Major opcode of failed request: 129 (XFree86-VidModeExtension)
    Minor opcode of failed request: 10 (XF86VidModeSwitchToMode)
    Value in failed request: 0xb3
    Serial number of failed request: 131
    Current serial number in output stream: 133

    Read the article

  • Fatal X server error: Failed to submit to batchbuffer

    - by Jan
    Ubuntu 10.04 Lucid Lynx used to run fine on my computer. For a few weeks now, my X server has been crashing out of the blue while the computer is idle and I'm logged into a GNOME session (I'm then greeted with a new GDM login prompt). After the crash, /var/log/gdm/:0.log.1 has the following:
    Fatal server error: Failed to submit batchbuffer: Input/output error
    Please consult the The X.Org Foundation support at http://wiki.x.org for help.
    ~/.xsession-errors.old has symptoms of X clients dying:
    nm-applet: Fatal IO error 11 (Die Ressource ist zur Zeit nicht verfügbar) on X server :0.0.
    dmesg says:
    [191848.390081] [drm:i915_hangcheck_elapsed] ERROR Hangcheck timer elapsed... GPU hung
    [191848.390086] render error detected, EIR: 0x00000010
    [191848.390088] IPEIR: 0x00000000
    [191848.390090] IPEHR: 0x01800002
    [191848.390091] INSTDONE: 0xffffffff
    [191848.390093] INSTPS: 0x8001e020
    [191848.390095] INSTDONE1: 0xbfffffff
    [191848.390097] ACTHD: 0x0a47b014
    [191848.390099] page table error
    [191848.390100] PGTBL_ER: 0x00000002
    [191848.390103] [drm:i915_handle_error] ERROR EIR stuck: 0x00000010, masking
    [191848.390127] [drm:i915_do_wait_request] ERROR i915_do_wait_request returns -5 (awaiting 5617217 at 5617205)
    Is this a known problem that can be traced back to the X server from the Ubuntu repositories? How would I debug this? Edit: there's a relevant bug on Launchpad.

    Read the article

  • How do I plot individual pixels using the XNA APIs?

    - by izb
    If I wanted to fill my game screen with individually coloured pixels, how would I do it? For example, if I wanted to write a 'Game of Life'-type game where each pixel is a cell, how would I achieve this using XNA? I've tried just calling SetData() on a Texture2D object with a screen-sized array of Color values, but it complains with: "You may not call SetData on a resource while it is actively set on the GraphicsDevice. Unset it from the device before calling SetData." How do I do as it asks? Or better still, is there an alternative, more efficient way to fill a screen with arbitrary pixels?
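
    A minimal sketch of the direct answer to that error (plain XNA 4; the canvas/pixels field names and the usual Game1 skeleton with a spriteBatch field are assumptions on my part): unbind the texture from the device before writing to it, then draw the whole array as a single full-screen sprite:

        Color[] pixels;       // one Color per screen pixel
        Texture2D canvas;

        protected override void LoadContent()
        {
            spriteBatch = new SpriteBatch(GraphicsDevice);
            canvas = new Texture2D(GraphicsDevice,
                GraphicsDevice.Viewport.Width, GraphicsDevice.Viewport.Height);
            pixels = new Color[canvas.Width * canvas.Height];
        }

        protected override void Draw(GameTime gameTime)
        {
            // Make sure the texture is not bound to the device before calling SetData.
            GraphicsDevice.Textures[0] = null;

            // ... update the pixels array here (e.g. one Game of Life cell per pixel) ...
            canvas.SetData(pixels);

            spriteBatch.Begin();
            spriteBatch.Draw(canvas, Vector2.Zero, Color.White);
            spriteBatch.End();
            base.Draw(gameTime);
        }

    For a Game of Life-sized grid this is usually fast enough; the per-frame cost is one texture upload rather than thousands of draw calls.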

    Read the article

  • Ubuntu 12.04 - default Radeon driver does not work at all

    - by mumble
    I've recently upgraded to 12.04 LTS and I have an ATI Radeon HD 5670. I've heard that the open source 'radeon' driver is used by default; however, it wasn't showing anything for me. What I did was add the 'nomodeset' boot option and install fglrx, but that didn't work well for me either, as it introduced a lot of problems (freezes/glitches). So I removed/purged fglrx and am planning to use the open source driver instead. My question is this: why is the default radeon driver not working? Is anyone having a similar issue? I've also tried the ubuntu-x-swat drivers by running the following commands:
    sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
    sudo apt-get update
    But the result was the same as with the radeon driver: nothing shows up on system boot. Any ideas? Thanks in advance! Update: running lspci -nn | grep VGA gives me the following:
    02:00.0 VGA compatible controller [0300]: Advanced Micro Devices [AMD] nee ATI Redwood [Radeon HD 5670] [1002:68d8]

    Read the article

  • Understanding IDAT chunk of PNG file format

    - by DRapp
    In the sample image below, the yellow border is for display purposes only. The actual .png file is a simple black/white image, 3 pixels by 3 pixels. I originally thought of trying a 2x2, but that would not help me tell a low/high from a high/low drawing order. At least this way I would see two black and one white from the top, or one white and two black from the bottom. So I read the chunks of data, get to the IDAT chunk, decode it (zlib), and end up with the following bytes: 00 20 00 40 00 80. My question is: how do these bytes break down into the 3x3 black-and-white sample? The image is saved in palette format and is properly recognized as bit depth 1 with a 2-entry colour palette: palette[0] is RGBA all zeros, palette[1] has RGBA of 255, 255, 255, 0. I'll get into the other bit depths later; I just wanted to start with what I expect to be the easiest. Part II: any guidance on handling the other depth formats would help, if there is anything special to consider, especially regarding the alpha channel (which I am already reading from the palette), that might trip me up.
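
    For what it's worth, those six bytes decode exactly as PNG scanlines: each row is one filter byte (0x00 = filter type "None" here) followed by the pixel bits packed most-significant-bit first. A small self-contained C# sketch (my own illustration, not from the question) that walks them:

        using System;

        class IdatRows
        {
            static void Main()
            {
                byte[] raw = { 0x00, 0x20, 0x00, 0x40, 0x00, 0x80 };    // 3 rows of (filter byte + packed pixels)
                int width = 3, height = 3;
                int bytesPerRow = 1 + (width + 7) / 8;                   // filter byte + 1-bit pixels padded to a whole byte

                for (int y = 0; y < height; y++)
                {
                    for (int x = 0; x < width; x++)
                    {
                        byte packed = raw[y * bytesPerRow + 1 + x / 8];  // skip the filter byte
                        int index = (packed >> (7 - (x % 8))) & 1;       // palette index, 0 or 1
                        Console.Write(index);
                    }
                    Console.WriteLine();                                 // prints 001, then 010, then 100
                }
            }
        }

    Row by row that gives palette indices 0,0,1 / 0,1,0 / 1,0,0, which matches the two-black-one-white top row and one-white-two-black bottom row described above. The other palette bit depths (2, 4, 8) work the same way, just taking that many bits per pixel, with the filter byte still first on every row.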

    Read the article

  • Can I use one set of images to represent multiple sprites in Java?

    - by mal
    I've got a game that has three basic sprites, and at the moment I'm loading 8 images into each sprite for animation. Each character class has a sprite object, so if I've got 10 characters on screen at once, that's 80 images loaded into memory. Can I make a central sprite class that holds only the 8 images for each of the 3 sprite types, and have the character objects request the relevant images from that central class, thereby massively reducing the memory required for the images?

    Read the article

  • Retexturing a model via API on the web

    - by AndyMcKenna
    I'm looking at creating a site where a user can view our product, configure its options and look, and see an image that represents the result. The way I'm doing it now is: if you have Piece A selected and then choose Texture X, I have an image on the filesystem that is A with X applied to it, and I just swap out my default image for that specific one. One product has 8 areas, with 10-70 pieces per area and about 200 textures per piece. Programming the site was pretty simple, but we are getting bogged down in rendering all these piece/texture combinations and entering them into the system. What I would love to do is build a model and have some way to apply the textures via an API and render the result to the browser. I would even settle for exporting a flat image and pulling that into the browser. Is that possible with something like SolidWorks, 3ds Max, or something else? If on-demand rendering is too time-intensive, it would still help to batch-create all my images and use them the way I do currently.

    Read the article

  • fglrx installation without success - gl_conf issue

    - by Lucio
    I followed the steps in this guide. I installed the drivers without any problems with sudo dpkg -i fglrx*.deb. The next step is to generate a new /etc/X11/xorg.conf file, but I can't, for the following reason: when I enter sudo aticonfig --initial -f, the terminal shows me this output: sudo: aticonfig: command not found. This problem is caused by an error with the symbolic links in the fglrx directory. Look at this section, where you can see how to fix it - but it doesn't work for me. Why not? Because after I enter sudo update-alternatives --auto gl_conf, the terminal shows me this: update-alternatives: error: no alternatives for gl_conf. What do I have to do to fix this problem? Graphics card: ATI Radeon HD 6670.

    Read the article

  • Problems with the colors on my screen. How can I check if it's a hardware problem?

    - by Ingo Gerth
    On my Eee PC netbook some colours are not displayed properly. Specifically, dark gradients such as the one in the window title bar do not look smooth; instead they look like a sequence of a few distinct colour bands. This is especially visible when opening a menu, for example the "File" menu in Firefox. As you know, that menu is black, and it looks terrible on my screen: the colours look heavily stepped and nothing like a smooth gradient. Now I am wondering: is this a hardware or a software issue, and how can I check? If it turns out to be fixable, I think the fix would be worth a separate question. Note that I am using Natty. If my problem description is not good enough, I can try to take a photo.

    Read the article

  • Techniques for displaying vehicle damage

    - by norca
    I wonder how I can display vehicle damage - that is, a good way to show damage on screen. Which approaches are common in games, and what are their benefits? What is the state of the art? One way I can imagine is to save a set of textures (normal/colour/light maps, etc.) for each state of the car (normal, damaged, burnt out) and switch or blend between them. But does this really look good without changing the model? Another way I was thinking of is preparing animations for different locations on the car - something like damage on the front, on the left/right side, or on the back - and blending in the specific animation. But does that work with good textures? What about physics engines - is it useful to use one for deforming the vertex data? I think losing parts of the car (doors, sirens, weapons) could look really nice. My game is a kind of RTS with a top-down view; vehicles are not the most important units (it's no racing game), but I have quite a lot of them. Thanks for any help.
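
    As a tiny illustration of the first idea (a texture set per damage state), here is a hedged XNA-style C# sketch - the names, thresholds, and the "DiffuseTexture" parameter are my assumptions, not an established technique from any specific engine - that keeps one mesh and simply picks the texture from the vehicle's health before drawing:

        // Uses Microsoft.Xna.Framework.Graphics for Texture2D and Effect.
        enum DamageState { Normal, Damaged, BurntOut }

        Texture2D[] damageTextures;                   // one diffuse texture per DamageState, shared per vehicle type

        DamageState StateFor(float health)            // health in [0, 1]
        {
            if (health > 0.66f) return DamageState.Normal;
            if (health > 0.0f)  return DamageState.Damaged;
            return DamageState.BurntOut;
        }

        void DrawVehicle(Effect effect, float health)
        {
            effect.Parameters["DiffuseTexture"].SetValue(damageTextures[(int)StateFor(health)]);
            // ... set World/View/Projection and draw the unchanged vehicle mesh ...
        }

    For a top-down RTS where vehicles are fairly small on screen, this kind of swap (optionally cross-faded in the shader) is usually much cheaper than deforming vertex data, and detachable parts (doors, sirens, weapons) can be separate meshes that are simply hidden or turned into debris.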

    Read the article

  • OpenGL and switchable graphics cards

    - by Orcun
    I use a laptop which has an AMD Radeon HD 6470M and an onboard graphics card. When I run fglrxinfo, I get this error:
    X Error of failed request: BadRequest (invalid request code or no such operation)
    Major opcode of failed request: 136 (GLX)
    Minor opcode of failed request: 19 (X_GLXQueryServerString)
    Serial number of failed request: 12
    Current serial number in output stream: 12
    Is this the problem? For some reason I can't use OpenGL at all - I can't run any OpenGL applications.

    Read the article

  • Why does my VertexDeclaration apparently not contain Position0?

    - by Phil
    I'm trying to move my code from issuing each draw call individually to using at least a VertexBuffer, and preferably an IndexBuffer, but now that I'm attempting to test it I'm getting the error: "The current vertex declaration does not include all the elements required by the current vertex shader. Position0 is missing." This makes absolutely no sense to me, as my VertexDeclaration is:
    public readonly static VertexDeclaration VertexDeclaration = new VertexDeclaration(
        new VertexElement(0, VertexElementFormat.Vector3, VertexElementUsage.Position, 0),
        new VertexElement(sizeof(float) * 3, VertexElementFormat.Color, VertexElementUsage.Color, 0),
        new VertexElement(sizeof(float) * 3 + 4, VertexElementFormat.Vector3, VertexElementUsage.Normal, 0)
    );
    which clearly contains the position. I am attempting to draw with the following lines:
    VertexBuffer vb = new VertexBuffer(GraphicsDevice, VertexPositionColorNormal.VertexDeclaration,
        c.VertexList.Count, BufferUsage.WriteOnly);
    IndexBuffer ib = new IndexBuffer(GraphicsDevice, typeof(int), c.IndexList.Count, BufferUsage.WriteOnly);
    vb.SetData<VertexPositionColorNormal>(c.VertexList.ToArray());
    ib.SetData<int>(c.IndexList.ToArray());
    GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0, vb.VertexCount, 0, c.IndexList.Count / 3);
    where c is a Chunk class containing an 8x8x8 array of boxes. Full code is available at https://github.com/mrbaggins/Box/tree/ProperMeshing/box/box. The relevant locations are Chunk.cs (contains the VertexDeclaration) and Game1.cs (Draw() is in lines 230-250). Not much else is relevant to this problem. Note that the large commented-out sections are from an old version of the drawing code.
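
    One thing worth noting (hedged - I can only judge from the snippet above, not from the full repository): the buffers are created and filled but never bound to the device, and DrawIndexedPrimitives draws from whatever vertex buffer is currently set, which would explain a declaration mismatch. The usual XNA 4 sequence looks like this (the 'effect' variable stands for whichever BasicEffect or custom Effect is being used):

        GraphicsDevice.SetVertexBuffer(vb);
        GraphicsDevice.Indices = ib;
        foreach (EffectPass pass in effect.CurrentTechnique.Passes)
        {
            pass.Apply();
            GraphicsDevice.DrawIndexedPrimitives(
                PrimitiveType.TriangleList,
                0,                       // baseVertex
                0,                       // minVertexIndex
                vb.VertexCount,          // numVertices
                0,                       // startIndex
                c.IndexList.Count / 3);  // primitiveCount
        }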

    Read the article

  • Rotation matrix for a 3D vector

    - by Shashwat
    I have a direction vector to which I have to apply some rotation so that it aligns with the positive z-axis. To use XNA's Matrix.CreateRotationX(angle), I need the angle, which I'd have to compute with an inverse cosine or tangent; that feels like a roundabout task, especially since the matrix converts the angle right back into sin(angle) and cos(angle). Is there any built-in way to create a rotation matrix from a 3D vector? I can write the function myself, but I'm asking in case one already exists.
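
    As far as I know, plain XNA has no single "from-to" rotation helper, but the axis and angle fall straight out of the cross and dot products, so the inverse trig stays inside one Math.Acos call. A hedged sketch (assumes using System and Microsoft.Xna.Framework; the method name is mine):

        Matrix RotationToPositiveZ(Vector3 direction)
        {
            direction.Normalize();
            Vector3 axis = Vector3.Cross(direction, Vector3.UnitZ);
            float dot = Vector3.Dot(direction, Vector3.UnitZ);

            if (axis.LengthSquared() < 1e-6f)                 // already aligned, or pointing exactly the other way
                return dot > 0f ? Matrix.Identity
                                : Matrix.CreateRotationX(MathHelper.Pi);

            axis.Normalize();
            float angle = (float)Math.Acos(MathHelper.Clamp(dot, -1f, 1f));
            return Matrix.CreateFromAxisAngle(axis, angle);
        }

    Vector3.Transform(direction, RotationToPositiveZ(direction)) should then give (0, 0, 1) up to floating-point error.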

    Read the article
