Search Results

Search found 10050 results on 402 pages for 'graphics card'.

Page 144/402

  • Ubuntu 12.04 nomodeset fixes boot problem but causes screen resolution to get stuck

    - by Thunder
    I've been searching the askubuntu forum for the past 3 days trying to figure out what's going on with my system and I have tried a lot of things but to no avail. So, I will explain my situation and tell you what I have tried and I hope someone can help me :) I have an: HP Workstation xw4100 Pentium(R) 4 3.00 GHz 1.5 GB RAM NVIDIA Quadro4 380 XGL graphics card It came with Windows XP and I set it up (with WUBI) to dual boot with Ubuntu 12.04. After installation I had the problem that so many people had with it booting to a black screen (mine was actually booting to the basic terminal shell), which is fixed by adding nomodeset to the GRUB boot options. When I do that, MY screen resolution becomes stuck at 1280x768 (as opposed to 1366x768 before adding nomodeset) (and also, when running XP the best resolution is 1280x720). When I go to "additional drivers" it doesn't show any proprietary drivers, so I manually downloaded them using these commands: sudo apt-add-repository ppa:ubuntu-x-swat/x-updates sudo apt-get update sudo apt-get install nvidia-current but after rebooting, that made the graphics even worse (now stuck at 800x600). So I tried to configure the drivers with sudo nvidia-xconfig but that simply created an empty xorg.conf file. I found one place where a guy gave information to manually input into the xorg.conf file, but that had no effect at all. Lastly I tried to install previous versions of the NVIDIA drivers, but they wouldn't even fully install. So now I have just re-installed Ubuntu 12.04 and I either need to find a better solution to the first problem (nomodeset) or get the nouveau driver correctly configured to work with my NVIDIA graphics. Thanks for your help ahead of time!

    Read the article

  • USB device not accepting address

    - by Mike Williamson
    I have a series of machines that I am building for work that have USB card readers. When I boot them I get a long series of messages: ... [ 2347.768419] hub 1-6:1.0: unable to enumerate USB device on port 6 [ 2347.968178] usb 1-6.6: new full-speed USB device number 10 using ehci_hcd [ 2352.552020] usb 1-6.6: device not accepting address 10, error -32 [ 2352.568421] hub 1-6:1.0: unable to enumerate USB device on port 6 [ 2352.768179] usb 1-6.6: new full-speed USB device number 12 using ehci_hcd [ 2357.352033] usb 1-6.6: device not accepting address 12, error -32 ... On some older machines this only takes a few attempts before the card reader finally accepts an address, while on newer machines it can take many minutes. Changing hardware is not an option, and plugging the USB card reader into a different port is only an option for the older machines. This was a problem under 11.04 and I am now running the 12.04 beta and it's still happening. Is there something I can do in the software (a udev rule perhaps?) that would fix this? Any advice appreciated. I'm happy to provide more details if you need them.

    Read the article

  • Having troubles with LibNoise.XNA and generating tileable maps

    - by Jon
    Following up on my previous post, I found a wonderful port of LibNoise for XNA. I've been working with it for about 8 hours straight and I'm tearing my hair out - I just can not get maps to tile, I can't figure out how to do this. Here's my attempt: Perlin perlin = new Perlin(1.2, 1.95, 0.56, 12, 2353, QualityMode.Medium); RiggedMultifractal rigged = new RiggedMultifractal(); Add add = new Add(perlin, rigged); // Initialize the noise map int mapSize = 64; this.m_noiseMap = new Noise2D(mapSize, perlin); //this.m_noiseMap.GeneratePlanar(0, 1, -1, 1); // Generate the textures this.m_noiseMap.GeneratePlanar(-1,1,-1,1); this.m_textures[0] = this.m_noiseMap.GetTexture(this.graphics.GraphicsDevice, Gradient.Grayscale); this.m_noiseMap.GeneratePlanar(mapSize, mapSize * 2, mapSize, mapSize * 2); this.m_textures[1] = this.m_noiseMap.GetTexture(this.graphics.GraphicsDevice, Gradient.Grayscale); this.m_noiseMap.GeneratePlanar(-1, 1, -1, 1); this.m_textures[2] = this.m_noiseMap.GetTexture(this.graphics.GraphicsDevice, Gradient.Grayscale); The first and third ones generate fine, they create a perlin noise map - however the middle one, which I wanted to be a continuation of the first (As per my original post), is just a bunch of static. How exactly do I get this to generate maps that connect to each other, by entering in the mapsize * tile, using the same seed, settings, etc.?
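    A minimal sketch of the tiling idea, assuming (as in the code above) that GeneratePlanar(left, right, top, bottom) samples the noise module over that rectangle of noise space: keep the sampling window the same size for every tile and only shift it by one window width per tile, reusing the same seeded module. The tileExtent value below is illustrative and not part of the original code.

        // Fragment: lives wherever the original snippet does (e.g. LoadContent).
        // Three horizontally adjacent tiles share edges because each tile samples a
        // same-sized, adjacent window of the same seeded Perlin module.
        Perlin perlin = new Perlin(1.2, 1.95, 0.56, 12, 2353, QualityMode.Medium);
        int mapSize = 64;            // pixels per tile
        double tileExtent = 2.0;     // noise-space width/height of one tile (illustrative; keep constant)
        Noise2D noiseMap = new Noise2D(mapSize, perlin);
        Texture2D[] tiles = new Texture2D[3];
        for (int tx = 0; tx < 3; tx++)
        {
            double left = tx * tileExtent;            // this tile starts exactly where the previous one ended
            double right = (tx + 1) * tileExtent;
            noiseMap.GeneratePlanar(left, right, 0.0, tileExtent);
            tiles[tx] = noiseMap.GetTexture(graphics.GraphicsDevice, Gradient.Grayscale);
        }

    In the code above, the middle call samples a window 64 units wide while the outer calls sample windows 2 units wide, so the middle texture is at a completely different scale and cannot line up; holding the window size fixed and offsetting it per tile is the part that matters.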

    Read the article

  • How to SEO Optimize Javascript Image Loader?

    - by skibulk
    I am building an image-centric catalog website. It catalogs collectible gaming cards numbering 100,000+ pages. Competitor sites receive millions of hits each month, so with the possibility of excessive traffic, I need to moderate image bandwidth while also optimizing for image SEO. I'm looking for some tips on doing so. Each page on the site features one card with appropriate tags and descriptions. There are, however, four images for each card - one on matte cardstock, one on foil cardstock, one digital, and one digital foil. In a world with unlimited bandwidth and no-wait page loads, I'd simply embed all four images on the main product page with titles, alt tags, and captions to rank them according to their version keyword. In reality a JavaScript gallery image loader seems appropriate. Here is a simplified example of my current code. Would this affect SEO in any way? Should I be doing anything differently? Note that I don't want to create a page for each image as I'd have to duplicate the card tags and descriptions on each one, diluting PR for the main page. Thanks for any insight! <script type="text/javascript"> document.write(' <img src="thumbnail1.jpg" data-src="version1.jpg"> <img src="thumbnail2.jpg" data-src="version2.jpg"> <img src="thumbnail3.jpg" data-src="version3.jpg"> <img src="thumbnail4.jpg" data-src="version4.jpg"> '); </script> <noscript> <img src="version1.jpg"> <img src="version2.jpg"> <img src="version3.jpg"> <img src="version4.jpg"> </noscript>

    Read the article

  • XNA: after mouse click, CPU usage goes to 100%

    - by kosnkov
    Hi, I have the following code, and just clicking on the blue window is enough to send CPU usage to 100% for at least a minute, even with my i7 (4 cores). I checked with an empty project too and it's the same! public class Game1 : Microsoft.Xna.Framework.Game { GraphicsDeviceManager graphics; SpriteBatch spriteBatch; private Texture2D cursorTex; private Vector2 cursorPos; GraphicsDevice device; float xPosition; float yPosition; public Game1() { graphics = new GraphicsDeviceManager(this); Content.RootDirectory = "Content"; } protected override void Initialize() { Viewport vp = GraphicsDevice.Viewport; xPosition = vp.X + (vp.Width / 2); yPosition = vp.Y + (vp.Height / 2); device = graphics.GraphicsDevice; base.Initialize(); } protected override void LoadContent() { spriteBatch = new SpriteBatch(GraphicsDevice); cursorTex = Content.Load<Texture2D>("strzalka"); } protected override void UnloadContent() { // TODO: Unload any non ContentManager content here } protected override void Update(GameTime gameTime) { // Allows the game to exit if (GamePad.GetState(PlayerIndex.One).Buttons.Back == ButtonState.Pressed) this.Exit(); base.Update(gameTime); } protected override void Draw(GameTime gameTime) { GraphicsDevice.Clear(Color.CornflowerBlue); spriteBatch.Begin(); spriteBatch.Draw(cursorTex, cursorPos, Color.White); spriteBatch.End(); base.Draw(gameTime); } }
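    One thing worth ruling out (a speculative check, not a confirmed fix for this machine): make sure the game loop is actually being throttled. With XNA 4.0's fixed time step and vsync both enabled, an otherwise empty game should not pin a core; if either has been turned off, Update and Draw spin as fast as the CPU allows.

        // Hedged sketch: explicit throttling settings in the Game1 constructor.
        public Game1()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";

            IsFixedTimeStep = true;                               // run Update at a fixed rate
            TargetElapsedTime = TimeSpan.FromSeconds(1.0 / 60);   // 60 updates per second
            graphics.SynchronizeWithVerticalRetrace = true;       // wait for vsync when presenting
        }

    If the spike persists with these defaults, checking whether the time is spent in Update, Draw or elsewhere (for example with a profiler) would narrow it down.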

    Read the article

  • ATI Radeon 5800 series dual monitor unity not 3D accelerated

    - by Victor S
    When I had a single monitor setup, without Xinerama, with my current setup of Ubuntu 11.10 and an ATI 5800 series card, Unity showed transparencies, shadows, etc. (although graphics was reported as 'Standard' in the control/settings panel). Having switched to a dual monitor setup, a Dell 24" UltraSharp and a smaller Acer monitor, Unity shows only as 2D, even though I'm not logging in to that display manager. WebGL performance is very sluggish; I'm getting the impression that the processor is doing all the work and the card isn't even accessible, even though the drivers are installed (from the Ubuntu repository, I did not compile custom drivers). Any tips on how to enable full 3D acceleration and video card support? Here is my xorg.conf file: Section "Monitor" Identifier "0-DFP3" Option "VendorName" "ATI Proprietary Driver" Option "ModelName" "Generic Autodetecting Monitor" Option "DPMS" "true" Option "PreferredMode" "1680x1050" Option "TargetRefresh" "60" Option "Position" "0 0" Option "Rotate" "normal" Option "Disable" "false" EndSection Section "Monitor" Identifier "0-DFP4" Option "VendorName" "ATI Proprietary Driver" Option "ModelName" "Generic Autodetecting Monitor" Option "DPMS" "true" Option "PreferredMode" "1920x1200" Option "TargetRefresh" "60" Option "Position" "0 0" Option "Rotate" "normal" Option "Disable" "false" EndSection Section "Screen" Identifier "Default Screen" DefaultDepth 24 SubSection "Display" EndSubSection EndSection Section "Screen" Identifier "amdcccle-Screen[1]-0" Device "amdcccle-Device[1]-0" DefaultDepth 24 SubSection "Display" Viewport 0 0 Depth 24 EndSubSection EndSection Section "Screen" Identifier "amdcccle-Screen[1]-1" Device "amdcccle-Device[1]-1" DefaultDepth 24 SubSection "Display" Viewport 0 0 Depth 24 EndSubSection EndSection Section "Module" Load "glx" EndSection Section "ServerLayout" Identifier "amdcccle Layout" Screen 0 "amdcccle-Screen[1]-0" 0 0 Screen "amdcccle-Screen[1]-1" 1920 0 EndSection Section "Device" Identifier "amdcccle-Device[1]-0" Driver "fglrx" Option "Monitor-DFP4" "0-DFP4" BusID "PCI:1:0:0" EndSection Section "Device" Identifier "amdcccle-Device[1]-1" Driver "fglrx" Option "Monitor-DFP3" "0-DFP3" BusID "PCI:1:0:0" Screen 1 EndSection Section "ServerFlags" Option "Xinerama" "on" EndSection More info: fglrxinfo display: :0 screen: 0 OpenGL vendor string: ATI Technologies Inc. OpenGL renderer string: ATI Radeon HD 5800 Series OpenGL version string: 4.1.11005 Compatibility Profile Context

    Read the article

  • Logitech C510 HD Webcam related question

    - by Ashfame
    I am going to buy a Logitech C510 HD webcam, and I just checked in other questions here on AskUbuntu that it works out of the box with Cheese. My question is: could it be limited in any of the functionality that I would like to use it for? I would like it to be used with everything - Skype, Gtalk video chat, Facebook, YouTube, etc. I would like the ability to record or do a video call at a lower resolution (it's a 720p camera). Also, since I read that I should have a Core2Duo 2.2GHz for 720p but I have a 2.0GHz one, would it be possible for me to record first and then encode afterwards if my processor really starts giving issues with on-the-fly encoding? Anything else that I should consider? I also have an ATI HD 4850 512MB card - can it help with encoding on the fly, or is there a chance that my graphics card alone can handle it and those specs were just for a system without a graphics card? I believe so. Also, I have no worries about dealing with the console if I have to do some of the things above in a terminal. Other possibly significant details: I have a dual screen setup, 29" (1360x768) & 22" (1680x1050), which might be using a good amount of GPU power, and I have 2GB of DDR2 800MHz RAM.

    Read the article

  • First ATMs programming language

    - by revo
    The first ATMs performed tasks like cash dispensing; they were offline machines which worked with punch cards impregnated with carbon and a 6-digit PIN code. The maximum withdrawal with a card was 10 pounds, and each one was a one-time-use card - the ATM swallowed the cards! The first ATM was installed in London in 1967. As I looked at a timeline of programming languages, there were many programming languages created before that decade. I don't know about the hardware either, but which programming language was it written in? *I didn't find a detailed biography of John Shepherd-Barron (the ATM inventor in the '70s). Update I found this picture, which is taken from a newspaper dating back to the year 1972 in Iran. Translated PS: It shows Mr. Rad-lon (if spelled correctly), the manager of the Barros (if spelled correctly) International Educational Institute in the United Kingdom, at the right, and Mr. Jim Sutherland, an expert on computer kiosks. In the rest of the text I found on this paper, these kinds of ATMs, which were called "Automated Computer Kiosks", were advertised with this: Mr. Rad-lon (if spelled correctly) puts his card into one specific location of the Automated Computer Kiosk and after 10 seconds he withdraws his cash. Two more questions are: 1- How were those ATMs so fast? (withdrawal in 10 seconds in that year) 2- I didn't find any text on the Internet which mentions "Automated Computer Kiosk" - is the term valid, or were they simply being called computers at that time?

    Read the article

  • How to make a player stay within bounds of world with 2D Camera

    - by Craig
    Im creating a simple top down survival game. At the moment, i have the sprite which is a ship and moves by rotating left or right then going forward in that direction. I have implemented a 2D camera, its always centered on the player. However, when i move towards the bounds of the world that the sprite is in it just keeps on going :( How to i sort it that it stops at the edge of the world and cant go beyond it? Cheers :) Below is the main game class using System; using System.Collections.Generic; using System.Linq; using Microsoft.Xna.Framework; using Microsoft.Xna.Framework.Audio; using Microsoft.Xna.Framework.Content; using Microsoft.Xna.Framework.GamerServices; using Microsoft.Xna.Framework.Graphics; using Microsoft.Xna.Framework.Input; using Microsoft.Xna.Framework.Media; namespace GamesCoursework_1 { /// <summary> /// This is the main type for your game /// </summary> public class Game1 : Microsoft.Xna.Framework.Game { GraphicsDeviceManager graphics; SpriteBatch spriteBatch; // player variables Texture2D Ship; Vector2 Ship_Position; float Ship_Rotation = 0.0f; Vector2 Ship_Origin; Vector2 Ship_Velocity; const float tangentialVelocity = 4f; float friction = 0.05f; static Point CameraViewport = new Point(800, 800); Camera2d cam = new Camera2d((int)CameraViewport.X, (int)CameraViewport.Y); //Size of world static Point worldSize = new Point(1600, 1600); // Screen variables static Point worldCenter = new Point(worldSize.X / 2, worldSize.Y / 2); Rectangle playerBounds = new Rectangle(CameraViewport.X / 2, CameraViewport.Y / 2, worldSize.X - CameraViewport.X, worldSize.Y - CameraViewport.Y); Rectangle worldBounds = new Rectangle(0, 0, worldSize.X, worldSize.Y); Texture2D background; public Game1() { graphics = new GraphicsDeviceManager(this); graphics.PreferredBackBufferWidth = CameraViewport.X; graphics.PreferredBackBufferHeight = CameraViewport.Y; Content.RootDirectory = "Content"; } /// <summary> /// Allows the game to perform any initialization it needs to before starting to run. /// This is where it can query for any required services and load any non-graphic /// related content. Calling base.Initialize will enumerate through any components /// and initialize them as well. /// </summary> protected override void Initialize() { // TODO: Add your initialization logic here base.Initialize(); } /// <summary> /// LoadContent will be called once per game and is the place to load /// all of your content. /// </summary> protected override void LoadContent() { // Create a new SpriteBatch, which can be used to draw textures. spriteBatch = new SpriteBatch(GraphicsDevice); // TODO: use this.Content to load your game content here Ship = Content.Load<Texture2D>("Ship"); Ship_Origin.X = Ship.Width / 2; Ship_Origin.Y = Ship.Height / 2; background = Content.Load<Texture2D>("aus"); Ship_Position = new Vector2(worldCenter.X, worldCenter.Y); cam.Pos = Ship_Position; cam.Zoom = 1f; } /// <summary> /// UnloadContent will be called once per game and is the place to unload /// all content. /// </summary> protected override void UnloadContent() { // TODO: Unload any non ContentManager content here } /// <summary> /// Allows the game to run logic such as updating the world, /// checking for collisions, gathering input, and playing audio. 
/// </summary> /// <param name="gameTime">Provides a snapshot of timing values.</param> protected override void Update(GameTime gameTime) { // Allows the game to exit if (GamePad.GetState(PlayerIndex.One).Buttons.Back == ButtonState.Pressed) this.Exit(); // TODO: Add your update logic here Ship_Position = Ship_Velocity + Ship_Position; keyPressed(); base.Update(gameTime); } /// <summary> /// This is called when the game should draw itself. /// </summary> /// <param name="gameTime">Provides a snapshot of timing values.</param> protected override void Draw(GameTime gameTime) { GraphicsDevice.Clear(Color.CornflowerBlue); // TODO: Add your drawing code here spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend, null, null, null,null, cam.get_transformation(GraphicsDevice)); spriteBatch.Draw(background, Vector2.Zero, Color.White); spriteBatch.Draw(Ship, Ship_Position, Ship.Bounds, Color.White, Ship_Rotation, Ship_Origin, 1.0f, SpriteEffects.None, 0f); spriteBatch.End(); base.Draw(gameTime); } private void Ship_Move(Vector2 move) { Ship_Position += move; } private void keyPressed() { KeyboardState keyState; // Move right keyState = Keyboard.GetState(); if (keyState.IsKeyDown(Keys.Right)) { Ship_Rotation = Ship_Rotation + 0.1f; } if (keyState.IsKeyDown(Keys.Left)) { Ship_Rotation = Ship_Rotation - 0.1f; } if (keyState.IsKeyDown(Keys.Up)) { Ship_Velocity.X = (float)Math.Cos(Ship_Rotation) * tangentialVelocity; Ship_Velocity.Y = (float)Math.Sin(Ship_Rotation) * tangentialVelocity; if ((int)Ship_Position.Y < playerBounds.Bottom && (int)Ship_Position.Y > playerBounds.Top) cam._pos.Y = Ship_Position.Y; if ((int)Ship_Position.X > playerBounds.Left && (int)Ship_Position.X < playerBounds.Right) cam._pos.X = Ship_Position.X; //tried world bounds here if (!worldBounds.Contains(new Point((int)Ship_Position.X, (int)Ship_Position.Y))) Ship_Position -= new Vector2(0.0f, -tangentialVelocity * 2); if (!worldBounds.Contains(new Point((int)Ship_Position.X, (int)Ship_Position.Y))) Ship_Position -= new Vector2(0.0f, 2 * tangentialVelocity); } else if(Ship_Velocity != Vector2.Zero) { float i = Ship_Velocity.X; float j = Ship_Velocity.Y; Ship_Velocity.X = i -= friction * i; Ship_Velocity.Y = j -= friction * j; if ((int)Ship_Position.Y < playerBounds.Bottom && (int)Ship_Position.Y > playerBounds.Top) cam._pos.Y = Ship_Position.Y; if ((int)Ship_Position.X > playerBounds.Left && (int)Ship_Position.X < playerBounds.Right) cam._pos.X = Ship_Position.X; } if (keyState.IsKeyDown(Keys.Q)) { if (cam.Zoom < 2f) cam.Zoom += 0.05f; } if (keyState.IsKeyDown(Keys.A)) { if (cam.Zoom > 0.3f) cam.Zoom -= 0.05f; } } } } my 2d camera class using System; using System.Collections.Generic; using System.Linq; using System.Text; using Microsoft.Xna.Framework; using Microsoft.Xna.Framework.Graphics; namespace GamesCoursework_1 { public class Camera2d { protected float _zoom; // Camera Zoom public Matrix _transform; // Matrix Transform public Vector2 _pos; // Camera Position protected float _rotation; // Camera Rotation public int _viewportWidth, _viewportHeight; // viewport size public Camera2d(int ViewportWidth, int ViewportHeight) { _zoom = 1.0f; _rotation = 0.0f; _pos = Vector2.Zero; _viewportWidth = ViewportWidth; _viewportHeight = ViewportHeight; } // Sets and gets zoom public float Zoom { get { return _zoom; } set { _zoom = value; if (_zoom < 0.1f) _zoom = 0.1f; } // Negative zoom will flip image } public float Rotation { get { return _rotation; } set { _rotation = value; } } // Auxiliary function to move the camera public 
void Move(Vector2 amount) { _pos += amount; } // Get set position public Vector2 Pos { get { return _pos; } set { _pos = value; } } public Matrix get_transformation(GraphicsDevice graphicsDevice) { _transform = // Thanks to o KB o for this solution Matrix.CreateTranslation(new Vector3(-_pos.X, -_pos.Y, 0)) * Matrix.CreateRotationZ(Rotation) * Matrix.CreateScale(new Vector3(Zoom, Zoom, 1)) * Matrix.CreateTranslation(new Vector3(_viewportWidth * 0.5f, _viewportHeight * 0.5f, 0)); return _transform; } } }
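    A minimal sketch of the usual approach (not the poster's code; the half-sprite margin is illustrative): clamp the ship to the world rectangle every frame, then clamp the camera to the playerBounds rectangle already declared above, so the view stops scrolling near the edges while the ship can still reach them.

        // Uses the fields already declared in the question: Ship, Ship_Position, worldSize, playerBounds, cam.
        private void ClampToWorld()
        {
            float halfW = Ship.Width / 2f;
            float halfH = Ship.Height / 2f;

            // Keep the ship inside the world.
            Ship_Position.X = MathHelper.Clamp(Ship_Position.X, halfW, worldSize.X - halfW);
            Ship_Position.Y = MathHelper.Clamp(Ship_Position.Y, halfH, worldSize.Y - halfH);

            // The camera follows the ship but never shows anything outside the world.
            cam._pos.X = MathHelper.Clamp(Ship_Position.X, playerBounds.Left, playerBounds.Right);
            cam._pos.Y = MathHelper.Clamp(Ship_Position.Y, playerBounds.Top, playerBounds.Bottom);
        }

    Calling ClampToWorld() at the end of Update (after Ship_Position is advanced by Ship_Velocity) would replace the per-key bounds checks currently inside keyPressed.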

    Read the article

  • How do I get 1366x768 resolution on 12.04?

    - by Megan
    I am on an HP Envy 14, and the proper resolution that I should be using is 1366x768. This is not an option and I am stuck on 1024x768. I am using Linux 12.04. lspci | grep VGA: 00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 02) 01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Madison [Radeon HD 5000M Series] I've tried to add the resolution as a mode in xorg.conf but that does not work. Please any help would be appreciated. I'm new to Linux and just got my dual boot working but this resolution issue is killing me. Edit1 I just tried using the xrandr command: xrandr --newmode "1368x768_60.00" 85.25 1368 1440 1576 1784 768 771 781 798 -hsync +vsync But I get an error: xrandr: Failed to get size of gamma for output default Edit2 lsmod returns the following: Module Size Used by vesafb 13844 1 rfcomm 47604 12 bnep 18281 2 parport_pc 32866 0 ppdev 17113 0 snd_hda_codec_hdmi 32474 1 arc4 12529 2 joydev 17693 0 hid_logitech_dj 18594 0 i915 472941 5 uvcvideo 72627 0 usbhid 47199 1 hid_logitech_dj hid 99559 2 hid_logitech_dj,usbhid psmouse 87692 0 iwlwifi 332525 0 mac80211 506816 1 iwlwifi videodev 98259 1 uvcvideo snd_hda_codec_idt 70795 1 mei 41616 0 btusb 18288 2 v4l2_compat_ioctl32 17128 1 videodev hp_accel 25976 0 lis3lv02d 19876 1 hp_accel hp_wmi 18092 0 sparse_keymap 13890 1 hp_wmi input_polldev 13896 1 lis3lv02d drm_kms_helper 46978 1 i915 drm 242038 2 i915,drm_kms_helper i2c_algo_bit 13423 1 i915 serio_raw 13211 0 snd_hda_intel 33773 5 snd_hda_codec 127706 3 snd_hda_codec_hdmi,snd_hda_codec_idt,snd_hda_intel snd_hwdep 13668 1 snd_hda_codec bluetooth 180104 23 rfcomm,bnep,btusb cfg80211 205544 2 iwlwifi,mac80211 snd_pcm 97188 3 snd_hda_codec_hdmi,snd_hda_intel,snd_hda_codec mac_hid 13253 0 snd_seq_midi 13324 0 snd_rawmidi 30748 1 snd_seq_midi snd_seq_midi_event 14899 1 snd_seq_midi snd_seq 61896 2 snd_seq_midi,snd_seq_midi_event fglrx 3263886 0 snd_timer 29990 2 snd_pcm,snd_seq snd_seq_device 14540 3 snd_seq_midi,snd_rawmidi,snd_seq snd 78855 20 snd_hda_codec_hdmi,snd_hda_codec_idt,snd_hda_intel,snd_hda_codec,snd_hwdep,snd_pcm,snd_ra wmidi,snd_seq,snd_timer,snd_seq_device wmi 19256 1 hp_wmi video 19596 1 i915 intel_ips 18174 0 soundcore 15091 1 snd snd_page_alloc 18529 2 snd_hda_intel,snd_pcm lp 17799 0 parport 46562 3 parport_pc,ppdev,lp r8169 62099 0 I have installed ATI/AMD proprietary FGLRX graphics driver. But there is another one called ATI/AMD proprietary FGLRX graphics driver (post-release updates) which I have trouble installing because it gives me an error and tells me to look at some sort of jockey log.

    Read the article

  • How do I draw a dotted or dashed line?

    - by Gagege
    I'm trying to draw a dashed or dotted line by placing individual segments (dashes) along a path and then separating them. The only algorithm I could come up with for this gave me a dash length that was variable based on the angle of the line. Like this: private function createDashedLine(fromX:Float, fromY:Float, toX:Float, toY:Float):Sprite { var line = new Sprite(); var currentX = fromX; var currentY = fromY; var addX = (toX - fromX) * 0.0075; var addY = (toY - fromY) * 0.0075; line.graphics.lineStyle(1, 0xFFFFFF); var count = 0; // while line is not complete while (!lineAtDestination(fromX, fromY, toX, toY, currentX, currentY)) { /// move line draw cursor to beginning of next dash line.graphics.moveTo(currentX, currentY); // if dash is even if (count % 2 == 0) { // draw the dash line.graphics.lineTo(currentX + addX, currentY + addY); } // add next dash's length to current cursor position currentX += addX; currentY += addY; count++; } return line; } This just happens to be written in Haxe, but the solution should be language neutral. What I would like is for the dash length to be the same no matter what angle the line is. As is, it's just adding 75 thousandths of the line length to the x and y, so if the line is at a 45 degree angle you get pretty much a solid line. If the line is at something shallow like 85 degrees then you get a nice looking dashed line. So, the dash length is variable, and I don't want that. How would I make a function that I can pass a "dash length" into and get that length of dash, no matter what the angle is? If you need to completely disregard my code, be my guest. I'm sure there's a better solution.
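    The usual trick is to normalize the direction vector so each step advances a fixed distance regardless of the angle. A language-neutral sketch, written here in C# since the poster says the language doesn't matter; dashLength, gapLength and the drawSegment callback are illustrative names, not part of the original code:

        // Step along the unit direction by a fixed dash length, so dash size no longer depends on angle.
        void DrawDashedLine(float fromX, float fromY, float toX, float toY,
                            float dashLength, float gapLength,
                            Action<float, float, float, float> drawSegment)
        {
            float dx = toX - fromX;
            float dy = toY - fromY;
            float total = (float)Math.Sqrt(dx * dx + dy * dy);
            if (total <= 0f) return;

            float ux = dx / total;   // unit direction components
            float uy = dy / total;

            float traveled = 0f;
            while (traveled < total)
            {
                float segment = Math.Min(dashLength, total - traveled);  // clip the final dash at the end point
                float sx = fromX + ux * traveled;
                float sy = fromY + uy * traveled;
                drawSegment(sx, sy, sx + ux * segment, sy + uy * segment);
                traveled += dashLength + gapLength;                      // jump over the gap to the next dash
            }
        }

    In the Haxe version above, the same change amounts to replacing addX/addY (a fixed fraction of the whole line) with the unit-direction components multiplied by the desired dash length.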

    Read the article

  • My computer will not reboot after fresh install of ubuntu 12.04LTS

    - by user170715
    I bought a new computer yesterday and it came with Windows 8. When installing Ubuntu, I chose the erase-and-install option, thinking that Ubuntu would install easily like it did for my old laptop... After a successful install, and following the instructions telling me to reboot to finish the installation and remove the installation media, it worked and my computer booted fine. However, once I began installing updates via Update Manager and activating the additional driver {ATI/AMD proprietary FGLRX graphics driver (post-release updates)} out of the following: Experimental AMD binary Xorg driver and kernel module ATI/AMD proprietary FGLRX graphics driver (*experimental*beta) ATI/AMD proprietary FGLRX graphics driver (post-release updates) and then rebooting to finish making the changes, I get an error (Reboot and select proper boot device). At this point I was stuck, so I eventually reinstalled Ubuntu and repeated the exact same steps until right before I rebooted to finish making changes. However, this time I used the Boot Repair tool: sudo add-apt-repository ppa:yannubuntu/boot-repair sudo apt-get update sudo apt-get install -y boot-repair boot-repair After running the program I get a "boot successfully repaired" message. Then I try to reboot again and get the GNU GRUB screen where it asks would you like to boot: normal recovery memorytest. Once it begins loading, you see the code moving across the screen, then it pauses when it gets to and doesn't do anything. If someone could tell me how to fix this or get Windows 8 back soon, I'd appreciate it, because like I said I just bought it yesterday and now I can't even use it.

    Read the article

  • XNA 4.0 - Purple/Pink Tint Over All Sprites After Viewing in FullScreen

    - by D. Dubya
    I'm a noob to the game dev world and recently finished the 2D XNA tutorial from http://www.pluralsight.com. Everything was perfect until I decided to try the game in Fullscreen mode. The following code was added to the Game1 constructor. graphics.PreferredBackBufferWidth = 800; graphics.PreferredBackBufferHeight = 480; graphics.IsFullScreen = true; As soon as it launched in Fullscreen, I noticed that the entire game was tinted. None of the colours were appearing as they should. That code was removed, the game then launched in the 800x480 window, however the tint remained. I commented out all my Draw code so that all that was left was GraphicsDevice.Clear(Color.CornflowerBlue); //spriteBatch.Begin(); //gameState.Draw(spriteBatch, false); //spriteBatch.End(); //spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Additive); //gameState.Draw(spriteBatch, true); //spriteBatch.End(); base.Draw(gameTime); The result was an empty window that was tinted Purple, not Blue. I changed the GraphicsDevice.Clear colour to Color.White and the window was tinted Pink. Color.Transparent gave a Black window. Even tried rebooting my PC but the 'tint' still remains. I'm at a loss here.

    Read the article

  • Second Monitor Detected, but not receiving a signal after upgrading to 12.04

    - by user62458
    After I upgraded to 12.04, my second monitor is detected (in display settings), but will not power on. I have scoured the Internet and forums for a solution and I can't find anything. I have found a couple of people with the same problem, but never a solution for it. I am no expert, but I'm certainly not a noob. My computer uses AMD Radeon 6250 graphics, but I do NOT want to use the proprietary graphics drivers. They refuse to work properly with my second monitor (the ATI drivers will only mirror screens, and I've done everything to try to fix it, and I DON'T want mirrored screens). Not to mention that the default open-source video drivers seem to work much better than the proprietary ones anyway! Again, Ubuntu's default video drivers work fine, and they even DETECT the second monitor (Dell 19"). I can drag stuff off the screen and put it on the 'space' of the second monitor, and even a screenshot shows that there are two monitors active; but the monitor is OFF. It will not power on. It goes into 'power-save' mode because it is not receiving a signal. For some reason it is not getting the signal to power on, even though Ubuntu thinks the monitor is working properly. I had this working fine on my Sony VAIO yesterday (with Radeon graphics/default Ubuntu video drivers). I upgraded to a Samsung Series 3 and now I have this issue. I can't for the life of me figure out why the monitor is connected, detected, and I have screen space for the monitor, but the screen won't turn on! XRANDR Output: Screen 0: minimum 320 x 200, current 1366 x 768, maximum 8192 x 8192 VGA-0 connected (normal left inverted right x axis y axis) 1440x900 59.9 + 75.0 1280x1024 75.0 60.0 1152x864 75.0 1024x768 75.1 70.1 60.0 832x624 74.6 800x600 72.2 75.0 60.3 56.2 640x480 72.8 75.0 66.7 60.0 720x400 70.1 LVDS connected 1366x768+0+0 (normal left inverted right x axis y axis) 344mm x 194mm 1366x768 60.1*+ 1280x720 59.9 1152x768 59.8 1024x768 59.9 800x600 59.9 848x480 59.7 720x480 59.7 640x480 59.4 HDMI-0 disconnected (normal left inverted right x axis y axis)

    Read the article

  • AS3 How to check on non transparent pixels in a bitmapdata?

    - by Opoe
    I'm still working on my window cleaning game from one of my previous questions. I marked a contribution as my answer, but after all this time I can't get it to work, and I have too many questions about this, so I decided to ask some more about it. As a sequel to my previous question, my question to you is: How can I check whether or not a BitmapData contains non-transparent pixels? Subquestion: Is this possible when the masked image is a movieclip? Shouldn't I use graphics instead? Information I have: A dirtywindow movieclip on the bottom layer and a clean window movieclip on layer 2 (mc1) on the layer above. To hide the top layer (the dirty window) I assign a mask to it. Code // this creates a mask that hides the movieclip on top var mask_mc:MovieClip = new MovieClip(); addChild(mask_mc) //assign the mask to the movieclip it should 'cover' mc1.mask = mask_mc; With a brush (cursor) the player wipes off the dirt (actually setting the fill from the mask to transparent so the clean window appears) //add event listeners for the 'brush' brush_mc.addEventListener(MouseEvent.MOUSE_DOWN,brushDown); brush_mc.addEventListener(MouseEvent.MOUSE_UP,brushUp); //function to drag the brush over the mask function brushDown(dragging:MouseEvent):void{ dragging.currentTarget.startDrag(); MovieClip(dragging.currentTarget).addEventListener(Event.ENTER_FRAME,erase) ; mask_mc.graphics.moveTo(brush_mc.x,brush_mc.y); } //function to stop dragging the brush over the mask function brushUp(dragging:MouseEvent):void{ dragging.currentTarget.stopDrag(); MovieClip(dragging.currentTarget).removeEventListener(Event.ENTER_FRAME,erase); } //fill the mask with transparent pixels so the movieclip turns visible function erase(e:Event):void{ with(mask_mc.graphics){ beginFill(0x000000); drawRect(brush_mc.x,brush_mc.y,brush_mc.width,brush_mc.height); endFill(); } }

    Read the article

  • Never before had a problem with Ubuntu desktop graphical display; Trying to use nvidia GT630

    - by focaccio
    I've been using Ubuntu since 9.04 and never had a problem with Ubuntu bringing up the desktop graphical user interface. However, I am currently not able to see anything graphical past the install screens. I have an Intel DP55KG motherboard and just installed an NVIDIA GT630 graphics card (Zotac), since the old graphics card failed. I can install the server and see text. So I do an apt-get install ubuntu-desktop... or apt-get install kubuntu-desktop... or apt-get install xubuntu-desktop, but after the reboot there is no display... it's like something is hung up. I tried using the Live Quantal DVD and I do see the graphical prompt to try without installing, but after that the screen goes blank. I've tried two monitors and the same thing happens. There is a faint "glow" on the screen and I do not get a "no input signal" from the monitor, so something is happening. I can install an old OEM copy of XP, so I know the video card and motherboard are at least semi-functional. Any help is appreciated. Thanks, Greg

    Read the article

  • libGDX using Stage and Actor produces different camera angles on desktop and Android Phone

    - by Brandon
    libGDX using Stage and Actor produces different camera angles on desktop and Android Phone. Here are pictures demonstrating the problem: http://brandonyuh.minus.com/mFpdTSgN17VUq On the desktop version, the image takes up most all the screen. On the Android phone it only takes up a bit of the screen. Here's the code (not my actual project but I isolated the problem): package com.me.mygdxgame2; import com.badlogic.gdx.*; import com.badlogic.gdx.graphics.*; import com.badlogic.gdx.graphics.Texture.TextureFilter; import com.badlogic.gdx.graphics.g2d.*; import com.badlogic.gdx.scenes.scene2d.*; public class MyGdxGame2 implements ApplicationListener { private Stage stage; public void create() { stage = new Stage(); stage.addActor(new ActorHi()); } public void render() { Gdx.gl.glClearColor(0, 1, 0, 1); Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT); stage.draw(); } public void dispose() {} public void resize(int width, int height) {} public void pause() {} public void resume() {} public class ActorHi extends Actor { private Sprite sprite; public ActorHi() { Texture texture = new Texture(Gdx.files.internal("data/hi.png")); texture.setFilter(TextureFilter.Linear, TextureFilter.Linear); sprite = new Sprite(new TextureRegion(texture, 0, 0, 128, 128)); sprite.setBounds(0, 0, 300.0f, 300.0f); } public void draw(SpriteBatch batch, float parentAlpha) { sprite.draw(batch); } } } hi.png is included in the above link Thank you very much for answering my question. I've spent 3 days trying to figure it out.

    Read the article

  • Graphical glitches on grub and ubuntu desktop

    - by Klyn
    I've decided to install Ubuntu, but neither Ubuntu nor any other Linux distro will even get to the desktop screen or work after getting there. On Windows 8, everything is just fine: my new video card works perfectly and I have no problem with anything about it. Then when I try to boot Ubuntu with Wubi or from USB, everything goes like this: 1) GRUB screen... no problem at all, colors are just fine, everything looks okay 2) and then the Linux boot screen... weird background color; over the background there are vertical stripes of red-orange dots, but on the Ubuntu logo and text there are no dots at all! -I mean its shape is perfect- 3) the desktop is about to start, but vertical stripes of red colored dots are all over the Unity screen. Then when I click on Ubuntu's menu, it usually switches to a black screen saying something about "panic occurred"... and then it restarts or gives no response at all. Problems started after putting an HD 6570 video card on my Asus M5A78LM-LX motherboard, which has an AMD Phenom II X4 processor on it. I've searched to find something but there was no similar question; that's why I'm almost sure it is kind of unique. Again, I'm writing on Windows 8 right now and everything works and looks perfect. So far I've updated the BIOS - does anyone know anything to solve this?

    Read the article

  • Can I use my prepaid phone balance (in pesos) to buy from the Software Centre?

    - by obetus
    Using our local network's broadband, we can buy games and applications from our load balance. Is there any possible way to use it also in the Ubuntu Software Center? Additional: I'm using mobile broadband for the Internet connection. This broadband has a SIM card and account number onto which you can load money by buying a prepaid card worth 100 pesos, 300 pesos or 500 pesos, provided by our local network. We use this mobile broadband when there is no wifi connection. There are two kinds of mobile broadband: one is a postpaid account and the other is a prepaid account. I use a prepaid account; this kind of account can be loaded with money for transactions like data plans, from 10 pesos for 30 minutes of Internet connection to 200 pesos for 5 days of Internet connection, and this prepaid account can be loaded with 5 pesos up to thousands of pesos. Now, if this prepaid mobile broadband can provide money in pesos and has an Internet connection, I think it can also be used for buying goods or applications or games via the Internet. I think it only needs software that can detect the SIM card number and the money balance for transactions. Sorry for my bad English, but I hope you got my point.

    Read the article

  • Android - big game universe

    - by user1641923
    I am new to Android development, though I have much experience with Java, C++ and PHP programming, and a bit of experience with vector graphics too (basic 3d Studio Max, Flash, etc). I am starting to work on an Android game. It is going to be a 2D space shooter/RPG, and I am not going to use any game engines or any 3rd-party libs. I really want to create a very large game universe, or even a pseudo-infinite one (without visible borders, as if it were a 2D projection of a sphere). It should include 10-12 clusters of 7-8 planets/other space objects, a random amount of single asteroids/comets which the player can interact with, and also a non-interactive background. I am looking for the least complicated approach to create such a universe. My current ideas are: Simply create bitmaps with space scenery background so that they can be seamlessly tiled, and construct my 2D universe out of these tiles, then place interactive objects (planets, other spaceships) on it. Using vector graphics. I would have a solid color background, some random background objects and gradients here and there. My problems here: Lack of knowledge of how well vector graphics is integrated in Android. Performance? Memory usage? Does Android manage big bitmaps well? Do all of the bitmaps have to be in memory during the whole game process? I am interested in technical details regarding each of the ideas and a suggestion as to which I should go with.

    Read the article

  • Can't get Unity 3D to work in 11.10

    - by pmoseph
    I recently upgraded to 11.10 on my Lenovo ThinkPad T520, and I'm not able to load Unity 3D (I'm not selecting 2D at login menu either). me@mycomp:~$ echo $DESKTOP_SESSION ubuntu-2d I ran the unity support test below as well. me@mycomp:~$ /usr/lib/nux/unity_support_test -p Xlib: extension "GLX" missing on display ":0.0". Xlib: extension "GLX" missing on display ":0.0". Xlib: extension "GLX" missing on display ":0.0". Error: unable to create the OpenGL context And it looks like I only have one graphics card: me@mycomp:~$ lspci | grep VGA 00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09) Also, Ubuntu lists nothing under the "Additional Drivers" window. Any help would be extremely appreciated as I'm somewhat of a noob. Thanks! Edit 1: Here is the output of lshw -C display me@mycomp:~$ sudo lshw -C display *-display description: VGA compatible controller product: 2nd Generation Core Processor Family Integrated Graphics Controller vendor: Intel Corporation physical id: 2 bus info: pci@0000:00:02.0 version: 09 width: 64 bits clock: 33MHz capabilities: msi pm vga_controller bus_master cap_list rom configuration: driver=i915 latency=0 resources: irq:43 memory:f0000000-f03fffff memory:e0000000-efffffff ioport:5000(size=64)

    Read the article

  • PyQt application architecture

    - by L. De Leo
    I'm trying to give a sound structure to a PyQt application that implements a card game. So far I have the following classes: Ui_Game: this describes the UI, of course, and is responsible for reacting to the events emitted by my CardWidget instances MainController: this is responsible for managing the whole application: setup and all the subsequent states of the application (like starting a new hand, displaying the notification of state changes on the UI, or ending the game) GameEngine: this is a set of classes that implement the whole game logic Now, the way I concretely coded this in Python is the following: class CardWidget(QtGui.QLabel): def __init__(self, filename, *args, **kwargs): QtGui.QLabel.__init__(self, *args, **kwargs) self.setPixmap(QtGui.QPixmap(':/res/res/' + filename)) def mouseReleaseEvent(self, ev): self.emit(QtCore.SIGNAL('card_clicked'), self) class Ui_Game(QtGui.QWidget): def __init__(self, window, *args, **kwargs): QtGui.QWidget.__init__(self, *args, **kwargs) self.setupUi(window) self.controller = None def place_card(self, card): cards_on_table = self.played_cards.count() + 1 print cards_on_table if cards_on_table <= 2: self.played_cards.addWidget(card) if cards_on_table == 2: self.controller.play_hand() class MainController(object): def __init__(self): self.app = QtGui.QApplication(sys.argv) self.window = QtGui.QMainWindow() self.ui = Ui_Game(self.window) self.ui.controller = self self.game_setup() Is there a better way other than injecting the controller into the Ui_Game class via the Ui_Game.controller attribute? Or am I totally off track?

    Read the article

  • Weird Screen while booting to install, while installing and after the install...and then the "panic occured" error

    - by Klyn
    I've decided to install Ubuntu, but neither Ubuntu nor any other Linux distro will even get to the desktop screen or work after getting there. On Windows 8, everything is just fine: my new video card works perfectly and I have no problem with anything about it. Then when I try to boot Ubuntu with Wubi or from USB, everything goes like this: 1) GRUB screen... no problem at all, colors are just fine, everything looks okay 2) and then the Linux boot screen... weird background color; over the background there are vertical stripes of red-orange dots, but on the Ubuntu logo and text there are no dots at all! -I mean its shape is perfect- 3) the desktop is about to start, but vertical stripes of red colored dots are all over the Unity screen. Then when I click on Ubuntu's menu, it usually switches to a black screen saying something about "panic occurred"... and then it restarts or gives no response at all. Problems started after putting an HD 6570 video card on my Asus M5A78LM-LX motherboard, which has an AMD Phenom II X4 processor on it. I've searched to find something but there was no similar question; that's why I'm almost sure it is kind of unique. Again, I'm writing on Windows 8 right now and everything works and looks perfect. So far I've updated the BIOS - does anyone know anything to solve this?

    Read the article

  • Old Fglrx Driver - AMD Radeon HD 3200 - ubuntu won't start

    - by Yohannes
    I've been using Ubuntu 12.04 64-bit for about 2 weeks now and I installed the latest Fglrx driver (graphics card: AMD HD 3200, PC: Acer Aspire 5336, 4GB RAM, 500GB hard drive). The problem is that sometimes videos lag and play out of sync, and sometimes windows take long to show up after I've clicked them, etc. After looking around I found a video on YouTube by the Ubuntu help guy, and in the video he recommended using an older driver if you have an older graphics card; his was about 4 years old (same as mine) and he used the 11.10 Catalyst driver, so I decided to try it. I removed the previous installation of the driver and then installed the 11.10 driver. However, when I restarted, instead of going to the GUI it goes to a terminal-like window and asks for my login. Now it's pretty clear I need to remove the old driver and go back to using the latest one. The only problem is I'm not sure where I saved the latest driver, and in order to connect to the Internet I need to change /etc/resolv.conf (I use a static IP). So what should I do? Also, from personal experience, what proprietary driver works best with my graphics card? As in, which version? Thanks

    Read the article

  • How do I turn off PCI devices?

    - by ethana2
    With the purchase of an Intel SSD and 85WHr Li-ion battery and the linking of wifi and bluetooth to my laptop's wireless switch, extensive Intel PowerTop usage, switching from compiz to metacity, stopping of the desktop-couch daemon, removal of Ubuntu One and several other services from my startup, disabling of everything possible in my BIOS, and physical removal of my optical drive, I've gotten my battery life up fairly high, but I think there's still more to be done. Specifically, when I'm in class taking notes, I want to temporarily but completely power down: Ethernet Firewire USB ports SD card reader Optical drive Webcam Sound card PCMCIA slot ..without turning them off in my BIOS like they are now, if possible, because then I have to restart my computer to use any of them. As it stands, I still haven't managed to power down: Firewire USB connection to webcam sound card How do I tell Linux to disable and power down these devices? Is it true that any PCI slot can be physically powered down? My current idle power consumption is 7.9 watts plus the screen. (10.0W at min. brightness) Also, how do I set the screen timeout to ten seconds? gconf editor isn't honoring it when I set it to that. Will switching from nVidia to Nouveau save any significant amount of power?

    Read the article
