Search Results

Search found 18741 results on 750 pages for 'screen sharing'.


  • Stencyl or flash limitation?

    - by FlightOfGrey
    I have found that Stencyl doesn't seem to be very good at handling many actors in a game. I had been trying to implement some basic AI; at the moment the only behaviours were wander (pick a random point and move directly to it) and wrap around screen (if an actor goes off the top of the screen it reappears at the bottom). Even with these simple behaviours and calculations, the frame rate dropped dramatically once there were more than 50 actors on screen: 10 actors: 60 fps; 50 actors: 30-50 fps; 75 actors: ~30 fps; 100 actors: 15-20 fps; 200 actors: 8-10 fps. I had planned on having a maximum of around 200 actors, with a more complicated AI system using flocking behaviour (I'm looking at creating a game in a similar vein to flOw), but I see that's no longer an option. I understand that a game is perfectly playable at 30 fps, but this is a super simple test with ultra-simple graphics, no animations and no sounds, child's play in terms of the calculations to be made. Is this a limitation of Stencyl or of Flash? Should I simply scale down the scope of the game, or is there a better game engine that exports to .swf that I should look into?

    Read the article

  • Resolution Independence in libGDX

    - by ashes999
    How do I make my libGDX game resolution/density independent? Is there a way to specify image sizes as "absolute" regardless of the underlying density? I'm making a very simple kids game; just a bunch of sprites displayed on-screen, and some text for menus (options menu primarily). What I want to know is: how do I make my sprites/fonts resolution independent? (I have wrapped them in my own classes to make things easier.) Since it's a simple kids game, I don't need to worry about the "playable area" of the game; I want to use as much of the screen space as possible. What I'm doing right now, which seems super incorrect, is to simply create images suitable for large resolutions, and then scale down (or rarely, up) to fit the screen size. This seems to work okay (in the desktop version), even with linear mapping on my textures, but the smaller resolutions look ugly. Also, this seems to fly in the face of Android's "device independent pixels" (DPs). Or maybe I'm missing something and libGDX already takes care of this somehow? What's the best way to tackle this? I found this link; is this a good way of solving the problem?: http://www.dandeliongamestudio.com/2011/09/12/android-fragmentation-density-independent-pixel-dip/ It mentions how to control the images, but it doesn't mention how to specify font/image sizes regardless of density.
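    For what it's worth, one common approach in libGDX (a hedged sketch, not the only mechanism) is to pick a fixed virtual resolution, express all sprite and font sizes in those virtual units, and let an orthographic camera scale the scene to the physical screen. The class name, asset path, and virtual size below are made up for illustration.

      import com.badlogic.gdx.ApplicationAdapter;
      import com.badlogic.gdx.Gdx;
      import com.badlogic.gdx.graphics.GL20;
      import com.badlogic.gdx.graphics.OrthographicCamera;
      import com.badlogic.gdx.graphics.Texture;
      import com.badlogic.gdx.graphics.g2d.SpriteBatch;

      public class KidsGame extends ApplicationAdapter {
          static final float VIRTUAL_WIDTH = 800f, VIRTUAL_HEIGHT = 480f; // arbitrary design size
          private OrthographicCamera camera;
          private SpriteBatch batch;
          private Texture sprite;

          @Override
          public void create() {
              camera = new OrthographicCamera();
              camera.setToOrtho(false, VIRTUAL_WIDTH, VIRTUAL_HEIGHT); // world is always 800x480 units
              batch = new SpriteBatch();
              sprite = new Texture("sprite.png"); // hypothetical asset
          }

          @Override
          public void render() {
              Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
              camera.update();
              batch.setProjectionMatrix(camera.combined);
              batch.begin();
              // Position and size are given in virtual units, so they are the
              // same on every device; the camera's projection does the scaling.
              batch.draw(sprite, 100, 100, 64, 64);
              batch.end();
          }

          @Override
          public void dispose() {
              batch.dispose();
              sprite.dispose();
          }
      }

    One caveat: a plain setToOrtho camera stretches the image when the device aspect ratio differs from the virtual one, so keeping the aspect fixed (letterboxing) needs extra handling on top of this sketch; the linked article's density buckets can still be useful for picking sharper art on high-DPI screens.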

    Read the article

  • Dumping a Linux console scrollback buffer?

    - by Gerald Combs
    We would like to save the output of a program run on a Linux console which spans many lines. Unfortunately it wasn't logged or run under screen, or in any other way that lets us easily capture the output. The best method we've been able to come up with so far is:
      1. Log into the machine via a separate SSH session.
      2. In the console session, page to the top of the buffer.
      3. Repeat: in the SSH session, run "cat /dev/vcs >> screendump.txt", then in the console session page down one screen.
      4. Dump the final screen in the SSH session.
    Is there a better way? It seems like if the VC memory were contiguous and you knew where it was, you could use dd to pull the console text directly out of kernel memory into a file.

    Read the article

  • Run a startup script with lightdm

    - by cheshirekow
    I have a tablet PC and the graphics driver doesn't support xrandr, so in order to rotate the screen I run a script which changes the Xorg.conf file and then restarts lightdm. I also have a script which uses xsetwacom and xinput to change the rotation of the input devices so that they match the new orientation. I've learned how to get the script to run when I log in, but I'd like it to run before I log in, so that I don't have to enable auto-login with lightdm. I do need it to run, though, or the input (touch and pen) is rotated with respect to the screen, so that when I touch the screen the input lands in a completely different area, making it really difficult to use the onscreen keyboard. I've looked at other questions on this site. I've tried putting my script in /etc/Xsession.d but that didn't seem to work. I also tried putting it in /etc/rc.local, but I think that is the wrong place; nothing seems to happen. I've also tried googling for lightdm script hooks, and various other search terms. Any suggestions? Edit 1: After doing some research, it seems to me that it might not be that I want to run a script with lightdm, but rather with the lightdm greeter (in this case, I think the unity-greeter?). Are there any script hooks for the unity-greeter?
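    One hedged pointer, worth verifying against your lightdm version: lightdm can run a root script after X starts but before the greeter appears, via the display-setup-script key in /etc/lightdm/lightdm.conf, which is the hook usually used for rotation and xsetwacom/xinput setup. The script path below is made up for illustration.

      [SeatDefaults]
      display-setup-script=/usr/local/bin/rotate-tablet-input.sh

    The same file also accepts greeter-setup-script if the commands only need to apply to the greeter session.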

    Read the article

  • Wrong PC-projector resolution in Windows 7

    - by peter.olsson
    I'm connecting a PC projector (BenQ MP721) to a Windows 7 Professional laptop (HP 6730b). All the output settings on the laptop, including the laptop screen, change to 1024x768 (which the projector supports). However, the projector says it receives 1360x768 and asks me to change the resolution to 1024x768. I'm using mirrored display. The laptop screen is 1024x768, the screen resolution in the Control Panel says 1024x768, the Intel graphics utility says 1024x768, and the driver for the projector is a Generic PnP Monitor. Is there anything in Windows 7 that would convert my 4:3 resolution to widescreen automatically?

    Read the article

  • Flash AS3 sidescrolling tiles optimization

    - by Galvanize
    I'm trying to make a sidescrolling game in Flash that will run on a low-performance laptop. While studying the subject from Tonypa's tutorials I saw that he builds a screen-sized Bitmap by copying the BitmapData of each tile from the tile sheet onto it. But when I came to think about how to scroll my map, I ran into some optimization doubts. I came up with two choices: (1) Create a MovieClip and place a Bitmap instance in it for each tile shown on screen, plus one extra row, then move them all; when a tile runs off the screen, move it to the other end of the MovieClip and replace its BitmapData with the next row of my map. (2) Use a single Bitmap with copies of each tile drawn into it (as shown in Tonypa's tutorial) but one extra row; move the whole Bitmap, and when it's time to replace rows, redraw the whole Bitmap and move it back to its origin position. The first idea is what a co-worker of mine suggested, the second one is my own, but neither of us has enough technical knowledge to be sure which technique would perform better. Can anyone help?

    Read the article

  • Mini DisplayPort -> DVI -> VGA ?

    - by ibz
    I have a Mini DisplayPort to DVI adapter which I use to connect my MacBook to a screen that has DVI input. I also have a screen with VGA only input which I want to be able to use, so instead of buying yet another expensive Apple adapter (the Mini DisplayPort to VGA), I just got a cheap DVI to VGA, so I can do Mini DisplayPort - DVI - VGA. It doesn't seem to work though. The screen just says "no connection". Does anyone know if this is actually supposed to work (and my DVI - VGA is just broken), or is this simply not supported and I need to get the expensive Apple Mini DisplayPort to VGA? Thanks!

    Read the article

  • Windows XP to Ubuntu 10.04 via VNC does not refresh

    - by hughdbrown
    I've tried tightvncserver and vnc4server on Ubuntu. I've tried tightvnc viewer and ultravnc viewer on Windows XP. I can connect from Windows to Ubuntu with any combination, but there is no screen refresh: I can drag a window on Ubuntu using my mouse in Windows or type into a terminal in Ubuntu from my keyboard in Windows, but the image does not change on Windows. I can request a screen refresh from Windows but the screen does not update. I am running the ATI driver on Ubuntu. I've tried stepping the System|Preferences|Appearance|Visual Effects down from Extra to Normal with no effect.

    Read the article

  • Causes of hard crashes on Windows XP and how to debug

    - by Sam Brightman
    I am occasionally seeing hard lockups on XP: totally unresponsive to keyboard/mouse, the screen frozen at the moment of the crash, no SSH/VNC possible. Very intermittent, and nothing in the logs. I never see a blue screen or any kind of error message. This morning I logged in via VNC, logged out again, and 20 minutes later physically sat down at the PC to find it had crashed around the time of the VNC logout. I tend to suspect the video card in these kinds of situations, but it's a modern-ish card with modern drivers (one revision back, though this has been happening for 5 revisions or more), and I'd expect to at least see a blue screen. What would you suspect? Where can I look, or what can I set up, for more information? Bear in mind that this happens about once every 3 or 4 weeks, so extensive logging or intrusive monitoring isn't really an option.

    Read the article

  • Architecture advice for converting biz app from old school to new school?

    - by Aaron Anodide
    I've got a WinForms business application that has evolved over the past few years. It's forms-over-data with a number of custom UI experiences tailored to the business, so I don't think it's a candidate to port to something like SharePoint or rewrite in LightSwitch (at least not without significant investment). When I started it in 2009 I was new to this type of development (coming from lower-level programming, and my RDBMS knowledge was only slightly greater than what I got from school). Thus, when I was confronted with a business model that operates on a strict monthly accounting cycle, I made the unfortunate decision to create a separate database for each accounting period. Also, when I started I knew DataSets, then I learned Linq2Sql, then I learned EntityFramework, so the screens are a mix and match of those. Now, after a few years developing this thing by myself, I've finally got a small team. Ultimately, I want a web front end (for remote access to more straightforward screens with grids of data) and a thick client (for the highly customized interfaces). My question is: can you offer me some broad-strokes architecture advice that will help me formulate a battle plan to convert over to a single database and lay the foundations for my future goals at the same time? Here's a screenshot showing how an older screen uses DataSets and a newer screen uses EF (I'm thinking this might make it more real for someone reading the question - I'm willing to add any amount of detail if someone is willing to help).

    Read the article

  • Ubuntu Nvidia Xorg Twinview doesn't like my monitors

    - by Andrew Bolster
    Basically, using the latest available Ubuntu drivers (195.36.15), I cannot for the life of me get my two monitors to operate at suitable resolutions. When not using the drivers at all and going single-screen, both monitors support 1680x1050, but this option is only shown for one monitor in nvidia-settings, and when I manually add a metamode to xorg.conf, it just gives up initialising the second screen. The Xorg log shows:

      (**) Mar 25 15:49:47 NVIDIA(0): TwinView enabled
      (II) Mar 25 15:49:47 NVIDIA(0): Assigned Display Devices: CRT-0, CRT-1
      (II) Mar 25 15:49:47 NVIDIA(0): Validated modes:
      (II) Mar 25 15:49:47 NVIDIA(0):     "1680x1050,1680x1050"
      (II) Mar 25 15:49:47 NVIDIA(0): Virtual screen size determined to be 1680 x 1050

    Any ideas?

    Read the article

  • xorg, nvidia, log-in all hosed - how can I completely reset graphics set-up/settings?

    - by Fred Hamilton
    I just did a fresh install of Mythbuntu 12.04.1 on my Intel MB with nVidia 9500GT graphics card. Hardware's been working great with 10.10 for about 2 years. Background: (optional - feel free to skip to question) I was trying to get my component video output to generate 720p, messing around with the nvidia drivers, and now the entire display system is hosed. I can SSH in and get a terminal. Depending on which nvidia package I install/remove, I get:
      1. Garbage on screen (after I "apt-get remove nvidia*").
      2. A low-res graphical log-in screen where I can log in as fred or guest. If I log in as fred, it displays some text mode status line then goes right back to the log-in screen. If I log in as guest, I actually get the full Ubuntu desktop, but I need to be able to log in as fred.
      3. Other times, an error: "API mismatch: the NVIDIA kernel module has version 304.43, but this NVIDIA driver component has version 295.49."
    I've googled around, including trying this thread with the same error message, but to no effect. Question: How can I just reset X settings, drivers, everything display-related to the exact same way it was after a fresh install?

    Read the article

  • External monitor is blank if I boot with the monitor plugged in

    - by Ronald
    Ubuntu 12.04 appears to have a problem with the Intel GM45 chipset featuring the Mobile Intel® Graphics Media Accelerator (GMA) 4500MHD. I have a Compaq Presario CQ70 laptop with this chipset, and I was using the HDMI video port as a second output to drive either a projector or a second monitor. Everything was working fine under Ubuntu 8.04, 9.04, 10.04 and 11.04; however, when I upgraded to 12.04 the second monitor stopped working. What I mean by "stopped working" is:
      1. Boot with the monitor plugged in: blank screen.
      2. Power off, unplug the monitor and power on: everything works.
      3. Plug in the monitor: the only mode that works is mirror mode, with two monitors that look the same.
      4. Close the laptop lid: the screen goes blank, and the only way back to a usable system is to power off and unplug the monitor.
    If I adjust the monitor to the maximum resolution it will handle and turn off mirror mode, nothing can be moved onto that screen. This all worked fine with earlier versions of Ubuntu. Are there any notes about the changes to the graphics management system in 12.04, like there are for the resolver change?

    Read the article

  • TightVNC (or any VNC) viewer windows scaling

    - by mr.b
    Hi, I am currently using TightVNC to connect to multiple remote hosts on the LAN. I start 16 VNC viewers, set "Scaling by: Auto" (in the connection's display options), and then select all the viewer windows and use Tile Horizontally, which covers my entire screen with VNC viewers. It all works reasonably well, except that desktop interaction gets really slow when there are more than 4 VNC viewers open. My question is: does any VNC client (not just TightVNC, but any compatible client) support some kind of smart scaling option, where the client tells the server something along the lines of: "Okay, I'm displaying your entire screen in a window sized 300x225 px, so can you please start sending images encoded at that resolution?" - at which point the interactivity of the open connections would dramatically increase - and when I decide to go full screen on some connection, the client and server re-negotiate and the server starts sending full-resolution images again? Thanks!

    Read the article

  • Startup/Shutdown time in Xubuntu is increasing!

    - by Ankit
    I am a novice Xubuntu user on a dual-boot machine; the other OS I have is Windows 7. When I first began using Xubuntu, I had really fast startup and shutdown (much, much faster than Windows 7 :) ). However, as I started using it more and more for my work, these times started rising. I do not have any problems with the execution speed of running applications; my main concern is the shutdown time, which has now gone above Windows' shutdown time (the startup time has increased only partially compared to shutdown). I checked some similar questions like this one. However, they do not seem to answer my concern, as the users there experience a long wait before the screen goes blue. In my case the screen goes blue fairly fast (the desktop session ends and a blue screen with a moving slider appears), but it then remains blue for a long time. Another answer that I saw on Google was to use dmesg and then stop some services that I do not want, but being a novice I could not completely understand what that meant.

    Read the article

  • Could not apply the stored configuration for the monitor

    - by dellphi
    I'm using an Nvidia 7300 GT and an Acer V173w monitor, on 64-bit Ubuntu 10.04. Compiz and Emerald run well, but on entering the GUI I always receive the message: "Could not apply the stored configuration for the monitor, could not find a suitable configuration of screens." Why do I keep receiving it, and what is wrong with the monitor configuration or the PCI-E card being used?

      root@dellph1-desktop:/# xrandr
      Screen 0: minimum 320 x 240, current 1440 x 900, maximum 1440 x 900
      default connected 1440x900+0+0 0mm x 0mm
         1440x900   50.0*
         1024x768   51.0  58.0  59.0
         1360x768   52.0  53.0
         1152x864   54.0  55.0  56.0  57.0
         960x600    60.0
         960x540    61.0
         896x672    62.0
         840x525    63.0  64.0  65.0  66.0
         832x624    67.0
         800x600    68.0  69.0  70.0  71.0  72.0  73.0
         800x512    74.0
         720x450    75.0
         680x384    76.0  77.0
         640x512    78.0  79.0
         640x480    80.0  81.0  82.0  83.0
         576x432    84.0  85.0  86.0  87.0
         512x384    88.0  89.0  90.0
         416x312    91.0
         400x300    92.0  93.0  94.0  95.0
         320x240    96.0  97.0  98.0
      root@dellph1-desktop:/#

    xorg.conf:

      # nvidia-settings: X configuration file generated by nvidia-settings
      # nvidia-settings: version 260.19.29 ([email protected]) Wed Dec 8 12:27:27 PST 2010

      Section "ServerLayout"
          Identifier     "Layout0"
          Screen      0  "Screen0" 0 0
          InputDevice    "Keyboard0" "CoreKeyboard"
          InputDevice    "Mouse0" "CorePointer"
          Option         "Xinerama" "0"
      EndSection

      Section "Files"
      EndSection

      Section "InputDevice"
          # generated from default
          Identifier     "Mouse0"
          Driver         "mouse"
          Option         "Protocol" "auto"
          Option         "Device" "/dev/psaux"
          Option         "Emulate3Buttons" "no"
          Option         "ZAxisMapping" "4 5"
      EndSection

      Section "InputDevice"
          # generated from default
          Identifier     "Keyboard0"
          Driver         "kbd"
      EndSection

      Section "Monitor"
          # HorizSync source: xconfig, VertRefresh source: xconfig
          Identifier     "Monitor0"
          VendorName     "Unknown"
          ModelName      "Acer V173W"
          HorizSync       30.0 - 83.0
          VertRefresh     55.0 - 75.0
          Option         "DPMS"
      EndSection

      Section "Device"
          Identifier     "Device0"
          Driver         "nvidia"
          VendorName     "NVIDIA Corporation"
          BoardName      "GeForce 7300 GT"
      EndSection

      Section "Screen"
          # Removed Option "metamodes" " 1440x900_60 +0+0; 1280x1024 +0+0"
          # Removed Option "metamodes" "1440x900 +0+0"
          Identifier     "Screen0"
          Device         "Device0"
          Monitor        "Monitor0"
          DefaultDepth    24
          Option         "TwinView" "0"
          Option         "TwinViewXineramaInfoOrder" "CRT-0"
          Option         "metamodes" "1440x900_75 +0+0; 1440x900 +0+0"
          SubSection     "Display"
              Depth       24
          EndSubSection
      EndSection

    Read the article

  • Macbook Pro Multiple Monitor Problem

    - by thinksocrates
    I have been using a MacBook Pro (newest model) for about 4 weeks with dual monitors, and it has been working great using the Mac adapter to DVI. Today, however, my Mac will not recognize its built-in monitor while the second monitor is plugged in:
      1. Plug in the second monitor: the screen on the laptop goes dark and the second monitor acts as the main screen.
      2. Click "Detect Displays": nothing happens.
      3. Unplug the second monitor: the screen on the laptop comes back on.
    Any thoughts?

    Read the article

  • Ubuntu installation on iMac

    - by Shanew
    I have an iMac configured as follows: 27" screen, 3.4 GHz i7 CPU, AMD Radeon HD 6970 graphics with 1024 MB. I downloaded the Ubuntu 11.10 64-bit ISO and burnt it to both DVD and USB stick as per the instructions on Ubuntu's download page. Neither will boot. The symptoms are as follows. DVD: when the iMac is restarted and booted from the DVD (labelled "Windows", which isn't mentioned in Ubuntu's website instructions), one line is displayed against a black screen showing a message about the developer and date; after 5 minutes the message hangs and the DVD stops spinning. USB stick: strangely, I have to select the EFI boot CD icon which appears after holding down the Alt key; a text menu appears offering to let me try Ubuntu without installing, I select this, and the screen goes blank and stays blank. Any ideas? Lastly, after writing Ubuntu to the DVD and USB stick, neither could be read by OS X, making the instructions on Ubuntu's website to eject them useless. This might help? Thanks, Shane.

    Read the article

  • Gnome 3 - Multiple Video Cards - Xinerama -- Forced Fallback Mode

    - by Alvin
    I just installed a second Nvidia video card -- previously I had GNOME 3 working perfectly with 2 monitors on a single video card using TwinView. I've tried a number of things thus far:
      - TwinView on one card plus Xinerama
      - no Xinerama, no TwinView
      - various manual xorg.conf hacks based on random forums (a couple of references below)
      - Xinerama without TwinView, with and without the Composite extension
    The last one is what I'm using now -- it results in a forced fallback mode, with Composite disabled at the end of xorg.conf via nvidia-settings:

      Section "Extensions"
          Option "Composite" "Disable"
      EndSection

    When I disable that last snippet, it boots to full GNOME 3 with the left monitor on a black screen and the middle monitor as primary but non-responsive. Switching to console mode with Ctrl+Alt+F1 and then switching back, I get 3 black screens with a mouse that can move around but nothing to interact with. The issue seems related to OpenGL and the multiple video cards -- I can boot into Unity without issue, though my GLX-Dock shows up with a black background, as barely shows in the screenshot below, indicating that OpenGL is not initialized. Has anyone had any luck getting Xinerama to work with multiple Nvidia video cards with OpenGL support? I found this in the logs while looking a bit further:

      [ 23.208] (II) NVIDIA(1): Setting mode "nvidia-auto-select+0+0"
      [ 23.254] (WW) NVIDIA(1): The GPU driving screen 1 is incompatible with the rest of the
      [ 23.254] (WW) NVIDIA(1): GPUs composing the desktop. OpenGL rendering will be
      [ 23.254] (WW) NVIDIA(1): disabled on screen 1.
      [ 23.277] (==) NVIDIA(1): Disabling shared memory pixmaps
      [ 23.277] (==) NVIDIA(1): Backing store disabled
      [ 23.277] (==) NVIDIA(1): Silken mouse enabled
      [ 23.277] (==) NVIDIA(1): DPMS enabled

    According to this page in the NVIDIA user docs, http://us.download.nvidia.com/XFree86/Linux-x86/173.14.09/README/chapter-14.html, I may be out of luck =( I'm starting this question with the hope that others may be able to help debug and perhaps gather answers over time, as I really want to get full GNOME 3 back.

    Read the article

  • Hot video card in server

    - by DougN
    Not sure if this belongs here or Superuser (I looked at Superuser -- suspect there are more hardware gurus here). I have a server that sits in a cabinet. It's connected to a small screen that is normally off. However, the video card is running at about 210 F all the time. The rest of the PC is pretty cool (getting temps from SpeedFan). Any thoughts on a way to quiet/calm/cool the video card since it's never really doing anything anyway? I'm usually logged out on the server, and no screen saver defined. Windows is already set to turn off the screen for power saving at 5 minutes.

    Read the article

  • libgdx game not disposing

    - by Yesh
    My game does not exit entirely even after calling the dispose() method. It loads a black screen when I launch it for the second time, but works well if I kill the game manually and restart it. I get an error that says "buffer not allocated with newUnsafeByteBuffer or already disposed" when I try to dispose of the SpriteBatch object. This is where I suspect the problem to be, but I'm not able to fix it entirely. Please help! Here is how I have built it (I have put the sample code here just to show that there are no visible loops in the dispose calls; please correct me if I'm wrong). In the game screen:

      public void dispose() {
          AssetLoader.dispose();
          render.dispose();
          Gdx.app.exit();
      }

    In the AssetLoader class:

      public void dispose() {
          Texture.dispose();
          sound.dispose();
      }

    In the game render class:

      public void dispose() {
          spritebatch.dispose(); // throws an error when GameScreen.dispose is called
          font.dispose();
          shaperender.dispose();
      }

    I believe that my spritebatch isn't being disposed, which is causing the black screen, but I cannot find a way to dispose of it successfully. Any help would be greatly appreciated.
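    For context, one frequent cause of this exact symptom on Android (a hedged observation, not a diagnosis of this particular code): Gdx.app.exit() ends the libGDX lifecycle but does not necessarily kill the process, so any static references to GL resources (a SpriteBatch, or Textures held by an asset class) can survive into the next launch and point at already-freed buffers. A minimal sketch of a lifecycle-safe pattern; the class and field names are made up and are not the asker's code:

      import com.badlogic.gdx.ScreenAdapter;
      import com.badlogic.gdx.graphics.g2d.BitmapFont;
      import com.badlogic.gdx.graphics.g2d.SpriteBatch;

      public class SafeGameScreen extends ScreenAdapter {
          // Instance fields, never static: a fresh launch gets fresh objects.
          private SpriteBatch batch;
          private BitmapFont font;

          @Override
          public void show() {
              // (Re)create GL resources every time the screen becomes active.
              batch = new SpriteBatch();
              font = new BitmapFont();
          }

          @Override
          public void dispose() {
              // Guard so a second dispose() call is a no-op instead of a crash.
              if (batch != null) { batch.dispose(); batch = null; }
              if (font != null)  { font.dispose();  font = null; }
          }
      }

    If an asset holder keeps static Texture fields, the same idea applies: reload them when the game is created and null them out in its dispose().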

    Read the article

  • Using gluLookAt to move camera in 2D iPhone game ?

    - by Mr.Gando
    Hey guys, I'm trying to use gluLookAt to move the camera in my iPhone game, but every time I've tried to use gluLookAt my screen just goes "blank" (grey in this case). I'm trying to render a simple triangle and to move the camera. This is how I set up my scene:

      glViewport(0, 0, backingWidth, backingHeight);
      glMatrixMode(GL_PROJECTION);
      glLoadIdentity();
      glRotatef(-90.0, 0.0, 0.0, 1.0); // using the iPhone in horizontal mode
      glOrthof(-240, 240, -160, 160, -1, 1);
      glMatrixMode(GL_MODELVIEW);

    Then my triangle-rendering code looks like:

      GLfloat triangle[] = {0, 100, 100, 0, -100, 0,};
      glClearColor(0.7, 0.7, 0.7, 1.0);
      glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
      glEnableClientState(GL_VERTEX_ARRAY);
      glColor4f(1.0, 0.0, 0.0, 1.0);
      glVertexPointer(2, GL_FLOAT, 0, &triangle);
      glDrawArrays(GL_TRIANGLES, 0, 6);
      glDisableClientState(GL_VERTEX_ARRAY);

    This draws a red triangle in the middle of the screen. When I try to apply gluLookAt (I got the implementation of the function from Cocos2D, so I assume it's correct), I do:

      glMatrixMode(GL_MODELVIEW);
      glLoadIdentity();
      gluLookAt(0,0,1,0,0,0,0,0,1); // try to move the camera a bit?
      GLfloat triangle[] = {0, 100, 100, 0, -100, 0,};
      glClearColor(0.7, 0.7, 0.7, 1.0);
      glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
      glEnableClientState(GL_VERTEX_ARRAY);
      glColor4f(1.0, 0.0, 0.0, 1.0);
      glVertexPointer(2, GL_FLOAT, 0, &triangle);
      glDrawArrays(GL_TRIANGLES, 0, 6);
      glDisableClientState(GL_VERTEX_ARRAY);

    This leads me to a grey screen (glClearColor is grey). I've tried all sorts of things and read what I've found about gluLookAt on the net, but no luck :( If someone could explain or show me how to move the camera in a top-down fashion (Zelda, etc.), I would really appreciate it. Thanks!

    Read the article

  • linux display drivers

    - by salman
    I've run into a major display problem on a newly installed Fedora 11, on my 6-year-old PC, which has a Pentium 4 2.4 GHz processor, 1 GB DDR RAM, and an Intel 845 motherboard with integrated graphics. When I open an image or play a video, my whole screen turns garbled; I simply cannot make out what's on the screen. With difficulty I have to close the image/video window and move the folder window around to clean up the screen image. Is it because of my display drivers? How can I fix it? I also ran into MP3 plugin and Flash issues, which I was able to resolve. I'm new to Linux; the sole purpose of installing it on my old PC was to learn Linux, but this display problem is frustrating me. Thanks, Salman

    Read the article

  • How can I cull non-visible isometric tiles?

    - by james
    I have a problem which I am struggling to solve. I have a large map of around 100x100 tiles which form an isometric map. The user is able to move the map around by dragging the mouse. I am trying to optimize my game to draw only the visible tiles. So far my code is like this; it appears to be OK in the x direction, but as soon as one tile goes completely above the top of the screen, the entire column disappears. I am not sure how to detect that all of the tiles in a particular column are outside the visible region.

      double maxTilesX = widthOfScreen / halfTileWidth + 4;
      double maxTilesY = heightOfScreen / halfTileHeight + 4;
      int rowStart = Math.max(0, (-xOffset / halfTileWidth));
      int colStart = Math.max(0, (-yOffset / halfTileHeight));
      rowEnd = (int) Math.min(mapSize, rowStart + maxTilesX);
      colEnd = (int) Math.min(mapSize, colStart + maxTilesY);

    EDIT - I think I have solved my problem, but perhaps not in a very efficient way: I have taken the coordinates of the centre of the screen and determined which tile they correspond to by converting them into cartesian format, and I then update the entire box around the screen.
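    For reference, a common way to handle this is to cull in tile space rather than clamping rows and columns independently: convert each corner of the screen back into (row, col) with the inverse isometric transform, take the bounding box, and pad it by a small margin so tiles that only partially overlap the view are still drawn. A hedged Java sketch follows; the parameter names echo the question, but the transform assumes the standard diamond projection (screenX = (col - row) * halfTileW, screenY = (col + row) * halfTileH) rather than the asker's exact layout.

      public final class IsoCuller {
          /**
           * Returns {rowStart, rowEnd, colStart, colEnd} for the tiles that can
           * intersect the screen. All arguments are in pixels except mapSize.
           */
          public static int[] visibleRange(int screenW, int screenH,
                                           int xOffset, int yOffset,
                                           int halfTileW, int halfTileH,
                                           int mapSize) {
              int margin = 2; // extra rows/cols for tiles that only partially overlap
              int minRow = Integer.MAX_VALUE, maxRow = Integer.MIN_VALUE;
              int minCol = Integer.MAX_VALUE, maxCol = Integer.MIN_VALUE;
              int[][] corners = { {0, 0}, {screenW, 0}, {0, screenH}, {screenW, screenH} };
              for (int[] c : corners) {
                  double wx = c[0] - xOffset;   // corner position in map (world) pixels
                  double wy = c[1] - yOffset;
                  // Inverse of: screenX = (col - row) * halfTileW,
                  //             screenY = (col + row) * halfTileH
                  int row = (int) Math.floor((wy / halfTileH - wx / halfTileW) / 2.0);
                  int col = (int) Math.floor((wy / halfTileH + wx / halfTileW) / 2.0);
                  minRow = Math.min(minRow, row);  maxRow = Math.max(maxRow, row);
                  minCol = Math.min(minCol, col);  maxCol = Math.max(maxCol, col);
              }
              return new int[] {
                  Math.max(0, minRow - margin), Math.min(mapSize, maxRow + margin),
                  Math.max(0, minCol - margin), Math.min(mapSize, maxCol + margin)
              };
          }
      }

    The returned range can feed the existing row/column draw loop directly; because it is derived from all four screen corners, a column whose anchor has scrolled above the top edge is still included as long as any of its tiles can reach the view.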

    Read the article
