Search Results

Search found 8165 results on 327 pages for '3d graphics'.

  • Best Graphics card (Setup) for three high-res monitors attached to a desktop

    - by nomrasco
    I have been looking around a little, but most of the discussions are about problems with existing systems (a particular graphics card and setup, etc.). I would like to ask what the best option would be if I want to build a powerful desktop with a triple-monitor setup. I have one Dell UltraSharp U2713HM (27 inches, 2560x1440) and I was thinking about getting two more. Would it be possible to have all three working with Ubuntu (Kubuntu) on any graphics card out there today? What is the best option when it comes to choosing a particular model? Should I use proprietary drivers or open-source ones? I am not a gamer; I mostly develop on my machines and run some computational tasks, but I would rather spend some more money and have a setup where I don't see any lag :) Thank you very much in advance! Darek

    Read the article

  • Level of Detail for 3D terrains/models in Mobile Devices (Android / XNA)

    - by afriza
    I am planning to develop for WP7 and Android. What is the better way to display (and traverse) 3D scenes/models in terms of LoD? The data is planned to be island-wide (Singapore). 1) Real-time dynamic level-of-detail terrain rendering 2) Discrete LoD 3) Others? Please advise on considerations/algorithms/resources/source code; something like the LoD book is also okay. Side note: I am a beginner in this area but pretty well versed in C/C++, and I haven't read the LoD book. Related post: Distant 3D object rendering [games]
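
    For option 2, a discrete-LoD scheme usually stores a few pre-simplified meshes per terrain tile and picks one each frame from the camera distance. A minimal C++ sketch of that selection step follows; the Mesh/TerrainTile types and the threshold values are illustrative assumptions, not taken from the question:

        #include <cmath>
        #include <cstddef>
        #include <vector>

        struct Vec3 { float x, y, z; };
        struct Mesh { /* vertex/index buffers for one detail level */ };

        // One terrain tile with several pre-built detail levels;
        // lods[0] is the most detailed mesh, lods.back() the coarsest.
        struct TerrainTile {
            Vec3 center;
            std::vector<Mesh> lods;
        };

        // Pick a detail level from the camera distance using fixed thresholds.
        // Threshold values are illustrative and would be tuned per device.
        std::size_t pickLod(const TerrainTile& tile, const Vec3& camera)
        {
            const float dx = tile.center.x - camera.x;
            const float dy = tile.center.y - camera.y;
            const float dz = tile.center.z - camera.z;
            const float dist = std::sqrt(dx * dx + dy * dy + dz * dz);

            static const float thresholds[] = { 200.0f, 600.0f, 1500.0f };
            std::size_t lod = 0;
            for (float t : thresholds)
                if (dist > t && lod + 1 < tile.lods.size())
                    ++lod;
            return lod;   // index into tile.lods
        }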

    Read the article

  • Dynamically generate Triangle Lists for a Complex 3D Mesh

    - by Vulcan Eager
    In my application, I have the shape and dimensions of a complex 3D solid (say a cylinder block) taken from user input. I need to construct vertex and index buffers for it. Since the dimensions are taken from user input, I cannot use Blender or 3ds Max to manually create my model. What is the textbook method to dynamically generate such a mesh? Edit: I am looking for something that will generate the triangles given the vertices, edges and holes, something like TetGen. As for TetGen itself, I have no way of excluding the triangles which fall on the interior of the solid/mesh.
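
    As a rough illustration of building buffers directly from user-supplied dimensions (for a simple primitive, not the TetGen-style constrained triangulation the edit asks about), here is a minimal C++ sketch that generates the side wall of a cylinder as an indexed triangle list; the Vertex layout, winding order and omission of end caps are assumptions made for the example:

        #include <cmath>
        #include <cstdint>
        #include <vector>

        struct Vertex { float x, y, z; };

        // Generate vertex and index buffers for the side wall of a cylinder
        // from user-supplied dimensions. End caps are omitted for brevity.
        void buildCylinderSide(float radius, float height, int segments,
                               std::vector<Vertex>& vertices,
                               std::vector<std::uint32_t>& indices)
        {
            const float pi = 3.14159265358979f;
            for (int i = 0; i <= segments; ++i) {
                const float a = 2.0f * pi * float(i) / float(segments);
                const float x = radius * std::cos(a);
                const float z = radius * std::sin(a);
                vertices.push_back({ x, 0.0f,   z });   // bottom ring
                vertices.push_back({ x, height, z });   // top ring
            }
            for (int i = 0; i < segments; ++i) {
                const std::uint32_t b0 = 2 * i,       t0 = 2 * i + 1;
                const std::uint32_t b1 = 2 * (i + 1), t1 = 2 * (i + 1) + 1;
                // Two triangles per segment of the side wall.
                indices.insert(indices.end(), { b0, t0, b1 });
                indices.insert(indices.end(), { t0, t1, b1 });
            }
        }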

    Read the article

  • How to create art assets for a 3d avatar editor

    - by Andrew Garrison
    I am currently prototyping an idea for an iPhone game. I'd like to create an avatar editor inside the game so that the player can create a 3D avatar face and modify certain features (using slider controls), such as nose shape, eye color, mouth size, etc. This has been done in several games, but what I'm looking to do would be fairly cartoonish/caricature-like, similar to the Mii editor on the Nintendo Wii (http://www.myavatareditor.com/). I'd also like the final result to be able to use some canned animations, such as simple speech animations, smiling, frowning, etc. I am not an artist, so I would be unable to create these assets, but what kind of effort is required for an artist to create the 3D models necessary for this type of game? Also, what mechanism would be required to tweak the face's characteristics, and how would the final result be animated? Would you use bones or morph targets, both for the feature sliders and for the facial animation? I've seen several tools that do this sort of thing too, such as FacialStudio. Are there any facial generation tools out there you'd recommend for generating some base content for this game, or should I just hire an artist to do this type of work? Thanks!
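
    If morph targets end up being the mechanism behind the sliders, blending them is essentially a per-vertex weighted sum of offsets from the base mesh. A minimal C++ sketch, with purely illustrative type and parameter names:

        #include <cstddef>
        #include <vector>

        struct Vec3 { float x, y, z; };

        // Blend a base face mesh with several morph targets (e.g. "wide nose",
        // "big eyes"), each weighted by a slider value, typically in [0, 1].
        // All meshes are assumed to share the same vertex count and topology.
        std::vector<Vec3> blendMorphTargets(
            const std::vector<Vec3>& base,
            const std::vector<std::vector<Vec3>>& targets,
            const std::vector<float>& weights)
        {
            std::vector<Vec3> out = base;
            for (std::size_t t = 0; t < targets.size(); ++t) {
                const float w = weights[t];
                for (std::size_t v = 0; v < base.size(); ++v) {
                    out[v].x += w * (targets[t][v].x - base[v].x);
                    out[v].y += w * (targets[t][v].y - base[v].y);
                    out[v].z += w * (targets[t][v].z - base[v].z);
                }
            }
            return out;   // blended vertex positions, same topology as the base
        }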

    Read the article

  • Keystone Correction using 3D-Points of Kinect

    - by philllies
    With XNA, I am displaying a simple rectangle which is projected onto the floor. The projector can be placed at an arbitrary position. Obviously, the projected rectangle gets distorted according to the projector's position and angle. A Kinect scans the floor looking for the four corners. Now my goal is to transform the original rectangle such that the projection is no longer distorted, by basically pre-warping the rectangle. My first approach was to do everything in 2D: first compute a perspective transformation (using OpenCV's warpPerspective()) from the scanned points to the internal rectangle's points and apply the inverse to the rectangle. This seemed to work, but it was too slow, as it couldn't be rendered on the GPU. The second approach was to do everything in 3D in order to use XNA's rendering features. First, I would display a plane, scan its corners with the Kinect and map the received 3D points to the original plane. Theoretically, I could apply the inverse of the perspective transformation to the plane, as I did in the 2D approach. However, since XNA works with a view and projection matrix, I can't just call a function such as warpPerspective() and get the desired result; I would need to compute the new parameters for the camera's view and projection matrix. Question: Is it possible to compute these parameters and split them into two matrices (view and projection)? If not, is there another approach I could use?
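
    For reference, the 2D pre-warp itself only needs to be recomputed when the projector or Kinect moves, not per frame. A minimal OpenCV sketch of that corner-level step follows; the corner ordering is an assumption, and depending on how the correspondences are set up, the homography may need to be applied in the opposite direction:

        #include <opencv2/imgproc.hpp>
        #include <vector>

        // Pre-warp the ideal rectangle so that, after the physical projection
        // distortion, it lands on the floor as an undistorted rectangle.
        std::vector<cv::Point2f> prewarpRectangle(
            const std::vector<cv::Point2f>& idealCorners,    // 4 corners of the rectangle
            const std::vector<cv::Point2f>& scannedCorners)  // 4 corners seen by the Kinect
        {
            // Homography taking the scanned corners back to the ideal rectangle.
            cv::Mat H = cv::getPerspectiveTransform(scannedCorners, idealCorners);

            // Applying H to the ideal corners gives the pre-warped quad to draw.
            // If the correspondences are defined the other way round, use H.inv().
            std::vector<cv::Point2f> prewarped;
            cv::perspectiveTransform(idealCorners, prewarped, H);
            return prewarped;
        }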

    Read the article

  • Strange 3D game engine camera with X,Y,Zoom instead of X,Y,Z

    - by Jenko
    I'm using a 3D game engine that uses a 4x4 matrix to modify the camera projection, in this format:

        r r r x
        r r r y
        r r r z
        - - - zoom

    Strangely though, the camera does not respond to the Z translation parameter, and so you're forced to use X, Y, Zoom to move the camera around. Technically this is plausible for isometric-style games such as Age of Empires III. But this is a 3D engine, so why would they have designed the camera to ignore Z and respond only to zoom? Am I missing something here? I've tried every method of setting the camera and it really seems to ignore Z, so currently I have to resort to moving the main object in the scene graph instead of moving the camera in relation to the objects. My question: do you have any idea why the engine would use such a scheme? Is it common? Why? Or does it seem like I'm missing something and the SetProjection(Matrix) function is broken and somehow ignores the Z translation in the matrix? (Unlikely, but possible.) Anyhow, what are the workarounds? Is moving objects around the only way? Edit: I'm sorry, I cannot reveal much about the engine because we're in a binding contract. It's a locally developed engine (Australia), written in managed C# and used for data visualizations. Edit: The default mode of the engine is orthographic, although I've switched it into perspective mode. It's probably more effective to use X, Y, Zoom in orthographic mode, but I need to use perspective mode to render everyday objects as well.

    Read the article

  • Unity 3D does not work on Dell system with an AMD Radeon HD 6470M

    - by VeeKay
    I am running 64-bit Ubuntu on a Dell with a 1GB graphics card. I log in with "Ubuntu" hoping to see Unity 3D, but it doesn't appear; Unity 2D runs instead. When I type echo "$DESKTOP_SESSION" it confirms Unity-2D. I've checked the System Info: the graphics row shows itself as empty. So I've presumed that the graphics drivers aren't detected, and hence I went to Additional Drivers and installed the fglrx driver that the UI suggested. Even after installing it, the graphics entry in the System Info details still shows nothing and Unity 2D still runs in spite of all the effort. Please help! How can I get my Unity 3D back?

    Hardware info:

        Video Card: AMD Radeon™ HD 6470M - 1GB (For ICC)
        RAM: 6GB (1 x 2GB + 1 x 4GB) 2 DIMM DDR3 1333MHz
        OS: 64-bit Ubuntu 11.10

    Edit: output of /usr/lib/nux/unity_support_test -p

        X Error of failed request: BadRequest (invalid request code or no such operation)
        Major opcode of failed request: 155 (GLX)
        Minor opcode of failed request: 19 (X_GLXQueryServerString)
        Serial number of failed request: 21
        Current serial number in output stream: 21

    Read the article

  • Boot Ubuntu 12.04 in 3D - nomodeset quiet splash install

    - by rahi
    I would like to enable 3D in Ubuntu 12.04. I recently tried to install Ubuntu on a new computer. When the installation was complete and I rebooted the machine, I could only see a blank screen. After some searching, I followed a tutorial which instructed me to boot with "nomodeset" enabled; I chose this option on the USB stick I was installing Ubuntu 12.04 from. Fortunately, the Ubuntu installation on the new computer was successful. When I tried to change the size of the Unity launcher icons, I did not see that option (as I do on my other computer running Ubuntu 12.04). I tried installing MyUnity and it told me that the computer I had just installed 12.04 on was running in 2D. To my knowledge, all the software is up to date (as I ran the Software Updater). In addition, when I look for Additional Drivers, I see a message that says "No proprietary drivers are in use on this system". When I look under System Details > Graphics, I see the driver as "VESA: Intel Sandybridge/Ivybridge Graphics". When I hold down the Shift key while my machine boots up and type "e" at the GRUB menu, I see the following towards the end: "nomodeset quiet splash $vt_handoff". Does this have anything to do with the plain 2D Ubuntu 12.04 experience? Again, what I'd like to do now is get the 3D experience on my new machine running 12.04. Please let me know if you need more information.

    Read the article

  • 3D space game development

    - by user1693061
    I want to develop a 3D game (sci-fi, with spaceships) which can be played in multiplayer mode, and by multiplayer I mean around 10 players for a start, as it will be a personal testing project and mostly educational. I have been searching for some days about the available languages and engines, but I am kinda confused. Since I have been learning Java in my first year at an IT university and have a pretty good understanding of it, I thought I would go with Java and develop the game as an applet so it could be played in a browser. After going through an applet game tutorial I understood how graphics work in an applet. So... 1st question: Could an applet carry the burden of a 3D game, especially in multiplayer? My thinking: it's a space game, so the graphics should not be such a big problem, since it won't be that crowded with entities apart from ships, planets and some effects. If a Java applet is not the way to go for my project, I wouldn't mind developing it on the desktop (I mean not making it a browser game). 2nd question: Should I use the Unity engine for my purpose (a space game)? If not, name another language/engine combo.

    Read the article

  • 3D picking under reticle

    - by Wolftousen
    I'm currently trying to work out some 3D picking code that I started years ago but then lost interest in once the assignment was completed (this part wasn't actually part of the assignment). I am not using the mouse coords for picking; I'm just using the position in 3D space and a ray directly out from there. A small hitch, though, is that I want to use a cone and not a ray. Here are the variables I'm using:

        float iReticleSlope = 95/3000; // inverse reticle slope
        float baseReticle = 1;         // radius of the reticle at z = 0
        float maxRange = 3000;         // max range to target
        Quaternion orientation;        // the camera's orientation
        Vector3d position;             // the camera's position

    Then I loop through each object in the world:

        Vector3d transformed; // object position after transformations
        float d, r;           // holder variables
        for (i = 0; i < objects.length; i++) {
            transformed = position - objects[i].position; // position relative to the camera
            orientation.multiply(transformed);            // orient the object relative to the camera
            if (transformed.z < 0) {
                d = sqrt(transformed[0] * transformed[0] + transformed[1] * transformed[1]);
                r = -transformed[2] * iReticleSlope + objects[i].radius;
                if (d < r && -transformed[2] - objects[i].radius <= maxRange) {
                    // the object is under the reticle
                } else {
                    // the object is not under the reticle
                }
            } else {
                // the object is not under the reticle
            }
        }

    Now this all works fine and dandy until the window ratio doesn't match the resolution ratio. Is there any simple way to account for that?

    Read the article

  • Checking for collisions on a 3D heightmap

    - by Piku
    I have a 3D heightmap drawn using OpenGL (which isn't important). It's represented by a 2D array of height data. To draw this I go through the array using each point as a vertex. Three vertices are wound together to form a triangle, two triangles to make a quad. To stop the whole mesh being tiny I scale this by a certain amount called 'gridsize'. This produces a fairly nice and lumpy, angular terrain, kind of similar to something you'd see in old Atari/Amiga or DOS '3D' games (think Virus/Zarch on the Atari ST). I'm now trying to work out how to do collision with the terrain, testing to see if the player is about to collide with a piece of scenery sticking upwards or fall into a hole. At the moment I am simply dividing the player's co-ordinates by the gridsize to find which vertex the player is on top of, and it works well when the player is exactly over the corner of a triangle piece of terrain. However... how can I make it more accurate for the bits between the vertices? I get confused since they don't exist in my heightmap data; they're a product of the GPU drawing a triangle between three points. I can calculate the height of the point closest to the player, but not the space between them. I.e. if the player is hovering over the centre of one of these 'quads', rather than over the corner vertex of one, how do I work out the height of the terrain below them? Later on I may want the player to slide down the slopes in the terrain.
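
    One common approach is to find which of the quad's two triangles the player is standing in and interpolate the three corner heights barycentrically. A minimal C++ sketch, assuming each cell is split along the same diagonal as the rendered mesh and that the heightmap is stored row-major (bounds checks omitted, names illustrative):

        #include <cmath>

        // Height of the terrain at an arbitrary world position (px, pz),
        // interpolated from a 2D heightmap whose cells are 'gridsize' wide.
        // Assumes each cell is split into two triangles along the (0,0)-(1,1) diagonal.
        float terrainHeightAt(const float* heights, int width,
                              float gridsize, float px, float pz)
        {
            const float gx = px / gridsize;
            const float gz = pz / gridsize;
            const int   x0 = (int)std::floor(gx);
            const int   z0 = (int)std::floor(gz);
            const float fx = gx - x0;   // fractional position inside the cell
            const float fz = gz - z0;

            const float h00 = heights[z0 * width + x0];
            const float h10 = heights[z0 * width + x0 + 1];
            const float h01 = heights[(z0 + 1) * width + x0];
            const float h11 = heights[(z0 + 1) * width + x0 + 1];

            if (fx + fz <= 1.0f) {
                // lower triangle, corners (0,0), (1,0), (0,1)
                return h00 + fx * (h10 - h00) + fz * (h01 - h00);
            } else {
                // upper triangle, corners (1,1), (1,0), (0,1)
                return h11 + (1.0f - fx) * (h01 - h11) + (1.0f - fz) * (h10 - h11);
            }
        }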

    Read the article

  • Graphics hardware warning when updating to 14.04

    - by pacomet
    As I use Ubuntu at work I only update to LTS versions, but now I'm not sure if I can/should. As my computer is now ten years old I would replace it if it were mine, but as it is owned by my employer I have to work with it. It's not a bad one; it runs fine (this was not true when it still had Windows on it ;-). When updating to 14.04, it warns about possible bad/slow performance with Unity 3D, so I stopped updating, as I am at work and it is not my own computer. As I understand from http://askubuntu.com/a/438958/25305, the Nvidia GeForce FX 5500 graphics card is still supported in 14.04. Now, in 12.04, I have driver version 173 and Unity 2D runs fine for me. Output of /usr/lib/nux/unity_support_test -p:

        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: GeForce FX 5500/AGP/SSE2
        OpenGL version string: 2.1.2 NVIDIA 173.14.39
        Not software rendered: yes
        Not blacklisted: no
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    Should I update? Is it better to stay with 12.04? Thanks

    Read the article

  • Has anybody been able to install Pantograph for Blender on Windows?

    - by S.gfx
    I am especially interested in whether somebody is actually maintaining an installer for Windows, as there are quite a few issues when installing all the dependencies, etc. (for example, someone might already be building an InstallBuilder installer for it and I am just not aware of it). If not, I'd at least like to hear from someone who got it working and has some key tips to share. I have never been able to fully get it up and running. I'd love to have a not-too-complex way to install each new build of this great vector rendering module for Blender. Edit: Pantograph URL: http://severnclaystudio.wordpress.com/bluebeard/a-users-guide-to-pantograph/

    Read the article

  • Ubuntu, Bumblebee and Intel problem

    - by LnxSlck
    I have a brain teaser that I can't solve. When 12.04 (64-bit) came out, I did a fresh install and then installed Bumblebee with:

        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get install bumblebee bumblebee-nvidia

    Ubuntu then recognized (System Settings - Details) the Intel card as the primary card, and I could launch applications with optirun. I have:

        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation GF119 [GeForce GT 520MX] (rev ff)

    Now, a few days back, I did the exact same thing: installed from the same DVD, everything the same, and installed Bumblebee the same way. Nvidia with optirun works fine, but Ubuntu doesn't have 3D effects:

        root@deathstar:~# optirun /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: GeForce GT 520MX/PCIe/SSE2
        OpenGL version string: 4.2.0 NVIDIA 295.40
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: no
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    I never did anything to install the Intel drivers (except installing mesa-utils); before, Bumblebee got everything working, but now I can't get Ubuntu to bring up Unity with 3D. Can someone please help me get Unity 3D working? Your help will be much appreciated.

    Read the article

  • Failed to get i915 symbols, graphics turbo disabled

    - by Optimus Prime
    I'm getting a "Failed to get i915 symbols, graphics turbo disabled" error message after installing the following software and a few updates from Ubuntu: Django, MySQL Server 5.5 and MySQL Benchmark. I also installed a few updates for Ubuntu, which were showing as security updates. After installing the updates, the update manager said I should restart the system. On restart I got the following error message: "failed to get i915 symbols, graphics turbo disabled". So I tried the workaround mentioned here (using the live CD), i.e. add intel_ips to the blacklist and add i915 and intel_ips to /etc/modules:

        echo "blacklist intel_ips" >> /etc/modprobe.d/blacklist.conf
        echo -e "i915\nintel_ips" >> /etc/modules

    Now when I start the system it freezes at the Ubuntu splash screen. I'm using Ubuntu 12.04 LTS on a Dell Inspiron N1040. I need to boot the system, as I have spent a lot of days configuring Python and Django. Please help.

    EDIT: OK, when I restarted the system yesterday it magically turned on, and now I can view my desktop. But there is one problem: I can't mount any of the drives; it says it failed to mount the drive. I'm also frequently getting an "Ubuntu System Failure" error message.

    Read the article

  • Gateway ZA8 Netbook graphics Issue

    - by Hansel
    The graphics keep tearing and are barely usable any time I try to use my netbook, and I did a full install of Ubuntu so I'm pretty much stuck. These are the specs of the netbook:

        Processor: AMD Athlon™ 64 L110 Single-Core Processor (1.2GHz, 800MHz FSB, 512KB L2 Cache)
        Operating System: Genuine Windows Vista® Home Basic (32-bit) with SP1
        Display: 11.6" HD WXGA Ultrabright™ LED-backlit Display (1366 x 768 resolution, 16:9 aspect ratio)
        Memory: 2048MB DDR2 533MHz SDRAM Single Channel Memory
        Hard Drive: 250GB SATA hard drive
        Color: Classic and Elegant Design with Cherry Red finish
        Wireless Network: 802.11b/g Wi-Fi CERTIFIED® Adapter
        AC Adapter
        Application Software: Microsoft® Works, Microsoft® Money Essentials, Microsoft® Office Home and Student 2007 (60-day complimentary trial period)
        Battery: 6-Cell Lithium Ion (5200mAh)
        Chassis: Chassis with ATI Radeon® X1270 Graphics and AMD RS690E Chipset
        Dimensions (Box): 3.1" (H) x 14.8" (W) x 10.1" (D) or 80mm (H) x 376mm (W) x 256mm (D)
        Dimensions (System): 1.03" (H) x 11.26" (W) x 7.99" (D) or 26.4mm (H) x 286mm (W) x 203mm (D)
        External Ports: (3) USB 2.0, VGA Connector
        Keyboard and Mouse: Keyboard with Multi-Gesture Touchpad
        Media Card Reader: Multi-in-1 Digital Media Card Reader (Memory Stick®, Memory Stick Pro™, MultiMediaCard, Secure Digital™, xD-Picture Card™)
        Network: 10/100 Ethernet LAN (RJ-45 port)

    What can I do?

    Read the article

  • 2D graphics - why use spritesheets?

    - by Columbo
    I have seen many examples of how to render sprites from a spritesheet, but I haven't grasped why it is the most common way of dealing with sprites in 2D games. I started out with 2D sprite rendering in the few demo applications I've made by treating each animation frame for any given sprite type as its own texture, and this collection of textures is stored in a dictionary. This seems to work for me and suits my workflow pretty well, as I tend to make my animations as GIF/MNG files and then extract the frames to individual PNGs. Is there a noticeable performance advantage to rendering from a single sheet rather than from individual textures? With modern hardware that is capable of drawing millions of polygons to the screen a hundred times a second, does it even matter for my 2D games, which just deal with a few dozen 50x100px rectangles? The implementation details of loading a texture into graphics memory and displaying it in XNA seem pretty abstracted. All I know is that textures are bound to the graphics device when they are loaded, and then, during the game loop, the textures get rendered in batches. So it's not clear to me whether my choice affects performance. I suspect that there are some very good reasons most 2D game developers seem to be using them; I just don't understand why.
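
    For context on the usual argument: when every frame lives in one texture, consecutive draws can stay in the same batch instead of forcing a texture switch per sprite, and choosing a frame becomes a source-rectangle calculation. A minimal C++ sketch of that calculation, assuming a simple grid-layout sheet (all names are illustrative, and the renderer calls in the usage comment are hypothetical):

        struct Rect { int x, y, w, h; };

        // Source rectangle for frame 'index' in a sheet laid out as a grid of
        // equally sized frames, 'columns' frames per row.
        Rect frameSourceRect(int index, int frameWidth, int frameHeight, int columns)
        {
            Rect r;
            r.x = (index % columns) * frameWidth;
            r.y = (index / columns) * frameHeight;
            r.w = frameWidth;
            r.h = frameHeight;
            return r;
        }

        // Usage sketch: one texture bind, many draws from the same sheet.
        //   bindTexture(sheetTexture);   // hypothetical renderer calls
        //   for (const Sprite& s : sprites)
        //       drawRegion(sheetTexture, frameSourceRect(s.frame, 100, 50, 8), s.position);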

    Read the article

  • AMD switchable graphics are not working OR I don't know how to make them work

    - by Deus Deceit
    I don't know if this is the right place for this question, but I'll give it a shot. I have a Dell Inspiron 17R 5721, which is supposed to be using switchable graphics. It has an Intel HD 4000 and a Radeon HD 8730M, and I'm using Windows 8. My problem is this: I installed the drivers that Dell gives me, but I don't see the AMD graphics card ANYWHERE (I do see it in Device Manager, but nowhere else where I could select it and play a game using AMD). I installed the latest drivers from AMD and the same thing happens: I can't run a game with the AMD graphics card. I change the application preferences in the Catalyst Control Center, but even after that, games don't give me the option to select the AMD card; they list only the Intel HD 4000. Can someone tell me what I have to do to make this work? Edit: After looking around and messing with stuff, I think... I THINK that you don't really get the option to select a graphics card. Switchable graphics is all about switching automatically depending on an application's needs, because when I uninstalled AMD's drivers (or actually screwed them up, lol), games played much worse. When I re-installed them, games went back to being good looking. So even if a game sees only the Intel HD 4000 graphics card, Windows or AMD's drivers will switch to the AMD Radeon graphics card automatically. I hope someone can verify this, because seriously, I don't think you get to play Skyrim with high graphics settings (or even ultra) with the Intel HD graphics card.

    Read the article

  • Screen problems on 11.10 using VGA compatible controller 2nd Generation Core Processor Family Integrated Graphics Controller

    - by MorrisseyJ
    I am having problems with my display. The problem manifests as lots of screen artefacts, which seem to be worse in Unity than in Gnome 3, are worse after I have used suspend, and are intolerable if I set myself up on a dual-monitor configuration. Specific issues include: icons disappearing, lines occurring all over the screen, the backgrounds of certain windows going another colour, and window borders disappearing or being filled with text from other parts of the screen. The most annoying problem is lines of text disappearing from a host of word processing programmes (LibreOffice, gedit, Bluefish etc.) as I type. In most circumstances the screen problem can be temporarily fixed (so that I can see the screen clearly) by either scrolling the text off the screen and then back on, or highlighting the offending area of the desktop by clicking and dragging. Errors on parts of the screen that don't seem to redraw (window borders off the universal menu, or the screen area outside of a LibreOffice document in print layout view, for example) can't seem to be fixed in a session. I am running 11.10, 64-bit, on my ThinkPad X121e. Display information is:

        description: VGA compatible controller
        product: 2nd Generation Core Processor Family Integrated Graphics Controller
        vendor: Intel Corporation
        physical id: 2
        bus info: pci@0000:00:02.0
        version: 09
        width: 64 bits
        clock: 33MHz
        capabilities: msi pm vga_controller bus_master cap_list rom
        configuration: driver=i915 latency=0
        resources: irq:42 memory:d0000000-d03fffff memory:c0000000-cfffffff ioport:4000(size=64)

    There appear to be a few problems with Intel graphics and Ubuntu, but I am not sure if they are all the same. If anyone knows whether this is a known bug it'd be great to hear; otherwise I'll file a report. Should anyone know of a fix I would greatly appreciate hearing about it. Let me know if you need any more information. Thanks

    Read the article

  • Problem using graphics.h in Ubuntu

    - by blooooomer
        #include <stdio.h>
        #include <graphics.h>
        #include <math.h>

        using namespace std;

        int main(void)
        {
            int gd = DETECT, gm;
            int r, x, y, p, xc = 320, yc = 240;
            initgraph(&gd, &gm, NULL);
            cleardevice();
            printf("Enter the radius ");
            scanf("%d", &r);
            x = 0;
            y = r;
            putpixel(xc + x, yc - y, 1);
            p = 3 - (2 * r);
            for (x = 0; x <= y; x++) {
                if (p < 0) {
                    y = y;
                    p = p + (4 * x) + 6;
                } else {
                    y = y - 1;
                    p = p + (4 * (x - y) + 10);
                }
                putpixel(xc + x, yc - y, 1);
                putpixel(xc - x, yc - y, 2);
                putpixel(xc + x, yc + y, 3);
                putpixel(xc - x, yc + y, 4);
                putpixel(xc + y, yc - x, 5);
                putpixel(xc - y, yc - x, 6);
                putpixel(xc + y, yc + x, 7);
                putpixel(xc - y, yc + x, 8);
            }
            getch();
            closegraph();
        }

    I installed graphics.h, compiled with gcc filename.cpp -o filename -lgraph, and then ran ./filename. The window appeared for about 10 seconds and then the error below appeared:

        [xcb] Unknown sequence number while processing queue
        [xcb] Most likely this is a multi-threaded client and XInitThreads has not been called
        [xcb] Aborting, sorry about that.
        heart: ../../src/xcb_io.c:273: poll_for_event: Assertion `!xcb_xlib_threads_sequence_lost' failed.
        Aborted

    Any solutions?

    Read the article

  • System in low-graphics mode

    - by Artem Moskalev
    I am totally new to Linux and Ubuntu. I bought an ASUS X53E notebook, erased Windows and installed Ubuntu. At first it worked fine. Then when I started working with it, I opened the terminal and entered sudo chmod 666 usr, and all the icons from the main panel disappeared and the whole system stopped responding. I decided to restart the system. When it restarted, a message appeared saying the system is running in low-graphics mode, and below it: "Your screen, graphics card and input device settings could not be detected correctly. You will need to configure them yourself." But the "OK" button is disabled, and if I press any buttons nothing happens. If I enter Ctrl-Alt-F2 it opens a bash terminal, but there the sudo and apt-get commands are not found, and it says permission denied if I try to enter any folder, like cd /usr. If I enter the su command it asks for a password I don't know. When I first encountered this problem, I reinstalled the whole of Ubuntu, but today it happened again in just the same way. What shall I do? Maybe there is something wrong with the hardware? If I need to install another distribution of Linux could you recommend one? I'd rather stick to Debian-based releases like Ubuntu. So how do I fix the problem? PS: Please give answers in simple terms, because I am a newbie and don't know what goes where yet.

    Read the article

  • Collision and Graphics integration

    - by Shlomi Atia
    I'm a little confused about the integration between collision and graphics. They both need to share the same position in the world. The most obvious choice is the center of the entity, which is good for bounding volumes and fixed-size sprites. However, for characters with sprites of varying height like this: http://gamemedia.wcgame.ru/data/2011-07-17/game-sprite-sheet.jpg this is no longer good. The character won't align to the ground if I draw it from the center. I could just make all the sprites the same height, but that would be a waste of memory (the largest sprite is 4 times larger than the smallest one). Even then, it is not an option at all with skeletal sprites like this one: http://user-generated-content.java-gaming.org/img-vault/212a171fc1ebb27ab77608fb9b2dd9bd9205361ce6300b21a7f8d06d025fbbd8.png It seems that the graphics need to be drawn from the ground for characters, but not for other images such as scenery and obstacles. The only solution I could think of was having another position called the draw-position, which is the entity center for images and the bottom of the collision volume for characters; when I draw relative to that position, it should work properly. I haven't found any references for something like that, so I'm kinda insecure about it. Does anyone know of a better approach for this problem? Thanks
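
    One way to keep the draw-position idea tidy is to derive it from the collision volume rather than storing it separately: characters anchor at the bottom of the volume, everything else at the centre. A minimal C++ sketch, with illustrative types and a y-up convention assumed:

        struct Vec2 { float x, y; };

        // Axis-aligned collision box described by its centre and half-extents.
        struct Aabb {
            Vec2 center;
            Vec2 halfSize;
        };

        // Where to place the sprite's anchor when drawing.
        // Characters are drawn from the bottom so their feet meet the ground;
        // scenery and fixed-size sprites are drawn from the centre.
        Vec2 drawPosition(const Aabb& box, bool anchorAtFeet)
        {
            if (anchorAtFeet)
                return { box.center.x, box.center.y - box.halfSize.y };  // bottom of the box (y up)
            return box.center;
        }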

    Read the article

  • Latest update to Ubuntu 13.10 broke Intel graphics drivers

    - by James Davies
    I'm running a copy of Ubuntu 13.10 on an i7-4771 with Intel HD 4600 graphics, using a Dell UltraSharp 1440p monitor via DisplayPort. Up until today this configuration has been working perfectly, but the latest update appears to have broken my graphics configuration, and Xorg is now refusing to go above 1280p resolution. Running xrandr, it appears the driver incorrectly thinks my monitor is plugged into the HDMI port and is detecting a maximum resolution of 1920x1200 instead of 2560x1440 (it's actually plugged in via DisplayPort). Based on the apt history.log, the latest update was for the kernel. I'm presuming the issue is that the official Intel driver hasn't been updated to support this version? Is there any way to resolve this, or will I need to upgrade to 14.10 to get the latest driver from Intel?

        Start-Date: 2014-05-28 11:30:57
        Commandline: aptdaemon role='role-commit-packages' sender=':1.473'
        Install: linux-image-extra-3.11.0-22-generic:amd64 (3.11.0-22.38), linux-image-3.11.0-22-generic:amd64 (3.11.0-22.38), linux-headers-3.11.0-22:amd64 (3.11.0-22.38), linux-headers-3.11.0-22-generic:amd64 (3.11.0-22.38)

    Read the article
