Search Results

Search found 39156 results on 1567 pages for 'device driver development'.


  • How do I connect my NetXtreme BCM5755M Gigabit Ethernet PCI Express on my Dell Latitude D630?

    - by Stanton.Sculpture
    Trying to get my 09:00.0 Ethernet controller: Broadcom Corporation NetXtreme BCM5755M Gigabit Ethernet PCI Express (rev 02) working with my wifi. I just installed Raring Ringtail on this dell latitude D630 and I can't get it to connect without a wifi dongle. This is what I got when I typed sudo lshw -c network: *-network description: Ethernet interface product: NetXtreme BCM5755M Gigabit Ethernet PCI Express vendor: Broadcom Corporation physical id: 0 bus info: pci@0000:09:00.0 logical name: eth0 version: 02 serial: 00:21:70:98:04:32 capacity: 1Gbit/s width: 64 bits clock: 33MHz capabilities: pm vpd msi pciexpress bus_master cap_list ethernet physical tp 10bt 10bt-fd 100bt 100bt-fd 1000bt 1000bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=tg3 driverversion=3.128 firmware=5755m-v3.29 latency=0 link=no multicast=yes port=twisted pair resources: irq:44 memory:fe8f0000-fe8fffff *-network description: Wireless interface physical id: 2 bus info: usb@2:1 logical name: wlan0 serial: 7c:dd:90:11:a0:10 capabilities: ethernet physical wireless configuration: broadcast=yes driver=rt2800usb driverversion=3.8.0-31-generic firmware=0.29 ip=10.0.0.8 link=yes multicast=yes wireless=IEEE 802.11bgn Also, when I go to additional drivers in the software and updates settings, no proprietary drivers show up. I've tried sudo apt-get install b43-fwcutter firmware-b43-installer because it worked on a bunch of old Dell laptops that I converted over before, but it didn't work on this one. Is this driver even compatible with wifi? Please help.
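
    The lshw output above already separates the two devices: the BCM5755M is the wired Ethernet port (handled by the tg3 driver), while the wireless interface is the USB dongle driven by rt2800usb; the b43-fwcutter/firmware-b43-installer packages only apply to Broadcom wireless chips, so they would not affect either device here. A minimal sketch of how to double-check which adapter is which, assuming the standard Ubuntu tools are present:
    $ lspci -nn | grep -i net     # PCI devices: the BCM5755M listed here is wired Ethernet (tg3)
    $ lsusb                       # the rt2800usb Wi-Fi dongle shows up as a USB device
    $ iwconfig                    # only interfaces with wireless extensions (wlan0) can join Wi-Fi networks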

    Read the article

  • Endless terrain in jMonkey using TerrainGrid fails to render

    - by nightcrawler23
    I have started to learn game development using jMonkey engine. I am able to create single tile of terrain using TerrainQuad but as the next step I'm stuck at making it infinite. I have gone through the wiki and want to use the TerrainGrid class but my code does not seem to work. I have looked around on the web and searched other forums but cannot find any other code example to help. I believe in the below code, ImageTileLoader returns an image which is the heightmap for that tile. I have modified it to return the same image every time. But all I see is a black window. The Namer method is not even called. terrain = new TerrainGrid("terrain", patchSize, 513, new ImageTileLoader(assetManager, new Namer() { public String getName(int x, int y) { //return "Scenes/TerrainMountains/terrain_" + x + "_" + y + ".png"; System.out.println("X = " + x + ", Y = " + y); return "Textures/heightmap.png"; } })); These are my sources: jMonkeyEngine 3 Tutorial (10) - Hello Terrain TerrainGridTest.java ImageTileLoader This is the result when i use TerrainQuad: , My full code: // Sample 10 - How to create fast-rendering terrains from heightmaps, and how to // use texture splatting to make the terrain look good. public class HelloTerrain extends SimpleApplication { private TerrainQuad terrain; Material mat_terrain; private float grassScale = 64; private float dirtScale = 32; private float rockScale = 64; public static void main(String[] args) { HelloTerrain app = new HelloTerrain(); app.start(); } private FractalSum base; private PerturbFilter perturb; private OptimizedErode therm; private SmoothFilter smooth; private IterativeFilter iterate; @Override public void simpleInitApp() { flyCam.setMoveSpeed(200); initMaterial(); AbstractHeightMap heightmap = null; Texture heightMapImage = assetManager.loadTexture("Textures/heightmap.png"); heightmap = new ImageBasedHeightMap(heightMapImage.getImage()); heightmap.load(); int patchSize = 65; //terrain = new TerrainQuad("my terrain", patchSize, 513, heightmap.getHeightMap()); // * This Works but below doesnt work* terrain = new TerrainGrid("terrain", patchSize, 513, new ImageTileLoader(assetManager, new Namer() { public String getName(int x, int y) { //return "Scenes/TerrainMountains/terrain_" + x + "_" + y + ".png"; System.out.println("X = " + x + ", Y = " + y); return "Textures/heightmap.png"; // set to return the sme hieghtmap image. 
} })); terrain.setMaterial(mat_terrain); terrain.setLocalTranslation(0,-100, 0); terrain.setLocalScale(2f, 1f, 2f); rootNode.attachChild(terrain); TerrainLodControl control = new TerrainLodControl(terrain, getCamera()); terrain.addControl(control); } public void initMaterial() { // TERRAIN TEXTURE material this.mat_terrain = new Material(this.assetManager, "Common/MatDefs/Terrain/HeightBasedTerrain.j3md"); // GRASS texture Texture grass = this.assetManager.loadTexture("Textures/white.png"); grass.setWrap(WrapMode.Repeat); this.mat_terrain.setTexture("region1ColorMap", grass); this.mat_terrain.setVector3("region1", new Vector3f(-10, 0, this.grassScale)); // DIRT texture Texture dirt = this.assetManager.loadTexture("Textures/white.png"); dirt.setWrap(WrapMode.Repeat); this.mat_terrain.setTexture("region2ColorMap", dirt); this.mat_terrain.setVector3("region2", new Vector3f(0, 900, this.dirtScale)); Texture building = this.assetManager.loadTexture("Textures/building.png"); building.setWrap(WrapMode.Repeat); this.mat_terrain.setTexture("slopeColorMap", building); this.mat_terrain.setFloat("slopeTileFactor", 32); this.mat_terrain.setFloat("terrainSize", 513); } }

    Read the article

  • What is ODBC?

    According to Microsoft, ODBC is a specification for a database API. The API is database- and operating-system-agnostic because its primary goal is to be language-independent. The open functions of the API are implemented by the manufacturers of DBMS-specific drivers, and developers call these exposed functions from their own custom applications to communicate with a DBMS through the language-independent drivers.

    ODBC advantages: multiple ODBC drivers exist for each DBMS (for example, Oracle's, Merant's, and Microsoft's ODBC drivers for Oracle); ODBC drivers are constantly updated for the latest data types; ODBC allows for more control when querying; and ODBC allows for isolation levels.

    ODBC disadvantages: ODBC requires a DSN; ODBC is a proxy between the application and the database; and ODBC depends on third-party drivers.

    ODBC transaction isolation levels are related to, and limited by, the transaction management capabilities of the data source. The levels are: READ UNCOMMITTED (data may be read before a transaction commits), READ COMMITTED (data is only accessible after a transaction has completed), REPEATABLE READ (the same data value is read during the entire transaction), and SERIALIZABLE (transactions have no effect on other transactions).
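
    As a concrete illustration of the DSN and driver points above, a minimal sketch using the unixODBC command-line tools (the DSN name and credentials below are placeholders, and the DSN itself must already be defined in odbc.ini):
    $ odbcinst -q -d              # list the ODBC drivers registered in odbcinst.ini
    $ odbcinst -q -s              # list the configured data source names (DSNs)
    $ isql -v MyDSN scott tiger   # open an interactive SQL session through the DSN's driver; -v prints driver diagnostics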

    Read the article

  • Scanner that worked with Ubuntu 10.04 cannot be found by 13.04

    - by stevecoh1
    My computer previously ran Ubuntu 10.4. After upgrading to 13.4, my Epson scanner no longer can be found by the system. Following documentation, I find the following: $ sane-find-scanner # sane-find-scanner will now attempt to detect your scanner. If the # result is different from what you expected, first make sure your # scanner is powered up and properly connected to your computer. # No SCSI scanners found. If you expected something different, make sure that # you have loaded a kernel SCSI driver for your SCSI adapter. could not open USB device 0x046d/0x082b at 001:007: Access denied (insufficient permissions) ... # No USB scanners found. If you expected something different, make sure that # you have loaded a kernel driver for your USB host controller and have setup # the USB system correctly. See man sane-usb for details. ... If I instead run sudo sane-find-scanner, I get $ sudo sane-find-scanner # sane-find-scanner will now attempt to detect your scanner. If the # result is different from what you expected, first make sure your # scanner is powered up and properly connected to your computer. # No SCSI scanners found. If you expected something different, make sure that # you have loaded a kernel SCSI driver for your SCSI adapter. found USB scanner (vendor=0x04b8 [EPSON], product=0x0131 [EPSON Scanner]) at libusb:001:009 could not fetch string descriptor: Pipe error could not fetch string descriptor: Pipe error # Your USB scanner was (probably) detected. It may or may not be supported by # SANE. Try scanimage -L and read the backend's manpage. So what do I do? scanimage -L does nothing for me and I don't know what the "backend's manpage" might be. It's seems likely that this is a permissions issue since the scanner can be found as root, but I don't know how to solve it. Can someone help?
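
    The "backend's manpage" in that message refers to the SANE backend for the device (for an Epson flatbed, typically man sane-epson or man sane-epkowa). Since the scanner is only visible when run as root, granting the desktop user access to the USB device is the usual fix; a sketch assuming the Epson IDs reported above (04b8:0131) and the standard scanner group:
    $ sudo adduser $USER scanner      # make sure the login user is in the scanner group, then log out and in
    $ echo 'SUBSYSTEM=="usb", ATTRS{idVendor}=="04b8", ATTRS{idProduct}=="0131", MODE="0664", GROUP="scanner"' | \
          sudo tee /etc/udev/rules.d/79-epson-scanner.rules
    $ sudo udevadm control --reload-rules     # then unplug and replug the scanner
    $ scanimage -L                            # should now list the Epson device without sudo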

    Read the article

  • HP Pavilion g6-1250 with a BCM4313 doesn't see any wireless networks

    - by Ahmed Kotb
    I have tried using Ubuntu 10.04 and Ubuntu 11.10, and both have the same problem: the driver is detected by the Additional Drivers (proprietary drivers) wizard, and after installation Ubuntu can't see any wireless network except one, which is not mine (and I can't connect to it as it is secured). There are plenty of wireless networks around me, but Ubuntu can't detect them, and if I try to connect to one of them as a hidden network, the connection times out. The command lspci -nvn | grep -i net gives 04:00.0 Network controller [0280]: Broadcom Corporation BCM4313 802.11b/g/n Wireless LAN Controller [14e4:4727] (rev 01) 05:00.0 Ethernet controller [0200]: Realtek Semiconductor Co., Ltd. RTL8101E/RTL8102E PCI Express Fast Ethernet controller [10ec:8136] (rev 05) and iwconfig gives lo no wireless extensions. eth0 no wireless extensions. wlan0 IEEE 802.11bgn ESSID:off/any Mode:Managed Access Point: Not-Associated Tx-Power=19 dBm Retry long limit:7 RTS thr:off Fragment thr:off Power Management:off I guess it is something related to the Broadcom driver, but I don't know; any help will be appreciated. UPDATE: I installed a fresh copy of 11.10 to remove the effect of any earlier trials, and followed the link (http://askubuntu.com/q/67806) as suggested. All I have done so far is run the command lsmod | grep brc, which gave me the following: brcmsmac 631693 0 brcmutil 17837 1 brcmsmac mac80211 310872 1 brcmsmac cfg80211 199587 2 brcmsmac,mac80211 crc_ccitt 12667 1 brcmsmac Then I blacklisted all the other drivers as mentioned in the link; the wireless is still disabled. In the previous installation, installing the Broadcom STA driver from Additional Drivers enabled the menu, but as I said before it wasn't able to connect or even get a list of available networks. So what should I do now? The output of the command rfkill list all is 0: phy0: Wireless LAN Soft blocked: no Hard blocked: no
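
    For the BCM4313 [14e4:4727] there are two mutually exclusive driver options, Broadcom's proprietary STA driver (wl) and the open brcmsmac driver, and having both loaded at once is a common cause of "no networks found". A sketch of the STA route on Ubuntu, assuming an internet connection is available over the wired Ethernet port:
    $ sudo apt-get update
    $ sudo apt-get install bcmwl-kernel-source       # builds and installs the proprietary wl module
    $ sudo modprobe -r b43 brcmsmac bcma ssb         # unload the open-source stack for this session
    $ sudo modprobe wl
    $ rfkill list all                                # confirm nothing is soft- or hard-blocked
    $ nmcli dev wifi list                            # what NetworkManager can now see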

    Read the article

  • Java crashes on lubuntu but not Ubuntu

    - by Echogene
    I have lubuntu and Ubuntu partitions on my drive. I've been having an interesting time with the new lubuntu partition. I've encountered strange things with the game Minecraft, Java and graphics drivers on the lubuntu partition. Firstly, I'll say that Minecraft runs fine at about 60fps on the Ubuntu partition with the latest drivers. (This is lower than it should be as it's a pretty decent graphics card [Radeon HD 5700].) When I first started lubuntu, I tried to see if I could get Minecraft running on Java. Java crashed when loading the main game graphics on both Sun and OpenJDK without proprietary drivers. Java also crashed on both Javas with proprietary drivers after the necessary restart. However, after disabling (with 'remove' button) the proprietary drivers with jockey-gtk in the session after the restart to install the drivers, Minecraft ran very well at ~120fps. This didn't continue after another restart, when it ran at 9fps. After failing thereafter on lubuntu to get it working at 15fps, I tried reinstalling lubuntu and installed the exact same driver (the latest one, not the one appearing on jockey) and Java versions as on Ubuntu. That is, now Ubuntu and lubuntu have the same graphics driver and Java version. Minecraft still crashes in the same way on lubuntu but works fine on Ubuntu. I would appreciate any explanation for any of these events. What differences between lubuntu and Ubuntu could cause this? Edit: After installing the 32bit driver version on lubuntu (seeing as lubuntu is 32bit), I have Java "working" for Minecraft. However, it is at <15fps again and it can't log in to servers as it takes too long.
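
    A 120fps-to-9fps swing on the same hardware is what a fall-back to software rendering typically looks like, so one quick check is whether the lubuntu session is really using the Radeon driver or a software rasterizer; a sketch using the standard Mesa utilities, run on both partitions for comparison:
    $ sudo apt-get install mesa-utils
    $ glxinfo | grep -i "opengl renderer"     # should name the Radeon HD 5700, not "llvmpipe" or "software rasterizer"
    $ glxinfo | grep -i "direct rendering"    # should say "Yes"
    $ java -version                           # confirm both partitions launch Minecraft with the same JVM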

    Read the article

  • CUDA install on Ubuntu 13.10?

    - by hexiangpeng
    the cuda_install_.log show ERROR: Unable to build the NVIDIA kernel module. ERROR: Installation has failed. Please see the file '/var/log/nvidia-installer.log' for details. You may find suggestions on fixing installation problems in the README available on the Linux driver download page at www.nvidia.com. The driver installation is unable to locate the kernel source. Please make sure that the kernel source packages are installed and set up correctly. and the other .log show ^ /tmp/selfgz3964/NVIDIA-Linux-x86-319.37/kernel/nv-i2c.c: In function ‘nv_i2c_del_adapter’: /tmp/selfgz3964/NVIDIA-Linux-x86-319.37/kernel/nv-i2c.c:327:14: error: void value not ignored as it ought to be osstatus = i2c_del_adapter(pI2cAdapter); ^ make[3]: * [/tmp/selfgz3964/NVIDIA-Linux-x86-319.37/kernel/nv-i2c.o] ?? 1 make[2]: * [module/tmp/selfgz3964/NVIDIA-Linux-x86-319.37/kernel] ?? 2 NVIDIA: left KBUILD. nvidia.ko failed to build! make[1]: * [module] ?? 1 make: * [module] ?? 2 - Error. ERROR: Unable to build the NVIDIA kernel module. ERROR: Installation has failed. Please see the file '/var/log/nvidia-installer.log' for details. You may find suggestions on fixing installation problems in the README available on the Linux driver download page at www.nvidia.com. i don't understand
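
    The first error ("unable to locate the kernel source") usually just means the kernel headers and build tools are missing, so the NVIDIA kernel module cannot be compiled; the nv-i2c.c build failure further down suggests the bundled 319.37 driver may also be too old for the 13.10 kernel, so trying a newer driver release is worth considering. A sketch of the usual prerequisites before re-running the installer (the installer filename is a placeholder):
    $ sudo apt-get install build-essential linux-headers-$(uname -r)
    $ sudo service lightdm stop                   # the .run installer has to run outside the X session
    $ sudo sh ./cuda_<version>_linux.run          # or a newer NVIDIA-Linux-x86-*.run driver first
    $ tail -n 50 /var/log/nvidia-installer.log    # check whether the kernel module now builds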

    Read the article

  • How to mix textures in DirectX?

    - by tobsen
    I am new to DirectX development and I am wondering if I am taking the wrong route to achieve the following: I would like to mix three textures which contain transparent areas and some solid areas (Red, Blue, Green). The three textures should blend like shown in this example: How can I achieve that in DirectX (preferably in directx9)? A link or example code would be nice. Update: My rendering method looks like this and I still think I am doing it wrong, because the sprite only shows the last texture (nothing is rendered transparent or blended): void D3DTester::render() { d3ddevice->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0,0,0), 1.0f, 0); d3ddevice->BeginScene(); d3ddevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE); d3ddevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ONE); d3ddevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE); LPD3DXSPRITE sprite=NULL; HRESULT hres = D3DXCreateSprite(d3ddevice, &sprite); if(hres != S_OK) { throw std::exception(); } sprite->Begin(D3DXSPRITE_ALPHABLEND); std::vector<LPDIRECT3DTEXTURE9>::iterator it; for ( it=textures.begin() ; it < textures.end(); it++ ) { sprite->Draw(*it, NULL, NULL, NULL, 0xFFFFFFFF); } sprite->End(); d3ddevice->EndScene(); d3ddevice->Present(NULL, NULL, NULL, NULL); } The resulting image looks like this: But I need it to look like this instead: Update2: I figured out that I have to SetRenderState after I use sprite->Begin(D3DXSPRITE_ALPHABLEND); thanks to the hint by Josh Petrie. However, by using this: sprite->Begin(D3DXSPRITE_ALPHABLEND); d3ddevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE); d3ddevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ONE); d3ddevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE); std::vector<LPDIRECT3DTEXTURE9>::iterator it; for ( it=textures.begin() ; it < textures.end(); it++ ) { sprite->Draw(*it, NULL, NULL, NULL, 0xFFFFFFFF); } sprite->End(); The sprites colors are becoming transparent towards the background scene e.g.: if I use d3ddevice->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0,100,21), 1.0f, 0); the result looks like: Is there any way to avoid that? I would like the sprites be transparent to each other but to be still solid to the background. Update3: After having sombody explained to me, how to do what @LaurentCouvidou and @JoshPetrie suggested, I have a working solution and therfore accept the answer: d3ddevice->BeginScene(); D3DCOLOR white = D3DCOLOR_RGBA((UINT)255, (UINT)255, (UINT)255, 255); D3DCOLOR black = D3DCOLOR_RGBA((UINT)0, (UINT)0, (UINT)0, 255); sprite->Begin(D3DXSPRITE_ALPHABLEND); sprite->Draw(pTextureRed, NULL, NULL, NULL, black); sprite->Draw(pTextureGreen, NULL, NULL, NULL, black); sprite->Draw(pTextureBlue, NULL, NULL, NULL, black); sprite->End(); sprite->Begin(D3DXSPRITE_ALPHABLEND); d3ddevice->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE); d3ddevice->SetRenderState(D3DRS_BLENDOP, D3DBLENDOP_ADD); d3ddevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ONE); d3ddevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE); sprite->Draw(pTextureRed, NULL, NULL, NULL, white); sprite->Draw(pTextureGreen, NULL, NULL, NULL, white); sprite->Draw(pTextureBlue, NULL, NULL, NULL, white); sprite->End(); d3ddevice->EndScene(); d3ddevice->Present(NULL, NULL, NULL, NULL);

    Read the article

  • JDBC Connection Pools in Glassfish

    - by Dana Singleterry
    I've been attempting to configure Glassfish 3.1.2.2 for ADF 11g and the need arose to create a JDBC connection pool to my Oracle XE 11g database. While this is really very trivial, there were no samples of how to do this, and documentation, while good, rarely provides concrete examples. After fumbling around for a few minutes searching for an example I gave up and figured it out on my own. Here are the steps for any of you that may be in need. This can be done either via the Glassfish command-line tool asadmin or through the admin console; I'm doing this through the admin console. Start Glassfish and connect to the admin console with the credentials you defined at installation: http://localhost:4848 Navigate to Resources | JDBC | JDBC Connection Pools and select New. Be sure to enter Resource Type & Datasource Classname under the General Settings tab. You can go with the defaults for Pool Settings etc. Go to the Additional Properties tab and create username, password, and url properties with the respective values. Navigate to Resources | JDBC | JDBC Resources and select New. Be sure to enter the JNDI Name and select the Pool Name for the JDBC connection pool you created previously. Navigate to Configurations | server-config | JVM Settings and select the JVM Options tab. Add the JVM option -Doracle.jdbc.J2EE13Compliant=true, which is used to make sure the driver behaves in a JEE-compliant manner. To integrate the JDBC driver into a GlassFish Server domain, copy the JAR files into the domain-dir/lib directory, then restart the server. The JAR file for the Oracle 11 database driver is ojdbc6dms.jar. An upcoming entry will demonstrate configuring Glassfish for Oracle ADF Applications.
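
    The same pool and resource can be scripted with asadmin, the command-line alternative mentioned above; a sketch with placeholder pool name, JNDI name, and credentials (the doubled backslashes keep asadmin from splitting the JDBC URL on its colons):
    $ asadmin create-jdbc-connection-pool \
          --datasourceclassname oracle.jdbc.pool.OracleDataSource \
          --restype javax.sql.DataSource \
          --property user=hr:password=hr:url="jdbc\\:oracle\\:thin\\:@localhost\\:1521\\:XE" \
          OraclePool
    $ asadmin create-jdbc-resource --connectionpoolid OraclePool jdbc/OracleDS
    $ asadmin ping-connection-pool OraclePool     # verifies the pool can actually reach the database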

    Read the article

  • Resolution stuck after playing OpenGL game

    - by kit.yang
    I used to start the game,Frozen Throne (using wine) with the option of "-opengl".When I entered the game,the resolution will changed,and restored after exit the game. But this time a problem happened.The resolution can't restore although I restart my computer several times. Both the Ubuntu pane and login windows are exceptional. nvidia-settingsalso detect the resolution is "1024 x 768",But it seemed useless using this tool. Screenshot-NVIDIA X Server Settings: the result of xrandr: Screen 0: minimum 320 x 240, current 1024 x 768, maximum 1024 x 768 default connected 1024x768+0+0 0mm x 0mm 1024x768 50.0* 800x600 51.0 52.0 53.0 680x384 54.0 55.0 640x480 56.0 576x432 57.0 512x384 58.0 400x300 59.0 60.0 61.0 320x240 62.0 the configure of /etc/X11/xorg.conf: # nvidia-settings: X configuration file generated by nvidia-settings # nvidia-settings: version 1.0 (buildd@yellow) Fri Apr 9 11:51:21 UTC 2010 Section "ServerLayout" Identifier "Layout0" Screen 0 "Screen0" 0 0 InputDevice "Keyboard0" "CoreKeyboard" InputDevice "Mouse0" "CorePointer" Option "Xinerama" "0" EndSection Section "Files" EndSection Section "InputDevice" # generated from default Identifier "Mouse0" Driver "mouse" Option "Protocol" "auto" Option "Device" "/dev/psaux" Option "Emulate3Buttons" "no" Option "ZAxisMapping" "4 5" EndSection Section "InputDevice" # generated from default Identifier "Keyboard0" Driver "kbd" EndSection Section "Monitor" # HorizSync source: builtin, VertRefresh source: builtin Identifier "Monitor0" VendorName "Unknown" ModelName "CRT-0" HorizSync 28.0 - 55.0 VertRefresh 43.0 - 72.0 Option "DPMS" EndSection Section "Device" Identifier "Device0" Driver "nvidia" VendorName "NVIDIA Corporation" BoardName "Entry Graphics" EndSection Section "Screen" # Removed Option "metamodes" "1024x768 +0+0" Identifier "Screen0" Device "Device0" Monitor "Monitor0" DefaultDepth 24 Option "TwinView" "0" Option "TwinViewXineramaInfoOrder" "CRT-0" Option "metamodes" "1024x768_60 +0+0" SubSection "Display" Depth 24 EndSubSection EndSection
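
    The xorg.conf quoted above pins the screen to a single metamode ("1024x768_60 +0+0"), which matches the 1024x768 ceiling xrandr reports, so one workaround is to let the NVIDIA driver pick the panel's native mode again and restart X; a sketch (the display manager name depends on the release):
    $ sudo nano /etc/X11/xorg.conf
        # in Section "Screen", replace the metamodes line with the panel's native mode, or with:
        # Option "metamodes" "nvidia-auto-select +0+0"
    $ sudo service gdm restart        # or lightdm; restarting X makes the driver re-read xorg.conf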

    Read the article

  • Canon Pixma MX432 network scanner

    - by Donald Cutler
    I have a problem with a Canon Pixma MX432 Printer/Scanner. I just removed Windows 7 and installed Ubuntu 14.04 8/17/2014 on an older desktop that I built (The computer is an AMD build with an ASUS motherboard). The printer/scanner is an all-in-one unit that is networked in my house via WiFi. All of the computers in my house can access this printer/scanner. My Macbook, my wife's Windows 8 laptop, and my kids mini iPads. I am giving Linux a test-drive with some success as far as setting devices up. But, for the life of me I cannot figure out my scanner issue. If anyone can help I would appreciate it. Make/Model: Canon Pixma MX432 PPD Driver: I have no idea how to get this info. Supported?: no, from the information I gather from old forum posts. Works?: the printer works via WiFi perfectly, but not the scanner. The Simple Scan program sees the scanner, but produces an error when I attempt to scan. I also tried XSANE, but that program does not even detect the scanner. NOTE: THE PRINTER IS WORKING OFF OF AN UBUNTU DRIVER AND NOT A CANON DRIVER. Linux Version: Ubuntu 14.04 I tried the steps in this post, downloaded the "scangearmp-mx430series-1.90-1-deb.tar.gz" file, but could not get the scanner to work. http://ubuntuforums.org/archive/index.php/t-2096430.html any suggestions?
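
    Canon publishes its own ScanGear MP driver for the MX430 series as the tarball mentioned above; those tarballs normally contain a pair of .deb packages plus an install script, so the usual steps are roughly as follows (a sketch, assuming the downloaded archive and an architecture that matches the system):
    $ tar xzf scangearmp-mx430series-1.90-1-deb.tar.gz
    $ cd scangearmp-mx430series-1.90-1-deb
    $ sudo ./install.sh               # Canon's installer script
    $ scangearmp                      # Canon's own scan utility, independent of Simple Scan and XSane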

    Read the article

  • Synaptics error "no touchpad found" at startup on HP laptop

    - by Sandra
    I know there are a lot of topics on the synaptics touchpad not working, but I can't seem to find the answer to my problem anywhere. Please forgive me if I missed it. Here is what happened. After using the "ambient light" F11 key on my HP laptop (Compaq nc8430), the touchpad was disabled and I couldn't get it to work. Then I found this thread : "Synaptic touchpad on laptop not working". And Salem's answer totally solved my problem, both for the session fix and for the permanent fix. But since then, after each reboot I get the same error pop-up, with the following content: No touchpad found. No touchpad was found in this system. If the system has a touchpad, please make sure that the synaptics driver is properly installed and configured. If your touchpad is not found, though the driver is installed and configured correctly, please compile detailed information about your touchpad hardware and report this issue to the issue tracker. Needless to say, the driver is indeed installed, activated and configured (even though it's kind of useless now, since I guess the .conf file I created is doing all the work). I know it's a minor issue, because the touchpad does work properly, but it's getting really annoying... So, could anyone please help me fix this error pop-up problem ? Thank you !
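
    To see whether the synaptics driver is genuinely attached at startup (as opposed to the pop-up being a false alarm from the settings applet), the X log and synclient are the quickest checks; a short sketch:
    $ grep -i synaptics /var/log/Xorg.0.log     # shows whether X found and configured a touchpad device
    $ xinput list                               # the touchpad should appear in the input device list
    $ synclient -l | head                       # prints live driver parameters; it fails if the driver is not attached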

    Read the article

  • Help me get my 3D camera to look like the ones in RTS games

    - by rFactor
    I am a newbie in 3D game development and I am trying to make a real-time strategy game. I am struggling with the camera currently as I am unable to make it look like they do in RTS games. Here is my Camera.cs class using System; using System.Collections.Generic; using System.Linq; using System.Text; using Microsoft.Xna.Framework; using Microsoft.Xna.Framework.Input; namespace BB { public class Camera : Microsoft.Xna.Framework.GameComponent { public Matrix view; public Matrix projection; protected Game game; KeyboardState currentKeyboardState; Vector3 cameraPosition = new Vector3(600.0f, 0.0f, 600.0f); Vector3 cameraForward = new Vector3(0, -0.4472136f, -0.8944272f); BoundingFrustum cameraFrustum = new BoundingFrustum(Matrix.Identity); // Light direction Vector3 lightDir = new Vector3(-0.3333333f, 0.6666667f, 0.6666667f); public Camera(Game game) : base(game) { this.game = game; } public override void Initialize() { this.view = Matrix.CreateLookAt(this.cameraPosition, this.cameraPosition + this.cameraForward, Vector3.Up); this.projection = Matrix.CreatePerspectiveFieldOfView(MathHelper.PiOver4, this.game.renderer.aspectRatio, 1, 10000); base.Initialize(); } /* Handles the user input * @ param GameTime gameTime */ private void HandleInput(GameTime gameTime) { float time = (float)gameTime.ElapsedGameTime.TotalMilliseconds; currentKeyboardState = Keyboard.GetState(); } void UpdateCamera(GameTime gameTime) { float time = (float)gameTime.ElapsedGameTime.TotalMilliseconds; // Check for input to rotate the camera. float pitch = 0.0f; float turn = 0.0f; if (currentKeyboardState.IsKeyDown(Keys.Up)) pitch += time * 0.001f; if (currentKeyboardState.IsKeyDown(Keys.Down)) pitch -= time * 0.001f; if (currentKeyboardState.IsKeyDown(Keys.Left)) turn += time * 0.001f; if (currentKeyboardState.IsKeyDown(Keys.Right)) turn -= time * 0.001f; Vector3 cameraRight = Vector3.Cross(Vector3.Up, cameraForward); Vector3 flatFront = Vector3.Cross(cameraRight, Vector3.Up); Matrix pitchMatrix = Matrix.CreateFromAxisAngle(cameraRight, pitch); Matrix turnMatrix = Matrix.CreateFromAxisAngle(Vector3.Up, turn); Vector3 tiltedFront = Vector3.TransformNormal(cameraForward, pitchMatrix * turnMatrix); // Check angle so we cant flip over if (Vector3.Dot(tiltedFront, flatFront) > 0.001f) { cameraForward = Vector3.Normalize(tiltedFront); } // Check for input to move the camera around. if (currentKeyboardState.IsKeyDown(Keys.W)) cameraPosition += cameraForward * time * 0.4f; if (currentKeyboardState.IsKeyDown(Keys.S)) cameraPosition -= cameraForward * time * 0.4f; if (currentKeyboardState.IsKeyDown(Keys.A)) cameraPosition += cameraRight * time * 0.4f; if (currentKeyboardState.IsKeyDown(Keys.D)) cameraPosition -= cameraRight * time * 0.4f; if (currentKeyboardState.IsKeyDown(Keys.R)) { cameraPosition = new Vector3(0, 50, 50); cameraForward = new Vector3(0, 0, -1); } cameraForward.Normalize(); // Create the new view matrix view = Matrix.CreateLookAt(cameraPosition, cameraPosition + cameraForward, Vector3.Up); // Set the new frustum value cameraFrustum.Matrix = view * projection; } public override void Update(Microsoft.Xna.Framework.GameTime gameTime) { HandleInput(gameTime); UpdateCamera(gameTime); } } } The problem is that the initial view is looking in a horizontal direction. I would like to have an RTS like top down view (but with a slight pitch). Can you help me out?

    Read the article

  • Abstracting entity caching in XNA

    - by Grofit
    I am in a situation where I am writing a framework in XNA and there will be quite a lot of static (ish) content which wont render that often. Now I am trying to take the same sort of approach I would use when doing non game development, where I don't even think about caching until I have finished my application and realise there is a performance problem and then implement a layer of caching over whatever needs it, but wrap it up so nothing is aware its happening. However in XNA the way we would usually cache would be drawing our objects to a texture and invalidating after a change occurs. So if you assume an interface like so: public interface IGameComponent { void Update(TimeSpan elapsedTime); void Render(GraphicsDevice graphicsDevice); } public class ContainerComponent : IGameComponent { public IList<IGameComponent> ChildComponents { get; private set; } // Assume constructor public void Update(TimeSpan elapsedTime) { // Update anything that needs it } public void Render(GraphicsDevice graphicsDevice) { foreach(var component in ChildComponents) { // draw every component } } } Then I was under the assumption that we just draw everything directly to the screen, then when performance becomes an issue we just add a new implementation of the above like so: public class CacheableContainerComponent : IGameComponent { private Texture2D cachedOutput; private bool hasChanged; public IList<IGameComponent> ChildComponents { get; private set; } // Assume constructor public void Update(TimeSpan elapsedTime) { // Update anything that needs it // set hasChanged to true if required } public void Render(GraphicsDevice graphicsDevice) { if(hasChanged) { CacheComponents(graphicsDevice); } // Draw cached output } private void CacheComponents(GraphicsDevice graphicsDevice) { // Clean up existing cache if needed var cachedOutput = new RenderTarget2D(...); graphicsDevice.SetRenderTarget(renderTarget); foreach(var component in ChildComponents) { // draw every component } graphicsDevice.SetRenderTarget(null); } } Now in this example you could inherit, but your Update may become a bit tricky then without changing your base class to alert you if you had changed, but it is up to each scenario to choose if its inheritance/implementation or composition. Also the above implementation will re-cache within the rendering cycle, which may cause performance stutters but its just an example of the scenario... Ignoring those facts as you can see that in this example you could use a cache-able component or a non cache-able one, the rest of the framework needs not know. The problem here is that if lets say this component is drawn mid way through the game rendering, other items will already be within the default drawing buffer, so me doing this would discard them, unless I set it to be persisted, which I hear is a big no no on the Xbox. So is there a way to have my cake and eat it here? One simple solution to this is make an ICacheable interface which exposes a cache method, but then to make any use of this interface you would need the rest of the framework to be cache aware, and check if it can cache, and to then do so. Which then means you are polluting and changing your main implementations to account for and deal with this cache... I am also employing Dependency Injection for alot of high level components so these new cache-able objects would be spat out from that, meaning no where in the actual game would they know they are caching... if that makes sense. 
Just in case anyone asks how I expected to keep it cache-aware when I would need to new up a cacheable entity.

    Read the article

  • Missing Wireless access point

    - by nvcnvn
    I have: Ubuntu 12.04 with the proprietary Broadcom STA wireless driver installed, a BCM43224 802.11a/b/g/n wireless card, and a TP-LINK TD-W8101G 54Mbps Wireless ADSL2+ modem router. Yesterday before I went to sleep I could still connect to my wireless access point; this morning when I started my laptop I didn't see it in the list - there are some neighbors listed but not mine. The WLAN light is green, and with my old Nokia E72 I still see my access point with 100% signal. I have tried restarting my laptop and turning the wireless off and on with the switch, but it didn't help. I have read the WirelessTroubleShootingGuide but I haven't been able to fix it. Here is some of my card info: *-network description: Wireless interface product: BCM43224 802.11a/b/g/n vendor: Broadcom Corporation physical id: 0 bus info: pci@0000:04:00.0 logical name: eth1 version: 01 serial: c0:cb:38:06:5d:53 width: 64 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=wl0 driverversion=5.100.82.38 latency=0 multicast=yes wireless=IEEE 802.11abgn resources: irq:17 memory:f0500000-f0503fff *-network description: Ethernet interface product: RTL8111/8168B PCI Express Gigabit Ethernet controller vendor: Realtek Semiconductor Co., Ltd. physical id: 0 bus info: pci@0000:09:00.0 logical name: eth0 version: 03 serial: f0:4d:a2:42:ab:e4 size: 100Mbit/s capacity: 1Gbit/s width: 64 bits clock: 33MHz capabilities: pm msi pciexpress msix vpd bus_master cap_list rom ethernet physical tp mii 10bt 10bt-fd 100bt 100bt-fd 1000bt 1000bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=r8169 driverversion=2.3LK-NAPI duplex=full firmware=rtl_nic/rtl8168d-1.fw ip=192.168.1.3 latency=0 link=yes multicast=yes port=MII speed=100Mbit/s resources: irq:42 ioport:5000(size=256) memory:f0404000-f0404fff memory:f0400000-f0403fff memory:f0420000-f043ffff
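
    With the STA (wl) driver, reloading the module and forcing a fresh scan sometimes brings a vanished access point back without a reboot; and if the router has moved to channel 12 or 13, a restrictive regulatory domain can hide it from the laptop even though the phone still sees it. A sketch, using the eth1 interface name shown in the lshw output:
    $ sudo modprobe -r wl && sudo modprobe wl
    $ sudo iwlist eth1 scan | grep ESSID      # what the card itself can see right now
    $ nmcli dev wifi list                     # what NetworkManager reports
    $ sudo iw reg get                         # check the regulatory domain if the router uses channel 12/13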

    Read the article

  • Could not apply the stored configuration for the monitor

    - by dellphi
    I'm using Nvidia 7300 gt and monitor-Acer V173w, on 64 bit Ubuntu 10.04. Compiz and Emerald went well, but at the time of entry into the GUI, I always receive the message : "Could not apply the stored configuration for the monitor, could not find a suitable configuration of screens" Why do I always receive it, and what is wrong with the monitor configuration or pci-e is used? root@dellph1-desktop:/# xrandr Screen 0: minimum 320 x 240, current 1440 x 900, maximum 1440 x 900 default connected 1440x900+0+0 0mm x 0mm 1440x900 50.0* 1024x768 51.0 58.0 59.0 1360x768 52.0 53.0 1152x864 54.0 55.0 56.0 57.0 960x600 60.0 960x540 61.0 896x672 62.0 840x525 63.0 64.0 65.0 66.0 832x624 67.0 800x600 68.0 69.0 70.0 71.0 72.0 73.0 800x512 74.0 720x450 75.0 680x384 76.0 77.0 640x512 78.0 79.0 640x480 80.0 81.0 82.0 83.0 576x432 84.0 85.0 86.0 87.0 512x384 88.0 89.0 90.0 416x312 91.0 400x300 92.0 93.0 94.0 95.0 320x240 96.0 97.0 98.0 root@dellph1-desktop:/# === xorg.conf # nvidia-settings: X configuration file generated by nvidia-settings # nvidia-settings: version 260.19.29 ([email protected]) Wed Dec 8 12:27:27 PST 2010 Section "ServerLayout" Identifier "Layout0" Screen 0 "Screen0" 0 0 InputDevice "Keyboard0" "CoreKeyboard" InputDevice "Mouse0" "CorePointer" Option "Xinerama" "0" EndSection Section "Files" EndSection Section "InputDevice" # generated from default Identifier "Mouse0" Driver "mouse" Option "Protocol" "auto" Option "Device" "/dev/psaux" Option "Emulate3Buttons" "no" Option "ZAxisMapping" "4 5" EndSection Section "InputDevice" # generated from default Identifier "Keyboard0" Driver "kbd" EndSection Section "Monitor" # HorizSync source: xconfig, VertRefresh source: xconfig Identifier "Monitor0" VendorName "Unknown" ModelName "Acer V173W" HorizSync 30.0 - 83.0 VertRefresh 55.0 - 75.0 Option "DPMS" EndSection Section "Device" Identifier "Device0" Driver "nvidia" VendorName "NVIDIA Corporation" BoardName "GeForce 7300 GT" EndSection Section "Screen" # Removed Option "metamodes" " 1440x900_60 +0+0; 1280x1024 +0+0" # Removed Option "metamodes" "1440x900 +0+0" Identifier "Screen0" Device "Device0" Monitor "Monitor0" DefaultDepth 24 Option "TwinView" "0" Option "TwinViewXineramaInfoOrder" "CRT-0" Option "metamodes" "1440x900_75 +0+0; 1440x900 +0+0" SubSection "Display" Depth 24 EndSubSection EndSection
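
    The "could not apply the stored configuration" message comes from GNOME's own monitor settings rather than from the NVIDIA driver, so a stale per-user monitor layout is a common culprit; a sketch that clears it and lets the desktop re-detect the screen on the next login:
    $ mv ~/.config/monitors.xml ~/.config/monitors.xml.bak    # back up and remove the stored GNOME monitor layout
    $ # log out and back in (or reboot) so gnome-settings-daemon re-reads the display configuration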

    Read the article

  • Frequent Disconnects with an Intel 3945ABG Wireless Card

    - by Alex Forsythe
    I'm brand new to Ubuntu, and I really love it so far, but one issue I have encountered is that my WLAN is disconnecting about every 5-10 minutes. Often times the connection is repaired automatically, but sometimes the network manager will repeatedly reject my encryption key (which of course is correct). Occasionally after a disconnect, the wireless network fails to show up at all. The only way I can solve this seems to be by completely restarting Ubuntu or connecting with a USB wireless adapter. I am using WPA/WPA2 encryption, which I've read can cause problems with network-manager, but I experience the exact same issues with WICD. I should probably note that I've not experienced any of these issues using Windows 7 on my other partition. I have a hunch that there may be a better driver out there for my card, but I have no idea how to go about searching for it or installing it. Any help would be really appreciated! lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 11.10 Release: 11.10 Codename: oneiric lspci -nnk 09:00.0 Ethernet controller [0200]: Broadcom Corporation NetXtreme BCM5752 Gigabit Ethernet PCI Express [14e4:1600] (rev 02) Subsystem: Dell Device [1028:0201] Kernel driver in use: tg3 Kernel modules: tg3 0c:00.0 Network controller [0280]: Intel Corporation PRO/Wireless 3945ABG [Golan] Network Connection [8086:4222] (rev 02) Subsystem: Intel Corporation Device [8086:1020] Kernel driver in use: iwl3945 Kernel modules: iwl3945 rfkill list 0: phy0: Wireless LAN Soft blocked: no Hard blocked: no 2: dell-wifi: Wireless LAN Soft blocked: no Hard blocked: no 3: dell-bluetooth: Bluetooth Soft blocked: yes Hard blocked: no
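
    Frequent drops with the iwl3945 driver are often tied to its power-saving and hardware-scan behaviour, so disabling those is a common first experiment; a sketch (the module option is an assumption, so check modinfo for what this particular driver build actually supports):
    $ sudo iwconfig wlan0 power off                 # turn off 802.11 power management for the current session
    $ modinfo iwl3945 | grep -i parm                # list the module parameters this kernel offers
    $ echo "options iwl3945 disable_hw_scan=1" | sudo tee /etc/modprobe.d/iwl3945-opts.conf
    $ sudo modprobe -r iwl3945 && sudo modprobe iwl3945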

    Read the article

  • lirc_zilog IR transmission no longer working with HD-PVR on 12.04

    - by johnf
    I have been running a ubuntu 10.04 with a patched version of lirc_zilog for two years. I upgraded to 12.04 and lirc_zilog is no longer working with my HD-PVR. The MythTV wiki reports that it did work out of the box with 11.04. The error message I get on irsend is as follows: johnf@carbon:~$ /usr/local/bin/irsend SEND_ONCE blaster 0_130_KEY_POWER irsend: command failed: SEND_ONCE blaster 0_130_KEY_POWER irsend: hardware does not support sending The lircd daemon, run interactively, reports the following: lircd: accepted new client on /var/run/lirc/lircd lircd: could not get hardware features lircd: this device driver does not support the LIRC ioctl interface lircd: major number of /dev/lirc0 is 250 lircd: LIRC major number is 61 lircd: check if /dev/lirc0 is a LIRC device lircd: WARNING: Failed to initialize hardware lircd: error processing command: SEND_ONCE blaster 0_130_KEY_POWER lircd: hardware does not support sending lircd: removed client Checking dmesg seems to indicate that the kernel module is loading properly: [56497.730743] lirc_zilog: module is from the staging directory, the quality is unknown, you have been warned. [56497.730999] lirc_zilog: Zilog/Hauppauge IR driver initializing [56497.732484] lirc_zilog: ir_probe: ir_rx_z8f0811_hdpvr on i2c-0 (Hauppage HD PVR I2C), client addr=0x71 [56497.732493] lirc_zilog: ir_probe: ir_tx_z8f0811_hdpvr on i2c-0 (Hauppage HD PVR I2C), client addr=0x70 [56497.732496] lirc_zilog: probing IR Tx on Hauppage HD PVR I2C (i2c-0) [56497.756822] lirc_zilog: firmware of size 302355 loaded [56497.756945] lirc_zilog: 743 IR blaster codesets loaded [56497.757030] i2c i2c-0: lirc_dev: driver lirc_zilog registered at minor = 0 [56497.757033] lirc_zilog: IR unit on Hauppage HD PVR I2C (i2c-0) registered as lirc0 and ready [56497.757035] lirc_zilog: probe of IR Tx on Hauppage HD PVR I2C (i2c-0) done [56497.757056] lirc_zilog: initialization complete Here is my /etc/lirc/hardware.conf #Chosen IR Transmitter TRANSMITTER="HD-PVR" TRANSMITTER_MODULES="lirc_dev lirc_zilog" TRANSMITTER_DRIVER="" TRANSMITTER_DEVICE="/dev/lirc0" TRANSMITTER_SOCKET="" TRANSMITTER_LIRCD_CONF="" TRANSMITTER_LIRCD_ARGS="" My lircd.conf is a copy of the recommended one. Examination of the kernel source seems to indicate that the lirc_zilog module should support transmission, it's newer than the patched version I was manually compiling on 10.04. I was previously using a manually built version of lirc 0.8.7 and not the packaged one. I'm now running the packaged version 9.0. I can provide any additional information required and will perform tests quickly. I'm very eager to get this issue resolved.

    Read the article

  • Why are my 32bit OpenGL libraries pointing to mesa instead of nvidia, and how do I fix it?

    - by Codemonkey
    I have installed Nvidia's drivers on my Ubuntu 13 system, but according to this command (ldconfig -p | grep GL): $ ldconfig -p | grep GL libQtOpenGL.so.4 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libQtOpenGL.so.4 libGLU.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libGLU.so.1 libGLEWmx.so.1.8 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libGLEWmx.so.1.8 libGLEW.so.1.8 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/libGLEW.so.1.8 libGLESv2.so.2 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/mesa-egl/libGLESv2.so.2 libGL.so.1 (libc6,x86-64) => /usr/lib/libGL.so.1 libGL.so.1 (libc6) => /usr/lib/i386-linux-gnu/mesa/libGL.so.1 libGL.so (libc6,x86-64) => /usr/lib/libGL.so libEGL.so.1 (libc6,x86-64) => /usr/lib/x86_64-linux-gnu/mesa-egl/libEGL.so.1 The 32bit version of OpenGL is pointing to mesa's libraries instead of nvidia. This causes my Steam games to refuse to launch with the error: Could not find required OpenGL entry point 'glGetError'! Either your video card is unsupported, or your OpenGL driver needs to be updated. Why is this the case? When the nvidia installer asked me if I wanted to install "32bit compatability libraries" (or something like that) I chose yes. How do I fix this? Edit: I just reinstalled the same Nvidia driver, and that apparently removed the 32bit OpenGL driver completely: $ ldconfig -p | grep libGL.so libGL.so.1 (libc6,x86-64) => /usr/lib/libGL.so.1 libGL.so (libc6,x86-64) => /usr/lib/libGL.so Now Steam won't start: You are missing the following 32-bit libraries, and Steam may not run: libGL.so.1 Again, I chose YES when the installer asked me if I wanted to install 32bit libraries. Why are they not installed!?
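
    Before reinstalling anything it helps to see exactly which 32-bit libGL the loader can find, and to remember that the NVIDIA .run installer only provides a 32-bit GL when the compatibility-libraries question is answered yes; a purely diagnostic sketch (package and path names differ between the .run installer and Ubuntu's packaged drivers):
    $ ldconfig -p | grep 'libGL.so'           # a (libc6) entry and a (libc6,x86-64) entry should both be present
    $ ls -l /usr/lib/i386-linux-gnu/libGL.so* /usr/lib32/libGL.so* 2>/dev/null
    $ sudo ldconfig                           # refresh the linker cache after any driver (re)install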

    Read the article

  • Screen flickering when using midrange brightness values on Dell XPS

    - by Eliran Malka
    After a fresh Ubuntu install on my laptop, I discovered the function keys for screen brightness control (Fn+F4 and Fn+F5) were not working. Digging around here, I managed to get them to work by following the solutions suggested on this post and that one, but alas, after applying them a strange problem occurred: when the brightness is set to any value other than minimum or maximum, the screen starts flickering back and forth between the selected level and full brightness, apparently due to Dell's power saver attempting to dim the screen to adjust the brightness level. I looked for a solution here on the site, and possibly everywhere, to no avail. Also tried: manually controlling the brightness by configuring the ACPI level (setting values via echo [some_value] | sudo tee /sys/class/backlight/[vendor]_backlight/[some_key]), without success; and installing the Intel graphics driver, thinking it was missing, only to realize (after installing Mesa Utils) that it is installed out of the box. How do I resolve this? Environment Model: Dell Studio XPS 13 OS: Windows 7 64bit / Ubuntu 12.04 32bit (dual boot) Graphics Driver: Intel HD 3000 (Sandybridge Mobile x86/MMX/SSE2) lshw -C display output: *-display description: VGA compatible controller product: 2nd Generation Core Processor Family Integrated Graphics Controller vendor: Intel Corporation physical id: 2 bus info: pci@0000:00:02.0 version: 09 width: 64 bits clock: 33MHz capabilities: msi pm vga_controller bus_master cap_list rom configuration: driver=i915 latency=0 resources: irq:47 memory:f0000000-f03fffff memory:e0000000-efffffff ioport:2000(size=64)
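
    The sysfs interface mentioned above can show which backlight device actually reacts, and handing control to a single interface via a kernel parameter is the usual workaround when the vendor's power saver and the graphics driver fight over brightness; a sketch (the parameter choice is an assumption to be tested):
    $ ls /sys/class/backlight/                                # e.g. intel_backlight and/or a vendor-specific entry
    $ cat /sys/class/backlight/intel_backlight/max_brightness
    $ echo 400 | sudo tee /sys/class/backlight/intel_backlight/brightness    # try a mid-range test value
    $ # if that is stable, add acpi_backlight=vendor (or acpi_backlight=video) to GRUB_CMDLINE_LINUX_DEFAULT
    $ sudo nano /etc/default/grub && sudo update-grub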

    Read the article

  • Centrino Wireless-N 1000 takes forever to connect and keeps asking for password

    - by waclock
    A few days ago I started having this problem. When I tried to connect to any WiFi Connection it would stay connecting forever, and after a minute or so it would ask me for the password again. The strange thing is that this happened out of nowhere, I did not install any new drivers or anything like that. After this happened I decided to uninstall ubuntu and install it again ("inside windows") but the problem is still there. Any suggestions would be greatly appreciated. 0: hp-wifi: Wireless LAN Soft blocked: no Hard blocked: no 1: hp-bluetooth: Bluetooth Soft blocked: yes Hard blocked: no 2: phy0: Wireless LAN Soft blocked: no Hard blocked: no description: Ethernet interface product: RTL8111/8168B PCI Express Gigabit Ethernet controller vendor: Realtek Semiconductor Co., Ltd. physical id: 0 bus info: pci@0000:07:00.0 logical name: eth0 version: 06 serial: 2c:27:d7:aa:e4:7d size: 10Mbit/s capacity: 1Gbit/s width: 64 bits clock: 33MHz capabilities: pm msi pciexpress msix vpd bus_master cap_list ethernet physical tp mii 10bt 10bt-fd 100bt 100bt-fd 1000bt 1000bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=r8169 driverversion=2.3LK-NAPI duplex=half firmware=rtl8168e-3_0.0.4 03/27/12 latency=0 link=no multicast=yes port=MII speed=10Mbit/s resources: irq:50 ioport:4000(size=256) memory:c0404000-c0404fff memory:c0400000-c0403fff *-network description: Wireless interface product: Centrino Wireless-N 1000 vendor: Intel Corporation physical id: 0 bus info: pci@0000:0d:00.0 logical name: wlan0 version: 00 serial: 00:1e:64:09:9c:58 width: 64 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=iwlwifi driverversion=3.2.0-23-generic-pae firmware=39.31.5.1 build 35138 latency=0 link=no multicast=yes wireless=IEEE 802.11bgn resources: irq:52 memory:c4500000-c4501fff *-network description: Ethernet interface physical id: 1 bus info: usb@2:1.2 logical name: eth1 serial: ee:85:2f:7d:80:96 capabilities: ethernet physical configuration: broadcast=yes driver=ipheth ip=172.20.10.2 link=yes multicast=yes
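
    With iwlwifi-based cards such as the Wireless-N 1000, slow association and repeated passphrase prompts are commonly worked around by disabling 802.11n through the driver's module option; a sketch:
    $ modinfo iwlwifi | grep 11n_disable          # confirm the option exists in this driver build
    $ echo "options iwlwifi 11n_disable=1" | sudo tee /etc/modprobe.d/iwlwifi-11n.conf
    $ sudo modprobe -r iwlwifi && sudo modprobe iwlwifi     # or simply reboot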

    Read the article

  • How would I handle input with a Game Component?

    - by Aufziehvogel
    I am currently having problems from finding my way into the component-oriented XNA design. I read an overview over the general design pattern and googled a lot of XNA examples. However, they seem to be right on the opposite site. In the general design pattern, an object (my current player) is passed to InputComponent::update(Player). This means the class will know what to do and how this will affect the game (e.g. move person vs. scroll text in a menu). Yet, in XNA GameComponent::update(GameTime) is called automatically without a reference to the current player. The only XNA examples I found built some sort of higher-level Keyboard engine into the game component like this: class InputComponent: GameComponent { public void keyReleased(Keys); public void keyPressed(Keys); public bool keyDown(Keys); public void override update(GameTime gameTime) { // compare previous state with current state and // determine if released, pressed, down or nothing } } Some others went a bit further making it possible to use a Service Locator by a design like this: interface IInputComponent { public void downwardsMovement(Keys); public void upwardsMovement(Keys); public bool pausedGame(Keys); // determine which keys pressed and what that means // can be done for different inputs in different implementations public void override update(GameTime); } Yet, then I am wondering if it is possible to design an input class to resolve all possible situations. Like in a menu a mouse click can mean "click that button", but in game play it can mean "shoot that weapon". So if I am using such a modular design with game components for input, how much logic is to be put into the InputComponent / KeyboardComponent / GamepadComponent and where is the rest handled? What I had in mind, when I heard about Game Components and Service Locator in XNA was something like this: use Game Components to run the InputHandler automatically in the loop use Service Locator to be able to switch input at runtime (i.e. let player choose if he wants to use a gamepad or a keyboard; or which shall be player 1 and which player 2). However, now I cannot see how this can be done. First code example does not seem flexible enough, as on a game pad you could require some combination of buttons for something that is possible on keyboard with only one button or with the mouse) The second code example seems really hard to implement, because the InputComponent has to know in which context we are currently. Moreover, you could imagine your application to be multi-layered and let the key-stroke go through all layers to the bottom-layer which requires a different behaviour than the InputComponent would have guessed from the top-layer. The general design pattern with passing the Player to update() does not have a representation in XNA and I also cannot see how and where to decide which class should be passed to update(). At most time of course the player, but sometimes there could be menu items you have to or can click I see that the question in general is already dealt with here, but probably from a more elobate point-of-view. At least, I am not smart enough in game development to understand it. I am searching for a rather code-based example directly for XNA. And the answer there leaves (a noob like) me still alone in how the object that should receive the detected event is chosen. Like if I have a key-up event, should it go to the text box or to the player?

    Read the article

  • Nvidia dual monitor configuration gets lost every time I reboot

    - by sunwukung
    I've recently updated (well, borked then completely reinstalled) to 12.04. I'm running a dual monitor setup, with a Dell U2410 / Dell 2007WFP combination on an HP Elite Book 8560W. The graphics card is an NVIDIA GF108 [Quadro 1000M]. My problem is as follows. I can get the dual monitor setup working fine, but every time I reboot, my machine appears to lose the settings (specifically, the U2410 is disabled, the mouse pointer is locked in the launcher). I have to restate the settings after every launch. I've tried running nvidia-settings as sudo, I've save the changes to my xorg.conf file (see below) but nothing seems to be sticking. Has anyone had similair issues, or know of a fix? Conf file follows: # nvidia-settings: X configuration file generated by nvidia-settings # nvidia-settings: version 295.33 (buildd@allspice) Fri Mar 30 15:25:24 UTC 2012 Section "ServerLayout" Identifier "Layout0" Screen 0 "Screen0" 0 0 InputDevice "Keyboard0" "CoreKeyboard" InputDevice "Mouse0" "CorePointer" Option "Xinerama" "0" EndSection Section "Files" EndSection Section "InputDevice" # generated from default Identifier "Mouse0" Driver "mouse" Option "Protocol" "auto" Option "Device" "/dev/psaux" Option "Emulate3Buttons" "no" Option "ZAxisMapping" "4 5" EndSection Section "InputDevice" # generated from default Identifier "Keyboard0" Driver "kbd" EndSection Section "Monitor" # HorizSync source: edid, VertRefresh source: edid Identifier "Monitor0" VendorName "Unknown" ModelName "DELL 2007WFP" HorizSync 30.0 - 83.0 VertRefresh 56.0 - 76.0 Option "DPMS" EndSection Section "Device" Identifier "Device0" Driver "nvidia" VendorName "NVIDIA Corporation" BoardName "Quadro 1000M" EndSection Section "Screen" Identifier "Screen0" Device "Device0" Monitor "Monitor0" DefaultDepth 24 Option "TwinView" "1" Option "TwinViewXineramaInfoOrder" "DFP-1" Option "metamodes" "CRT: 1680x1050 +1920+0, DFP-1: 1920x1200 +0+0; CRT: nvidia-auto-select +0+0, DFP-1: NULL" SubSection "Display" Depth 24 EndSubSection EndSection The error message I'm getting is this: none of the selected modes were compatible with the possible modes: Trying modes for CRTC 642: CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 0) CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 0) CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 0) CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 1) CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 1) CRTC 642: trying mode 3600x1080@50hz with output at 1280 x 1024@0Hz (pass 1)
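
    nvidia-settings only persists a layout when it is allowed to write xorg.conf as root, and a leftover GNOME monitor layout can silently override it at every login; a sketch covering both:
    $ sudo nvidia-settings            # arrange the displays, then use "Save to X Configuration File" -> /etc/X11/xorg.conf
    $ mv ~/.config/monitors.xml ~/.config/monitors.xml.bak   # stop the desktop session from re-applying its own stored layout
    $ sudo service lightdm restart    # restart X so the saved xorg.conf takes effect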

    Read the article

  • Detect multitouch (two-finger touch) on a sprite to apply pinch-zoom behaviour

    - by Tahreem
    I am using andengine, want to move, zoom and rotate multiple sprites individually on a scene. I have achieved "move" but for pinch zoom i am unable to get the event to two fingers' touch. Below is the code: public class Main extends SimpleBaseGameActivity { private Camera camera; private BitmapTextureAtlas mBitmapTextureAtlas; private ITextureRegion mFaceTextureRegion; private ITextureRegion mFaceTextureRegion2; Sprite face2; private static final int CAMERA_WIDTH = 800; private static final int CAMERA_HEIGHT = 480; @Override public EngineOptions onCreateEngineOptions() { camera = new Camera(0, 0, CAMERA_WIDTH, CAMERA_HEIGHT); EngineOptions engineOptions = new EngineOptions(true, ScreenOrientation.LANDSCAPE_FIXED, new RatioResolutionPolicy( CAMERA_WIDTH, CAMERA_HEIGHT), camera); return engineOptions; } @Override protected void onCreateResources() { BitmapTextureAtlasTextureRegionFactory.setAssetBasePath("gfx/"); this.mBitmapTextureAtlas = new BitmapTextureAtlas( this.getTextureManager(), 1024, 1600, TextureOptions.NEAREST); BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mBitmapTextureAtlas, // this, "ui_ball_1.png", 0, 0, 1, 1), // this.getVertexBufferObjectManager()); this.mFaceTextureRegion = BitmapTextureAtlasTextureRegionFactory .createFromAsset(this.mBitmapTextureAtlas, this, "ui_ball_1.png", 0, 0); this.mFaceTextureRegion2 = BitmapTextureAtlasTextureRegionFactory .createFromAsset(this.mBitmapTextureAtlas, this, "ui_ball_1.png", 0, 0); this.mBitmapTextureAtlas.load(); this.mEngine.getTextureManager().loadTexture(this.mBitmapTextureAtlas); } @Override protected Scene onCreateScene() { this.mEngine.registerUpdateHandler(new FPSLogger()); final Scene scene = new Scene(); scene.setBackground(new Background(0.09804f, 0.6274f, 0.8784f)); final float centerX = (CAMERA_WIDTH - this.mFaceTextureRegion .getWidth()) / 2; final float centerY = (CAMERA_HEIGHT - this.mFaceTextureRegion .getHeight()) / 2; final Sprite face = new Sprite(centerX, centerY, this.mFaceTextureRegion, this.getVertexBufferObjectManager()) { @Override public boolean onAreaTouched(final TouchEvent pSceneTouchEvent, final float pTouchAreaLocalX, final float pTouchAreaLocalY) { this.setPosition(pSceneTouchEvent.getX() - this.getWidth() / 2, pSceneTouchEvent.getY() - this.getHeight() / 2); return true; } }; face.setScale(2); scene.attachChild(face); scene.registerTouchArea(face); face2 = new Sprite(200, 200, this.mFaceTextureRegion2, this.getVertexBufferObjectManager()) { @Override public boolean onAreaTouched(final TouchEvent pSceneTouchEvent, final float pTouchAreaLocalX, final float pTouchAreaLocalY) { switch(pSceneTouchEvent.getAction()){ case TouchEvent.ACTION_DOWN: int count = pSceneTouchEvent.getMotionEvent().getPointerCount() ; for(int i= 0; i <count; i++) { int id = pSceneTouchEvent.getMotionEvent().getPointerId(i); } break; case TouchEvent.ACTION_MOVE: this.setPosition(pSceneTouchEvent.getX() -this.getWidth() / 2, pSceneTouchEvent.getY()-this.getHeight() / 2); break; case TouchEvent.ACTION_UP: break; } return true; } }; face2.setScale(2); scene.attachChild(face2); scene.registerTouchArea(face2); scene.setTouchAreaBindingOnActionDownEnabled(true); return scene; } } This line int count = pSceneTouchEvent.getMotionEvent().getPointerCount() ; should set 2 to the count variable if i touch the sprite with to fingers, then i can apply zooming functionality (setScale method) on the sprite by getting the distance between the coordinates of two fingers. Can anyone help me? why it does not detect two fingers? 
And without this, how can I zoom the sprite on a two-finger pinch? I am very new to game development; any help would be appreciated. Thanks in advance.

    Read the article

  • Hybrid graphics functionality won't work with my Asus UL30V anymore

    - by futuress
    The problem is that I am no longer able to boot in compatibility mode just to turn on my Nvidia graphics and install the driver, because no login screen appears while Ubuntu is loading. In Ubuntu 11.10 I was able to activate the 'nvidia graphics only' option this way: 1) Change the BIOS to 'compatibility mode', which turns off the Intel card. 2) Install the Nvidia proprietary driver using Ubuntu's driver finder (Additional Drivers) and then reboot. I was not interested in using only the Intel graphics just for the sake of battery life. Now I have both cards running and they drain my battery dramatically. And the main problem with this configuration is that no OpenGL is available, so I can't play any games any more. At this point I have a partial solution: I uninstalled the nvidia drivers and installed bumblebee, and now the Intel card is recognized. I would prefer to run just the nvidia card as in Ubuntu 11.10, but for now this is better than nothing. Does anybody else have the same problem?
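
    With bumblebee installed, the NVIDIA GPU stays powered down until a program is launched through optirun, which keeps the battery savings while restoring OpenGL for games; a short sketch (the game path is a made-up example):
    $ optirun glxinfo | grep -i "opengl renderer"   # should report the NVIDIA GPU rather than the Intel IGP
    $ optirun glxgears                              # quick OpenGL smoke test on the discrete card
    $ optirun wine ~/Games/SomeGame/game.exe        # hypothetical: run a game on the NVIDIA GPU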

    Read the article
