Search Results

Search found 12538 results on 502 pages for 'min width'.

Page 319 of 502

  • How to add a display resolution for an LCD in Ubuntu 12.04? xrandr problem

    - by SeregaI
    I am new to Ubuntu and Linux in general. I have installed Ubuntu 12.04 and am stuck trying to set up the correct resolution for my LCD display. The native resolution of the LCD is 1920x1080; here is the output from xrandr: $xrandr Screen 0: minimum 320 x 200, current 1280 x 720, maximum 4096 x 4096 LVDS1 connected 1280x720+0+0 (normal left inverted right x axis y axis) 0mm x 0mm 1280x720 60.0*+ 800x600 60.3 56.2 640x480 59.9 VGA1 disconnected (normal left inverted right x axis y axis) Then I create a new modeline: $ cvt 1920 1080 60 1920x1080 59.96 Hz (CVT 2.07M9) hsync: 67.16 kHz; pclk: 173.00 MHz Modeline "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync So far so good. Then I create the new mode using xrandr: $ xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync But for some reason the new mode was created for the VGA (VGA1) output instead of the LCD output (LVDS1): $ xrandr Screen 0: minimum 320 x 200, current 1280 x 720, maximum 4096 x 4096 LVDS1 connected 1280x720+0+0 (normal left inverted right x axis y axis) 0mm x 0mm 1280x720 60.0*+ 800x600 60.3 56.2 640x480 59.9 VGA1 disconnected (normal left inverted right x axis y axis) 1920x1080_60.00 (0xbc) 173.0MHz <---------- ????!!!!!! h: width 1920 start 2048 end 2248 total 2576 skew 0 clock 67.2KHz v: height 1080 start 1083 end 1088 total 1120 clock 60.0Hz So if I try to add the mode to LVDS1, I get an error: $ xrandr --addmode LVDS1 "1920x1080_60.00" X Error of failed request: BadMatch (invalid parameter attributes) Major opcode of failed request: 149 (RANDR) Minor opcode of failed request: 18 (RRAddOutputMode) Serial number of failed request: 25 Current serial number in output stream: 26 Adding the new mode to VGA1 works fine, but I don't use the VGA1 output.

    Read the article

  • Box2D `ApplyLinearImpulse` is not working whereas `SetLinearVelocity` works

    - by Narek
    I need to mimic jumping behavior for the player in my game. The player consists of two fixtures, with circle and rectangle shapes. I use the rectangle to detect the ground, and it is a sensor. At some point, to jump, I do this: float impulseY = body->GetMass() * PLAYER_JUMPING_VEOCITY / PTM_RATIO * std::sin(PLAYER_JUMPING_ANGLE * PI / 180); body->ApplyLinearImpulse(b2Vec2(0, impulseY), body->GetWorldCenter(), true); and the player does not jump. But when I do this: body->SetLinearVelocity(b2Vec2(0, PLAYER_JUMPING_VEOCITY / PTM_RATIO * std::sin(PLAYER_JUMPING_ANGLE * PI / 180))); my player jumps. Also, when I change the rectangle shape to a normal (non-sensor) shape, it works again. Why? Just in case, here are the parameters of my rectangular sensor: b2PolygonShape boxShape; boxShape.SetAsBox(width * 0.5/2/PTM_RATIO, height * 0.2/2/PTM_RATIO, b2Vec2(0, -height * 0.4 /PTM_RATIO), 0); b2FixtureDef boxFixtureDef; boxFixtureDef.friction = 0; boxFixtureDef.restitution = 0; boxFixtureDef.density = 1; boxFixtureDef.isSensor = true; boxFixtureDef.userData = static_cast<void*>(PLAYER_GROUP);
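
    For reference, the two calls are related by impulse = mass × Δvelocity, so an impulse of body->GetMass() * targetVelocity should produce roughly the same motion as setting the velocity directly, assuming the body starts at rest and the value is already in world units. Below is a minimal sketch of that jump logic written against the libGDX Box2D wrapper rather than the C++ API used above; PLAYER_JUMP_VELOCITY is a placeholder constant, not something from the question.

        import com.badlogic.gdx.math.Vector2;
        import com.badlogic.gdx.physics.box2d.Body;

        public final class JumpHelper {
            // Placeholder jump speed in metres per second (assumed, not from the question).
            static final float PLAYER_JUMP_VELOCITY = 10f;

            // impulse = mass * (target velocity - current velocity), which is why an
            // impulse-based jump and SetLinearVelocity should end up looking the same.
            static void jump(Body body) {
                float deltaVy = PLAYER_JUMP_VELOCITY - body.getLinearVelocity().y;
                body.applyLinearImpulse(new Vector2(0, body.getMass() * deltaVy),
                        body.getWorldCenter(), true);
            }
        }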

    Read the article

  • USB mouse pointer only moving horizontally on macbook 6.2 with 12.04

    - by Glyn Normington
    After installing Ubuntu 12.04 on a MacBook Pro 6,2, the touchpad and external USB mouse worked perfectly. After rebooting I can't get either the touchpad or the external USB mouse to work. Sometimes no mouse pointer is visible, but more often I can only move the mouse pointer horizontally five sixths of the way across the display (from the top left). I have uninstalled mouseemu. xinput list shows the USB mouse. xinput query-state for the USB mouse shows the following: ButtonClass button[1]=up ... button[16]=up ValuatorClass Mode=Relative Proximity=In valuator[0]=480 valuator[1]=2400 valuator[2]=0 valuator[3]=3 and re-issuing this command with the pointer at its right-hand extreme displays the same except for: valuator[0]=1679 So valuator[0] seems to be the x-coordinate of the pointer, and the range of motion 480-1679 is indeed about five sixths of the display width (1440). valuator[1] is suspiciously large given that the display height is 900. Perhaps this is a side effect of having previously used a dual-monitor setup (although booting with that monitor connected does not help). There are other entries listed under xinput list: Virtual core XTEST pointer, which seems stuck at position (840,1050), and bcm5974, which seems stuck at position (837,6700). Removing the bcm5974 module using rmmod disables the touchpad as expected but does not fix the USB mouse problem. After adding the module back, it is stuck at position (840,1050) instead of (837,6700). /etc/X11/xorg.conf was generated by nvidia-settings and contains: Section "InputDevice" # generated from default Identifier "Mouse0" Driver "mouse" Option "Protocol" "auto" Option "Device" "/dev/psaux" Option "ZAxisMapping" "4 5" although I don't know how plausible these settings are. Any suggestions?

    Read the article

  • AS3: How to check for non-transparent pixels in a BitmapData?

    - by Opoe
    I'm still working on my window cleaning game from one of my previous questions. I marked a contribution as my answer, but after all this time I can't get it to work, and I have too many questions about this, so I decided to ask some more. As a sequel to that previous question, my question to you is: How can I check whether or not a BitmapData contains non-transparent pixels? Subquestion: Is this possible when the masked image is a MovieClip? Shouldn't I use graphics instead? Information I have: a dirty window movieclip on the bottom layer and a clean window movieclip on layer 2 (mc1), the layer above. To hide the top layer (the dirty window) I assign a mask to it. Code // this creates a mask that hides the movieclip on top var mask_mc:MovieClip = new MovieClip(); addChild(mask_mc) //assign the mask to the movieclip it should 'cover' mc1.mask = mask_mc; With a brush (cursor) the player wipes off the dirt (actually setting the fill of the mask to transparent so the clean window appears) //add event listeners for the 'brush' brush_mc.addEventListener(MouseEvent.MOUSE_DOWN,brushDown); brush_mc.addEventListener(MouseEvent.MOUSE_UP,brushUp); //function to drag the brush over the mask function brushDown(dragging:MouseEvent):void{ dragging.currentTarget.startDrag(); MovieClip(dragging.currentTarget).addEventListener(Event.ENTER_FRAME,erase); mask_mc.graphics.moveTo(brush_mc.x,brush_mc.y); } //function to stop dragging the brush over the mask function brushUp(dragging:MouseEvent):void{ dragging.currentTarget.stopDrag(); MovieClip(dragging.currentTarget).removeEventListener(Event.ENTER_FRAME,erase); } //fill the mask with transparent pixels so the movieclip turns visible function erase(e:Event):void{ with(mask_mc.graphics){ beginFill(0x000000); drawRect(brush_mc.x,brush_mc.y,brush_mc.width,brush_mc.height); endFill(); } }
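
    As for the core check the question asks about, whether any non-transparent pixels are left, it comes down to scanning the alpha channel; in ActionScript that would go through the BitmapData API (for instance getPixel32 per pixel, or getColorBoundsRect with an alpha mask). Purely as a language-neutral illustration of the idea, here is a rough Java sketch against a BufferedImage; the method name and the threshold parameter are made up for the example.

        import java.awt.image.BufferedImage;

        public final class AlphaScan {
            // Returns true if any pixel's alpha is above the threshold.
            // A threshold of 0 means "any non-transparent pixel counts";
            // a higher value treats faint leftover dirt as already cleaned.
            public static boolean hasVisiblePixels(BufferedImage image, int threshold) {
                for (int y = 0; y < image.getHeight(); y++) {
                    for (int x = 0; x < image.getWidth(); x++) {
                        int alpha = image.getRGB(x, y) >>> 24; // top 8 bits of ARGB
                        if (alpha > threshold) {
                            return true;
                        }
                    }
                }
                return false;
            }
        }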

    Read the article

  • Triple-display setup using AMD drivers

    - by Halik
    I am currently running a dual display setup with nVidia 8800GTS video card, on a Ubuntu 12.10 box. The current setup uses nVidia TwinView to render the image on a 1920x1200 display and 1600x1200 one. I'm planning to add a third, 1280x1024 display to the setup. The change will require me to upgrade my GFX card to one supporting triple displays. I'll probably go with Sapphire Radeon 7770 (FLEX edition, to avoid additional active DP-DVI adapters). Before I invest in new GFX I wanted to ask - how well the AMD drivers will support such a setup. It does not matter whether it's fglrx or the OSS ones. If I remember correctly, when running Fedora on a Radeon x800, I had 'void' areas above and below the working area on my second display. The desktop was rendered in 1920+1280 width and 1200 height (which left 176px of vertical space accessible for my cursor and windows but not displayed on the screen - I'd prefer to avoid that). It may have very well been my misconfiguration back then. Generally, are there any solutions from AMD on par with TwinView? Or is it a non-issue at all? Also, I'm wondering about the usual stuff - hardware h264 decoding support, glitch-free flash support, any issues with Compiz/Unity?

    Read the article

  • Draw a contour around an object in OpenGL

    - by Maciekp
    I need to draw a contour around 2D objects in 3D space. I tried drawing lines around the object (plus points to fill the gaps), but due to the line width, some part of it (~50%) was covering the object. I tried to use the stencil buffer to eliminate this problem, but I got something like this (the contour is green): http://goo.gl/OI5uc (sorry, I can't post images due to my reputation). You can see (where the arrow points) that some parts of the line are behind the object and some are in front. This changes when I move the camera, but there is always some part covering it. Here is the code that I use for drawing the objects: glColorMask(1,1,1,1); std::list<CObjectOnScene*>::iterator objIter=ptr->objects.begin(),objEnd=ptr->objects.end(); int countStencilBit=1; while(objIter!=objEnd) { glColorMask(1,1,1,1); glStencilFunc(GL_ALWAYS,countStencilBit,countStencilBit); glStencilOp(GL_REPLACE,GL_KEEP,GL_REPLACE); (*objIter)->DrawYourVertices(); glStencilFunc(GL_NOTEQUAL,countStencilBit,countStencilBit); glStencilOp(GL_KEEP,GL_KEEP,GL_REPLACE); (*objIter)->DrawYourBorder(); ++objIter; ++countStencilBit; } I've tried different stencil buffer settings, but I always got something like that. My questions: 1. Am I setting up the stencil buffer wrong? 2. Are there any other simple ways to create a contour around such objects? Thanks in advance. EDIT: 1. I don't have the objects' normals. 2. The objects can be concave. 3. I can't use shaders (see below why).

    Read the article

  • Wireless keeps disabling or stays disconnected (Realtek RTL8191SEvB)

    - by jindrichm
    I have Realtek RTL8191SEvB wireless card on Ubuntu 10.10: $ lspci -v | grep Network 03:00.0 Network controller: Realtek Semiconductor Co., Ltd. RTL8191SEvB Wireless LAN Controller (rev 10) When I load its driver, according to the Network Manager it sometimes blinks with a list of available networks but it keeps disabling itself or it stays disconnected. So, I can't connect to any wi-fi network (which results in frustration). The driver is loaded: $ lsmod Module Size Used by r8192se_pci 509932 0 Looks normal: $ sudo lshw -C network *-network description: Wireless interface product: RTL8191SEvB Wireless LAN Controller vendor: Realtek Semiconductor Co., Ltd. physical id: 0 bus info: pci@0000:03:00.0 logical name: wlan0 version: 10 serial: 1c:65:9d:60:c7:7a width: 32 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=rtl819xSE driverversion=0019.1207.2010 firmware=63 latency=0 link=no multicast=yes wireless=802.11bgn resources: irq:17 ioport:2000(size=256) memory:f0500000-f0503fff Configured: $ sudo iwconfig wlan0 wlan0 802.11bgn Nickname:"rtl8191SEVA2" Mode:Managed Frequency=2.412 GHz Access Point: Not-Associated Bit Rate:130 Mb/s Retry:on RTS thr:off Fragment thr:off Encryption key:off Power Management:off Link Quality=10/100 Signal level=0 dBm Noise level=-100 dBm Rx invalid nwid:0 Rx invalid crypt:0 Rx invalid frag:0 Tx excessive retries:0 Invalid misc:0 Missed beacon:0 Is not blocked: $ rfkill list all 0: tpacpi_bluetooth_sw: Bluetooth Soft blocked: no Hard blocked: yes However something's happening with it: $ dmesg [ 6485.948668] InitializeAdapter8190(): ==++==> Turn off RF for RfOffReason(1073741824) ---------- [ 6486.062666] rtl8192_SetWirelessMode(), wireless_mode:10, bEnableHT = 1 [ 6486.062671] InitializeAdapter8192SE(): Set MRC settings on as default!! [ 6486.062675] HW_VAR_MRC: Turn on 1T1R MRC! [ 6486.064091] ADDRCONF(NETDEV_UP): wlan0: link is not ready [ 6486.248761] rtl8192_SetWirelessMode(), wireless_mode:10, bEnableHT = 1 [ 6486.248771] InitializeAdapter8192SE(): Set MRC settings on as default!! [ 6486.248776] HW_VAR_MRC: Turn on 1T1R MRC! [ 6486.580083] GPIOChangeRF - HW Radio OFF [ 6486.610085] ============>sync_scan_hurryup out [ 6486.623814] ================>r8192_wx_set_scan(): hwradio off [ 6486.830484] =========>r8192_wx_set_essid():hw radio off,or Rf state is eRfOff, return So, does anyone know where the problem might be?

    Read the article

  • Heightmap and Textures

    - by Robert
    I'm trying to find the "best way" to apply a texture to a heightmap with OpenGL 3.x. It's really hard to find something on Google because the tutorials are old and they all use different methods; I'm really lost and don't know what to use at all. Here is my code that generates the heightmap (it's basic): float[] vertexes = null; float[] textureCoords = null; for(int x = 0; x < this.m_size.width; x++) { for(int y = 0; y < this.m_size.height; y++) { vertexes ~= [x, 1.0f, y]; textureCoords ~= [cast(float)x / 50, cast(float)y / 50]; } } As you can see, I don't know how to apply the texture at all (I was using / 50 for my tests). Result of that code: I would like to have something very basic like this (you can find more pics in his blog). Edit: my texture size is 1024x1024.
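
    How the texture ends up looking mostly comes down to how the texture coordinates are generated: dividing by a fixed number, as the / 50 above does, tiles the texture once every 50 grid cells, while dividing by the grid size stretches it exactly once across the whole terrain. A rough Java sketch of the second option, assuming a width x height grid of vertices; the method and field layout are made up for the example.

        public final class HeightmapGrid {
            // Builds one (x, y, z, u, v) record per grid point. Because u and v run
            // from 0 to 1, the whole texture spans the terrain exactly once; replace
            // the divisor with a tile size (e.g. 50) to repeat it instead.
            public static float[] buildVertices(int width, int height) {
                float[] data = new float[width * height * 5];
                int i = 0;
                for (int x = 0; x < width; x++) {
                    for (int z = 0; z < height; z++) {
                        data[i++] = x;                           // position x
                        data[i++] = 1.0f;                        // flat height, as in the question
                        data[i++] = z;                           // position z
                        data[i++] = x / (float) (width - 1);     // u in [0, 1]
                        data[i++] = z / (float) (height - 1);    // v in [0, 1]
                    }
                }
                return data;
            }
        }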

    Read the article

  • Google Chrome on Ubuntu 12.04 not rendering webpages correctly

    - by sumit_gt
    I am facing some serious web page rendering issues with Chrome. They are more prominent during JavaScript-based animations and the like on websites like YouTube. I have tried removing Chrome (sudo apt-get purge google-chrome-stable) and then reinstalling it, but the problems still persist. The same web pages work correctly in Firefox on Ubuntu and in Chrome on Windows; the problem only shows up when I use Chrome on Ubuntu. I think the issue started after I updated to the latest version of Chrome; I had used Chrome previously on this machine without any problems. I have attached an image that demonstrates the issue. What could possibly be the problem? PS: here's the output of lshw -c video: *-display description: VGA compatible controller product: Madison [Radeon HD 5000M Series] vendor: Hynix Semiconductor (Hyundai Electronics) physical id: 0 bus info: pci@0000:01:00.0 version: 00 width: 64 bits clock: 33MHz capabilities: pm pciexpress msi vga_controller bus_master cap_list rom configuration: driver=fglrx_pci latency=0 resources: irq:46 memory:e0000000-efffffff memory:f0020000-f003ffff ioport:d000(size=256) memory:f0000000-f001ffff Here's the output of lspci -nn: output of lspci -nn

    Read the article

  • Wireless connection disconnects and reconnects with a Netgear WNA1000

    - by William Berkenkamp
    Ever since I made the permanent switch from Vista to Ubuntu I've had wireless connectivity problems. From watching the network manager when it disconnects, it seems like it turns off the receiver for some reason. Could it be bad drivers? I used their install software, and the site doesn't really offer driver downloads. The adapter is a Netgear WNA1000 if memory serves, and I don't know much about the router except that it's a Motorola Surfboard. And I figure this might help a bit: *-network description: Ethernet interface product: RTL8101E/RTL8102E PCI Express Fast Ethernet controller vendor: Realtek Semiconductor Co., Ltd. physical id: 0 bus info: pci@0000:01:00.0 logical name: eth0 version: 01 serial: 00:1b:b9:a7:39:a4 size: 10Mbit/s capacity: 100Mbit/s width: 64 bits clock: 33MHz capabilities: pm vpd msi pciexpress bus_master cap_list rom ethernet physical tp mii 10bt 10bt-fd 100bt 100bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=r8169 driverversion=2.3LK-NAPI duplex=half firmware=N/A latency=0 link=no multicast=yes port=MII speed=10Mbit/s resources: irq:40 ioport:d800(size=256) memory:feaff000-feafffff memory:feac0000-feadffff *-network description: Wireless interface physical id: 1 bus info: usb@2:1.1 logical name: wlan0 serial: 00:26:f2:8b:fb:38 capabilities: ethernet physical wireless configuration: broadcast=yes driver=carl9170 driverversion=3.2.0-24-generic-pae firmware=1.9.4 ip=10.0.0.36 link=yes multicast=yes wireless=IEEE 802.11bgn I have tried installing WICD and it didn't fix the problem. Any help would be greatly appreciated. This problem is greatly limiting what I can do with my computer.

    Read the article

  • Nifty default controls prevent the rest of my game from rendering

    - by zergylord
    I've been trying to add a basic HUD to my 2D LWJGL game using Nifty GUI, and while I've been successful in rendering panels and static text on top of the game, using the built-in Nifty controls (e.g. an editable text field) causes the rest of my game to not render. The strange part is that I don't even have to render the GUI control; merely declaring it appears to cause this problem. I'm truly lost here, so even the vaguest glimmer of hope would be appreciated :-) Some code showing the basic layout of the problem: display setup: // load default styles nifty.loadStyleFile("nifty-default-styles.xml"); // load standard controls nifty.loadControlFile("nifty-default-controls.xml"); screen = new ScreenBuilder("start") {{ layer(new LayerBuilder("baseLayer") {{ childLayoutHorizontal(); //next line causes the problem control(new TextFieldBuilder("input","asdf") {{ width("200px"); }}); }}); }}.build(nifty); nifty.gotoScreen("start"); rendering glMatrixMode(GL_PROJECTION); glLoadIdentity(); GLU.gluOrtho2D(0f,WINDOW_DIMENSIONS[0],WINDOW_DIMENSIONS[1],0f); //I can remove the 2 nifty lines, and the game still won't render nifty.render(true); nifty.update(); glMatrixMode(GL_PROJECTION); glLoadIdentity(); GLU.gluOrtho2D(0f,(float)VIEWPORT_DIMENSIONS[0],0f,(float)VIEWPORT_DIMENSIONS[1]); glTranslatef(translation[0],translation[1],0); for (Bubble bubble:bubbles){ bubble.draw(); } for (Wall wall:walls){ wall.draw(); } for(Missile missile:missiles){ missile.draw(); } for(Mob mob:mobs){ mob.draw(); } agent.draw();
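
    One thing worth ruling out with symptoms like this is leaking OpenGL state: building the default controls makes Nifty load textures and fonts, which can leave texturing, blending and the bound texture in a state the fixed-function game rendering above does not expect. This is only a guess, not a confirmed diagnosis, but a cheap way to isolate the GUI pass in LWJGL's compatibility pipeline is to save and restore the attribute state around the Nifty calls, roughly like this:

        import static org.lwjgl.opengl.GL11.*;
        import de.lessvoid.nifty.Nifty;

        final class GuiPass {
            // Renders the GUI without letting Nifty's texture/blend state leak
            // into the untextured fixed-function drawing that follows it.
            static void render(Nifty nifty) {
                glPushAttrib(GL_ALL_ATTRIB_BITS);
                nifty.render(true);
                nifty.update();
                glPopAttrib();
                glDisable(GL_TEXTURE_2D);   // game geometry is drawn untextured
                glColor4f(1f, 1f, 1f, 1f);  // reset the current colour
            }
        }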

    Read the article

  • Isometric Camera trouble - can't rotate or move correctly

    - by Deukalion
    I'm trying to create a 3D editor, but I've been having some trouble with the camera and understanding each component. I've created 2 cameras that work OK, but now I'm trying to implement an isometric camera in XNA, without success on the rotation and movement of the camera. All I get working is zoom. (Cube with x=3f, y=3f, z=1f in the center.) And this is the constructor for my IsometricCamera (it inherits from ICamera, with methods for Rotation, Movement and Zoom, and properties for the World/View/Projection matrices): public IsometricCamera3D(GraphicsDevice device, float startClip = -1000f, float endClip = 1000f) { matrix_projection = Matrix.CreateOrthographic(device.Viewport.Width, device.Viewport.Height, startClip, endClip); rotation = Vector3.Zero; matrix_view = Matrix.CreateScale(zoom) * Matrix.CreateRotationY(MathHelper.ToRadians(45 + 180)) * Matrix.CreateRotationX(MathHelper.ToRadians(30)) * Matrix.CreateRotationZ(MathHelper.ToRadians(120)) * Matrix.CreateTranslation(rotation.X, rotation.Y, rotation.Z); } The problem is that when I rotate it, all that happens is that the cube gets more or less shiny; it never actually rotates. What is wrong, and how should I create my View matrix to move and rotate it correctly? Rotate, Move and Zoom look like MethodName(Vector3 rotation/movement) and Zoom(float value); they just increase the values, then call an update to recreate the View matrix according to the code in the constructor. Currently, in my editor I use the middle button plus mouse movement to rotate the camera, but it's not working like the other cameras. In my default camera I use the World matrix to move, but I guess that's not the best way to go, which is why I'm trying this.

    Read the article

  • libGDX using Stage and Actor produces different camera angles on desktop and Android Phone

    - by Brandon
    libGDX using Stage and Actor produces different camera angles on desktop and on an Android phone. Here are pictures demonstrating the problem: http://brandonyuh.minus.com/mFpdTSgN17VUq On the desktop version, the image takes up most of the screen. On the Android phone it only takes up a small part of the screen. Here's the code (not my actual project, but I isolated the problem): package com.me.mygdxgame2; import com.badlogic.gdx.*; import com.badlogic.gdx.graphics.*; import com.badlogic.gdx.graphics.Texture.TextureFilter; import com.badlogic.gdx.graphics.g2d.*; import com.badlogic.gdx.scenes.scene2d.*; public class MyGdxGame2 implements ApplicationListener { private Stage stage; public void create() { stage = new Stage(); stage.addActor(new ActorHi()); } public void render() { Gdx.gl.glClearColor(0, 1, 0, 1); Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT); stage.draw(); } public void dispose() {} public void resize(int width, int height) {} public void pause() {} public void resume() {} public class ActorHi extends Actor { private Sprite sprite; public ActorHi() { Texture texture = new Texture(Gdx.files.internal("data/hi.png")); texture.setFilter(TextureFilter.Linear, TextureFilter.Linear); sprite = new Sprite(new TextureRegion(texture, 0, 0, 128, 128)); sprite.setBounds(0, 0, 300.0f, 300.0f); } public void draw(SpriteBatch batch, float parentAlpha) { sprite.draw(batch); } } } hi.png is included in the above link. Thank you very much for answering my question; I've spent 3 days trying to figure it out.
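
    A common explanation for this behaviour (a guess here, since the screenshots are not reproduced) is that a Stage created with no arguments sizes its viewport from the physical screen resolution, so a 300x300 sprite covers most of a small desktop window but only a corner of a high-resolution phone screen. Giving the stage a fixed virtual size makes both devices show the same picture; a sketch against a current libGDX release, where the 800x480 virtual size is an arbitrary choice:

        import com.badlogic.gdx.ApplicationAdapter;
        import com.badlogic.gdx.Gdx;
        import com.badlogic.gdx.graphics.GL20;
        import com.badlogic.gdx.scenes.scene2d.Stage;
        import com.badlogic.gdx.utils.viewport.FitViewport;

        public class FixedViewportGame extends ApplicationAdapter {
            private Stage stage;

            @Override
            public void create() {
                // A fixed virtual resolution instead of the device's pixel size,
                // so actors keep the same relative size on every screen.
                stage = new Stage(new FitViewport(800, 480));
                // stage.addActor(...) -- add the actor from the question here.
            }

            @Override
            public void resize(int width, int height) {
                // Rescale the virtual viewport to whatever the device provides.
                stage.getViewport().update(width, height, true);
            }

            @Override
            public void render() {
                Gdx.gl.glClearColor(0, 1, 0, 1);
                Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
                stage.act();
                stage.draw();
            }
        }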

    Read the article

  • Dim (NEARLY blank) laptop screen, secondary screen works - why?

    - by LIttle Ancient Forest Kami
    My laptop screen is (almost) black while my secondary screen is fine. I believe it to be backlight / brightness related. Problem description it starts when I start the laptop system loads and works fine, just screen has problems I can see the screen though very faintly / dimly - it's hard to see anything which ain't very white e.g. starting screen has big Thinkpad logo in white, large font - I can see it, though very dimly second screen works very well Official backligtht debugging: using acpi setting as prescribed there for Thinkpads didn't help I can see an entry in /sys/class/backlight/ and it changes when I press hotkeys for brightness (current backlight power for instance goes up or down) acpi-off didn't helpm neither did acpi_backlight=vendor Hardware data Laptop is Thinkpad Edge with glossy screen. 4 processors, 2 cores, exemplary CPU data from cat /proc/cpuinfo reports Genuine Intel i5 (M 480 @ 2.67GHz). OS is Ubuntu Lucid, 10.04 LTS, 64-bit, with Linux generic kernel (2.6.32-44) and GNOME 2.32.2 (though I doubt there lies the problem). $ lspci | grep VGA 01:00.0 VGA compatible controller: ATI Technologies Inc M92 [Mobility Radeon HD 4500 Series] $ lshw -C display *-display description: VGA compatible controller product: M92 [Mobility Radeon HD 4500 Series] vendor: ATI Technologies Inc physical id: 0 bus info: pci@0000:01:00.0 version: 00 width: 32 bits clock: 33MHz capabilities: pm pciexpress msi bus_master cap_list rom configuration: driver=radeon latency=0 resources: irq:33 memory:c0000000-dfffffff(prefetchable) ioport:2000(size=256) memory:f0300000-f030ffff memory:f0320000-f033ffff(prefetchable) Driver I was NOT running any proprietary drivers, just checked with "Hardware drivers". There is one for ATI that is suggested there, though I didn't need it so far. UPDATE: changing the driver to proprietary one (ATI/AMD FGLRX) didn't help. Tried and failed Resetting / running on power or battery / charging / getting rid of static electricity / warming up *doesn't help* This is NOT a blank-screen problem, at least it isn't following official Ubuntu black-screen diagnostics - I can see my screen, though barely. What I will try next: - check last updates I've made - IIRC I am running on nomodeset already, but will verify this Any ideas how to proceed best? What is most probable cause?

    Read the article

  • libgdx loading textures fails [duplicate]

    - by Chris
    This question already has an answer here: Why do I get this file loading exception when trying to draw sprites with libgdx? (4 answers) I'm trying to load my texture with playerTex = new Texture(Gdx.files.internal("player.jpg")); player.jpg is located under my-gdx-game-android/assets/data/player.jpg. I get an exception like this: Full code: @Override public void create() { camera = new OrthographicCamera(); camera.setToOrtho(false, Gdx.graphics.getWidth(), Gdx.graphics.getHeight()); batch = new SpriteBatch(); FileHandle file = Gdx.files.internal("player.jpg"); playerTex = new Texture(file); player = new Rectangle(); player.x = 800-20; player.y = 250; player.width = 20; player.height = 80; } @Override public void dispose() { // dispose of all the native resources playerTex.dispose(); batch.dispose(); } @Override public void render() { Gdx.gl.glClearColor(1, 1, 1, 1); Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT); camera.update(); batch.setProjectionMatrix(camera.combined); batch.begin(); batch.draw(playerTex, player.x, player.y); batch.end(); if(Gdx.input.isKeyPressed(Keys.DOWN)) player.y -= 50 * Gdx.graphics.getDeltaTime(); if(Gdx.input.isKeyPressed(Keys.UP)) player.y += 50 * Gdx.graphics.getDeltaTime(); }
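
    One detail visible in the question itself: the file sits under the Android project's assets/data/ folder, but the code asks for it at the root of the internal path. Gdx.files.internal resolves paths relative to assets/, so the load would need the data/ prefix; a minimal sketch of that change (the wrapper class is only for illustration):

        import com.badlogic.gdx.Gdx;
        import com.badlogic.gdx.graphics.Texture;

        public final class PlayerAssets {
            // assets/ is the root of internal paths, so a file stored at
            // assets/data/player.jpg is addressed as "data/player.jpg".
            public static Texture loadPlayer() {
                return new Texture(Gdx.files.internal("data/player.jpg"));
            }
        }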

    Read the article

  • Rich snippet for Google Custom Search - Schema.org

    - by Joesoc
    I am trying to extract the book URL from a link using microdata. The format is specified in schema.org. Here is my HTML: <div class="col-sm-4 col-md-3" itemscope itemtype="http://schema.org/Book"> <div class="thumbnail"> <img src="{{ book.thumbnailurl }}" itemprop="thumbnailUrl" style="width: 100px;height: 200px;"> <div class="caption"> <h4><span itemprop="name">{{ book.name }}</span> - <span itemprop="author">{{ book.author }}</span></h4> <p><span itemprop="about"> {{ book.about }}</span></p> <p> <a href="{{ book.url }}" itemprop="url" onclick="trackOutboundLink(‘{{ book.name }}’);"> <button type="button" class="btn btn-default btn-md"> <span class="glyphicon glyphicon-book"></span>Read </button> </a> </p> </div> </div> </div> When I use the Google snippet testing tool, the JSON API returns the book as an HTML link. However, when I make the call in JavaScript, the value of url is text("Read"). What am I missing?

    Read the article

  • Rendering a DOM across multiple displays

    - by meetamit
    I'm building a data-driven animation with HTML and javascript to run in a web browser. I would like to display it tiled across three 1080p monitors. This essentially yields a viewport that's 5760px wide and 1080px tall. Pretty large. Does anyone have experience setting up something like this? I have many questions below, but any tip would be appreciated: Is it reasonable to expect a DOM to render into such a large viewport size at close to 60fps? I might choose to use canvas, instead of SVG or HTML, but that would yield a giant canvas. Can a canvas with such high resolution be performant? Of course everything depends on the complexity of the graphics I want to render, but I'm looking to remove that factor from this question, so assume I'm asking about a canvas animation that can run at 60fps at 1920x1080 resolution. Would it run roughly as fast at 3 times the width? Would three.js and WebGL be a more proper approach at that resolution? How do you actually cause Chrome or FF to span 3 monitors at full screen? Do I need a 3rd party solution of any kind? Thanks!

    Read the article

  • Facing a character towards the mouse

    - by ratata
    I'm trying to port a simple 2D top-down shooter game from C++ (Allegro) to Java, and I'm having problems with rotating my character. Here's the code I used in C++: if (keys[A]) RotateRight(player, degree); if (keys[D]) RotateLeft(player, degree); void RotateLeft(Player& player, float& degree) { degree += player.rotatingSpeed; if ( degree >= 360 ) degree = 0; } void RotateRight(Player& player, float& degree) { degree -= player.rotatingSpeed; if ( degree <= 0) degree = 360; } And this is what I have in the render section: al_draw_rotated_bitmap(player.image, player.frameWidth / 2, player.frameHeight / 2, player.x, player.y, degree * 3.14159 / 180, 0); Instead of using the A and D keys, I want to use the mouse this time. I've been searching since last night and have come across a few code samples, but none of them worked. For example, this just made my character circle around the map: int centerX = width / 2; int centerY = height / 2; double angle = Math.atan2(centerY - mouseY, centerX - mouseX) - Math.PI / 2; ((Graphics2D)g).rotate(angle, centerX, centerY); g.fillRect(...); // draw your rectangle Any help is much appreciated.
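
    Since the target is plain Java 2D, the angle can come straight from Math.atan2 between the player's position and the mouse, and the rotation should be applied around the player rather than the screen centre - rotating about the screen centre is most likely why the character circles the map in the snippet above. A rough sketch, assuming the sprite artwork points to the right at zero rotation; playerX, playerY and the class name are placeholders.

        import java.awt.Graphics2D;
        import java.awt.geom.AffineTransform;
        import java.awt.image.BufferedImage;

        public final class FacingRenderer {
            // Draws the sprite rotated so it points from (playerX, playerY) towards the mouse.
            // If the artwork faces "up" instead of "right", add Math.PI / 2 to the angle.
            public static void drawFacingMouse(Graphics2D g, BufferedImage sprite,
                                               double playerX, double playerY,
                                               double mouseX, double mouseY) {
                double angle = Math.atan2(mouseY - playerY, mouseX - playerX);
                AffineTransform saved = g.getTransform();
                g.rotate(angle, playerX, playerY);          // rotate about the player, not the screen centre
                g.drawImage(sprite,
                        (int) (playerX - sprite.getWidth() / 2.0),
                        (int) (playerY - sprite.getHeight() / 2.0),
                        null);
                g.setTransform(saved);                      // restore for everything drawn afterwards
            }
        }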

    Read the article

  • Ubuntu 14.04 install fails with Via S3 UniChrome Pro graphics

    - by WizardNo.7
    I am trying to install Ubuntu 14.04 on a Fujitsu Siemens Amilo Pro laptop (it's quite old, yes: it has about a 30 GB hard drive and I think 192 MB of RAM) which currently has Windows XP installed (which I'd like to keep for the time being). I have downloaded the 32-bit Desktop ISO and used unetbootin to create a live USB for this laptop. When I boot from USB, I arrive at the unetbootin grey and blue menu and pick either "Try Ubuntu without installing" or "Install Ubuntu". The menu goes away and an Ubuntu load screen appears, showing UBUNTU and four dots which progressively change between white and orange. At about the second color-changing cycle, a white underscore symbol appears next to the fourth dot and flickers. There is some leftover text from the kernel boot still visible, but there is no graphical desktop. After this I have to hard reboot or shut down. $ lshw -c video *-display UNCLAIMED description: VGA compatible controller product: CN700/P4M800 Pro/P4M800 CE/VN800 Graphics [S3 UniChrome Pro] vendor: VIA Technologies, Inc. physical id: 0 bus info: pci@0000:01:00.0 version: 01 width: 32 bits clock: 66MHz capabilities: pm agp agp-2.0 vga_controller bus_master cap_list configuration: latency=64 mingnt=2 resources: memory:f0000000-f3ffffff memory:d1000000-d1ffffff

    Read the article

  • Can't get Unity 3D to work in 11.10

    - by pmoseph
    I recently upgraded to 11.10 on my Lenovo ThinkPad T520, and I'm not able to load Unity 3D (I'm not selecting 2D at login menu either). me@mycomp:~$ echo $DESKTOP_SESSION ubuntu-2d I ran the unity support test below as well. me@mycomp:~$ /usr/lib/nux/unity_support_test -p Xlib: extension "GLX" missing on display ":0.0". Xlib: extension "GLX" missing on display ":0.0". Xlib: extension "GLX" missing on display ":0.0". Error: unable to create the OpenGL context And it looks like I only have one graphics card: me@mycomp:~$ lspci | grep VGA 00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09) Also, Ubuntu lists nothing under the "Additional Drivers" window. Any help would be extremely appreciated as I'm somewhat of a noob. Thanks! Edit 1: Here is the output of lshw -C display me@mycomp:~$ sudo lshw -C display *-display description: VGA compatible controller product: 2nd Generation Core Processor Family Integrated Graphics Controller vendor: Intel Corporation physical id: 2 bus info: pci@0000:00:02.0 version: 09 width: 64 bits clock: 33MHz capabilities: msi pm vga_controller bus_master cap_list rom configuration: driver=i915 latency=0 resources: irq:43 memory:f0000000-f03fffff memory:e0000000-efffffff ioport:5000(size=64)

    Read the article

  • How can I define custom keyboard mappings to resize, move, and manage windows?

    - by fumon
    I just returned to ubuntu (13.04) after a year using OS X exclusively. I love the improvements that have come to ubuntu and unity, and I'm glad to be back. There's just one thing, though... Slate is a simple OS X tool that allows users to quickly create powerful keyboard macros and really take advantage of their screen space. I have to say I was spoiled by it. Even on a tiny laptop, my workflow was never interrupted by changing workspaces or leaving the keyboard to adjust a window, because perfect adjustment was a keystroke or two away. For example: bind h:ctrl;alt;cmd resize -10% +0 # this increases the window's left width by 10% bind h:shift;alt nudge -10% +0 # this moves the window left by 10% You make a big config file, and like vim, tmux, and everything else, it just becomes muscle memory. I can't seem to find a way to achieve anything close to this in linux or ubuntu. I've tried to make do with compiz window settings and the built-in stuff Ubuntu offers, but it's not even in the same realm. Although to be fair, this level of tuning isn't something most people care about. Thanks, guys. :) Any feedback would be appreciated.

    Read the article

  • OpenGL depth texture wrong

    - by CoffeeandCode
    I have been writing a game engine for a while now and have decided to reconstruct my positions from depth... but how I read the depth seems to be wrong. :/ What is wrong in my rendering? How I init my depth texture in the FBO: gl::BindTexture(gl::TEXTURE_2D, this->textures[0]); // Depth gl::TexImage2D( gl::TEXTURE_2D, 0, gl::DEPTH32F_STENCIL8, width, height, 0, gl::DEPTH_STENCIL, gl::FLOAT_32_UNSIGNED_INT_24_8_REV, nullptr ); gl::TexParameterf(gl::TEXTURE_2D, gl::TEXTURE_MAG_FILTER, gl::NEAREST); gl::TexParameterf(gl::TEXTURE_2D, gl::TEXTURE_MIN_FILTER, gl::NEAREST); gl::TexParameterf(gl::TEXTURE_2D, gl::TEXTURE_WRAP_S, gl::CLAMP_TO_EDGE); gl::TexParameterf(gl::TEXTURE_2D, gl::TEXTURE_WRAP_T, gl::CLAMP_TO_EDGE); gl::FramebufferTexture2D( gl::FRAMEBUFFER, gl::DEPTH_STENCIL_ATTACHMENT, gl::TEXTURE_2D, this->textures[0], 0 ); Linear depth readings in my shader. Vertex: #version 150 layout(location = 0) in vec3 position; layout(location = 1) in vec2 uv; out vec2 uv_f; void main(){ uv_f = uv; gl_Position = vec4(position, 1.0); } Fragment (where the issue probably is): #version 150 uniform sampler2D depth_texture; in vec2 uv_f; out vec4 Screen; void main(){ float n = 0.00001; float f = 100.0; float z = texture(depth_texture, uv_f).x; float linear_depth = (n * z)/(f - z * (f - n)); Screen = vec4(linear_depth); // It ISN'T because I don't separate alpha } When rendered: So, gamedev.stackexchange, what's wrong with my rendering/GLSL?

    Read the article

  • Wifi doesn't connect if I'm upstairs

    - by user11233
    Hello there, I'm finding that Wifi connects when I'm in the living room, where the internet router etc. is, but if I try to connect upstairs it's a no-go. Booting into Windows 7 (dual boot :D ) I find that Wifi connects. I find this very confusing and have had it since the install. Please help! Kind regards. #sudo lshw -c network *-network description: Wireless interface product: PRO/Wireless 5100 AGN [Shiloh] Network Connection vendor: Intel Corporation physical id: 0 bus info: pci@0000:02:00.0 logical name: wlan0 version: 00 serial: 00:21:5d:1a:63:58 width: 64 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=iwlagn driverversion=2.6.35-26-generic firmware=8.24.2.12 ip=10.0.0.4 latency=0 link=yes multicast=yes wireless=IEEE 802.11abg resources: irq:49 memory:de200000-de201fff Hope it helps! Hello guys, I just tried to connect to the internet upstairs and to my surprise it does connect. I don't know exactly why. The only thing I've done recently is enter the codes given by Oli. Two things I've done on my own: 1: installed Jupiter and 2: installed laptop-mode-tools. I really don't know if this can affect my internet, but it's the only thing I can think of. Thanks for all the help, really appreciate it :) PS: If someone thinks it might be helpful for others, I would like to help find a 'real' solution to this problem. Just post what I need to post here and I will try my best to help.

    Read the article

  • How do I properly center Nifty GUI elements on screen?

    - by Jason Crosby
    I am new to the JME3 game engine, but I know Android XML GUI layouts pretty well. I have a simple layout here and I can't figure out what is wrong. Here is my XML code: <?xml version="1.0" encoding="UTF-8"?> <nifty xmlns="http://nifty-gui.sourceforge.net/nifty-1.3.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://nifty-gui.sourceforge.net/nifty-1.3.xsd http://nifty-gui.sourceforge.net/nifty-1.3.xsd"> <useControls filename="nifty-default-controls.xml" /> <useStyles filename="nifty-default-styles.xml" /> <screen id="start" controller="com.jasoncrosby.game.farkle.gui.MenuScreenGui"> <layer id="layer" backgroundColor="#66CD00" childLayout="center"> <panel id="panel" align="center" valign="center" childLayout="center" visibleToMouse="true"> <image filename="Textures/wood_floor.png" height="95%" width="95%"/> <panel id="panel" align="center" valign="center" childLayout="center" visibleToMouse="true"> <text text="test" font="Interface/Fonts/Eraser.fnt"></text> </panel> </panel> </layer> </screen> Everything works well until I get to displaying the text. I have tried different alignments and tried moving the text into different panels, but no matter what I do the text is never in the center of the screen. It's always in the upper left corner; so far I can only see the lower right part of the text. How can I center the text element on the screen?

    Read the article

  • XNA - Inconsistent accessibility: parameter type is less accessible than method

    - by DijkeMark
    I have a Level class in which I make a new turret. I give the turret the level class as a parameter. So far so good. Then in the Update function of the Turret I call a function Shoot(), which takes the level parameter it got at the moment I created it. But from that moment it gives the following error: Inconsistent accessibility: parameter type 'Space_Game.Level' is less accessible than method 'Space_Game.GameObject.Shoot(Space_Game.Level, string)' All I know is that it has something to do with not having the right protection level or something like that. The Level class: public Level(Game game, Viewport viewport) { _game = game; _viewport = viewport; _turret = new Turret(_game, "blue", this); _turret.SetPosition((_viewport.Width / 2).ToString(), (_viewport.Height / 2).ToString()); } The Turret class: public Turret(Game game, String team, Level level) :base(game) { _team = team; _level = level; switch (_team) { case "blue": _texture = LoadResources._blue_turret.Texture; _rows = LoadResources._blue_turret.Rows; _columns = LoadResources._blue_turret.Columns; _maxFrameCounter = 10; break; default: break; } _frameCounter = 0; _currentFrame = 0; _currentFrameMultiplier = 1; } public override void Update() { base.Update(); SetRotation(); Shoot(_level, "turret"); } The Shoot function (which is in the GameObject class; the Turret class inherits from the GameObject class - am I saying that right?): protected void Shoot(Level level, String type) { MouseState mouse = Mouse.GetState(); if (mouse.LeftButton == ButtonState.Pressed) { switch (_team) { case "blue": switch (type) { case "turret": TurretBullet _turretBullet = new TurretBullet(_game, _team); level.AddProjectile(_turretBullet); break; default: break; } break; default: break; } } } Thanks in advance, Mark Dijkema

    Read the article
