Search Results

Search found 2187 results on 88 pages for 'combined fragment'.

Page 33/88 | < Previous Page | 29 30 31 32 33 34 35 36 37 38 39 40  | Next Page >

  • Deferred rendering order?

    - by Nick Wiggill
    There are some effects for which I must do multi-pass rendering. I've got the basics set up (FBO rendering etc.), but I'm trying to get my head around the most suitable setup. Here's what I'm thinking... The framebuffer objects: FBO 1 has a color attachment and a depth attachment. FBO 2 has a color attachment. The render passes: Render g-buffer: normals and depth (used by outline & DoF blur shaders); output to FBO no. 1. Render solid geometry, bold outlines (as in toon shader), and fog; output to FBO no. 2. (can all render via a single fragment shader -- I think.) (optional) DoF blur the scene; output to the default frame buffer OR ELSE render FBO2 directly to default frame buffer. (optional) Mesh wireframes; composite over what's already in the default framebuffer. Does this order seem viable? Any obvious mistakes?
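
    As a rough illustration of the first pass described above (a sketch, not the asker's code; the varying and uniform names here are assumptions), the g-buffer fragment shader writing to FBO 1 could pack the view-space normal into the color attachment's RGB channels and a normalized linear depth into its alpha channel:

        // Hypothetical g-buffer pass fragment shader for FBO 1.
        // v_normal, v_viewPos and u_far are illustrative names, not taken from the question.
        #version 120
        varying vec3 v_normal;    // view-space normal, interpolated from the vertex shader
        varying vec3 v_viewPos;   // view-space position, interpolated from the vertex shader
        uniform float u_far;      // camera far-plane distance

        void main()
        {
            vec3 n = normalize(v_normal) * 0.5 + 0.5;   // remap [-1, 1] to [0, 1] for storage
            float depth = -v_viewPos.z / u_far;         // 0 at the camera, 1 at the far plane
            gl_FragColor = vec4(n, depth);              // goes to FBO 1's color attachment
        }

    The outline and DoF passes can then sample this texture; whether depth also has to live in the color attachment or can be sampled straight from the depth attachment depends on the target GL version and extensions.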

    Read the article

  • Is a 302 redirect to a random URL from the homepage an SEO problem?

    - by CookieMonster
    I originally posted this on Stack Overflow, but I believe this is a better place to ask. My web application is very similar to notepad.cc, which redirects to a randomly generated URL upon access, e.g. http://myapp.com/roTr94h4Gd. (Please note that notepad.cc is not my site.) Probably because of this redirect feature, when I do "fetch as Google" or "fetch as Bingbot", I get a 302 and no HTML content -- not even an <html></html> tag. HTTP/1.1 302 Moved Temporarily Server: nginx/1.4.1 Date: Tue, 01 Oct 2013 04:37:37 GMT Content-Type: text/html Transfer-Encoding: chunked Connection: keep-alive X-Powered-By: PHP/5.4.17-1~dotdeb.1 Set-Cookie: PHPSESSID=vp99q5e5t5810e3bnnnvi6sfo2; expires=Thu, 03-Oct-2013 04:37:37 GMT; path=/ Expires: Thu, 19 Nov 1981 08:52:00 GMT Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0 Pragma: no-cache Location: /roTr94h4Gd How should I avoid the 302 in this case? I suppose I could modify my site to prevent the redirect, but generating a random URL on each access is a necessary feature of my web app. I added the <meta name="fragment" content="!"> tag to my index page and set it to return a static snapshot of the page when the flag is set, but this still returns a 302. I also added a header to return 200 before redirecting, but this had no effect either. Could someone suggest a good way to solve this problem?

    Read the article

  • Questions before I revamp my rendering engine to use shaders (GLSL)

    - by stephelton
    I've written a fairly robust rendering engine using OpenGL ES 1.1 (fixed-function). I've been looking into revamping the engine to use OpenGL ES 2.0, which necessitates that I use shaders. I've been absorbing information all day long and still have some questions. Firstly, lighting. The fixed-function pipeline is guaranteed to have at least 8 lights available. My current engine finds lights that are "close" to the primitives being drawn and enables them; I don't know how many lights are going to be enabled until I draw a given model. Nothing is dynamically allocated in GLSL, so I have to define some fixed number of lights in the shader, right? So if I want to stick with 8, should I write my general-purpose shader to have 8 lights and then use uniforms to tell it how many / which lights to use? Which brings me to another question: should I be concerned with the amount of data I'm allocating in a shader? Recent video cards have hundreds of "stream processors." If I've got a fragment shader being used on some number of fragments in a given triangle, I assume they must each have their own stack to work on. Are read-only variables copied there, or read when needed? My initial goal is to rework my code so that it is virtually identical to the current implementation. What I have in mind is to create my own matrix stack so that I can implement something along the lines of push/popMatrix and apply all my translations, rotations, and scales to this matrix, then provide the matrix to the vertex shader so that it can transform vertices very quickly. Is this approach sound? Edit: My original intention was to ask if there was a tutorial that would explain the bare minimum necessary to jump from fixed-function to using shaders. Thanks!
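
    For the lighting question specifically, one common pattern (sketched here under the assumption of simple per-fragment diffuse lighting; the struct and uniform names are made up, not from the engine) is to size the light array at its maximum and pass the number of active lights as a uniform per draw call:

        #version 100
        precision mediump float;

        struct Light {
            vec3 position;   // view-space position
            vec3 color;
        };

        uniform Light u_lights[8];   // sized at compile time; GLSL cannot allocate dynamically
        uniform int   u_numLights;   // how many of the 8 are active for this model

        varying vec3 v_normal;
        varying vec3 v_position;

        void main()
        {
            vec3 N = normalize(v_normal);
            vec3 result = vec3(0.0);
            for (int i = 0; i < 8; ++i) {                          // constant loop bound, as ES 2.0 prefers
                float enabled = (i < u_numLights) ? 1.0 : 0.0;     // ignore lights beyond the active count
                vec3 L = normalize(u_lights[i].position - v_position);
                result += enabled * u_lights[i].color * max(dot(N, L), 0.0);
            }
            gl_FragColor = vec4(result, 1.0);
        }

    Unused array slots simply contribute nothing, so the CPU side only has to upload the lights it actually found near the model.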

    Read the article

  • Views : ViewControllers, many to one, or one to one?

    - by conor
    I have developed an Android application where, typically, each view (layout.xml) displayed on the screen has its own corresponding fragment (for the purpose of this question I may refer to this as a ViewController). These views and Fragments/ViewControllers are appropriately named to reflect what they display. This has the effect of allowing the programmer to easily pinpoint the files associated with what they see on any given screen. The above refers to the one-to-one part of my question. Please note that with the above there are a few exceptions where very similar content is displayed on two views, so one ViewController is used for both views (using a simple switch (type) to determine which layout.xml file to load). On the flip side, I am currently working on the iOS version of the same app, which I didn't develop. It seems that they are adopting more of a one-to-many (ViewController:View) approach. There appears to be one ViewController that handles the display logic for many different types of views. In the ViewController is an assortment of boolean flags and arrays of data (to be displayed) that are used to determine which view to load and how to display it. This seems very cumbersome to me, and coupled with no comments and ambiguous variable names I am finding it very difficult to implement changes in the project. What do you guys think of the two approaches? Which one would you prefer? I'm really considering putting in some extra time at work to refactor the iOS app into a more 1:1-oriented approach. My reasoning for 1:1 over M:1 is modularity and legibility. After all, don't some people measure the quality of code by how easy it is for another developer to pick up the reins, or how easy it is to pull a piece of code and use it somewhere else?

    Read the article

  • Low Graphics and laggy screen after update to 13.10 from 13.04

    - by Wh0RU
    After updating from 13.04 to 13.10, much of the Unity 3D functionality has stopped working. I am using a Samsung Series 3 (NP350V5X) laptop which has switchable Intel and AMD Radeon HD 7670M graphics. xserver-xorg-video-ati does work, but with NO 3D support and very low graphics performance. [I am currently using this] fglrx & fglrx-updates show a blank screen after login. The Intel graphics installer doesn't work either (dependency error). Output of $ sudo lshw -c video *-display description: VGA compatible controller product: Thames [Radeon HD 7500M/7600M Series] vendor: Advanced Micro Devices, Inc. [AMD/ATI] physical id: 0 bus info: pci@0000:01:00.0 version: 00 width: 64 bits clock: 33MHz capabilities: pm pciexpress msi vga_controller bus_master cap_list rom configuration: driver=radeon latency=0 resources: irq:45 memory:e0000000-efffffff memory:c0120000-c013ffff ioport:3000(size=256) memory:c0100000-c011ffff *-display description: VGA compatible controller product: 3rd Gen Core processor Graphics Controller vendor: Intel Corporation physical id: 2 bus info: pci@0000:00:02.0 version: 09 width: 64 bits clock: 33MHz capabilities: msi pm vga_controller bus_master cap_list rom configuration: driver=i915 latency=0 resources: irq:46 memory:bfc00000-bfffffff memory:d0000000-dfffffff ioport:4000(size=64) Similarly $ /usr/lib/nux/unity_support_test -p OpenGL vendor string: VMware, Inc. OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 3.3, 256 bits) OpenGL version string: 2.1 Mesa 9.2.1 Not software rendered: no Not blacklisted: yes GLX fbconfig: yes GLX texture from pixmap: yes GL npot or rect textures: yes GL vertex program: yes GL fragment program: yes GL vertex buffer object: yes GL framebuffer object: yes GL version is 1.4+: yes Unity 3D supported: no This shows no 3D support. Can anyone please advise how to make things work again?

    Read the article

  • LibGDX onTouch() method Array and flip method

    - by johnny-b
    How can I add this on my application. i want to use the onTouch() method from the implementation of the InputProcessor to kill the enemies on screen. how do i do that? do i have to do anything to the enemy class? also i am trying to add a Array of enemies and it keeps throwing exceptions or the bullet now is facing LEFT <--- again after I used the flip method in the bullet class. All the code is below so please anyone feel free to have a look thanks. please help Thank you M // This is the bullet class. public class Bullet extends Sprite { public static final float BULLET_HOMING = 6000; public static final float BULLET_SPEED = 300; private Vector2 velocity; private float lifetime; private Rectangle bul; public Bullet(float x, float y) { velocity = new Vector2(0, 0); setPosition(x, y); AssetLoader.bullet1.flip(true, false); AssetLoader.bullet2.flip(true, false); setSize(AssetLoader.bullet1.getWidth(), AssetLoader.bullet1.getHeight()); bul = new Rectangle(); } public void update(float delta) { float targetX = GameWorld.getBall().getX(); float targetY = GameWorld.getBall().getY(); float dx = targetX - getX(); float dy = targetY - getY(); float distToTarget = (float) Math.sqrt(dx * dx + dy * dy); dx /= distToTarget; dy /= distToTarget; dx *= BULLET_HOMING; dy *= BULLET_HOMING; velocity.x += dx * delta; velocity.y += dy * delta; float vMag = (float) Math.sqrt(velocity.x * velocity.x + velocity.y * velocity.y); velocity.x /= vMag; velocity.y /= vMag; velocity.x *= BULLET_SPEED; velocity.y *= BULLET_SPEED; bul.set(getX(), getY(), getOriginX(), getOriginY()); Vector2 v = velocity.cpy().scl(delta); setPosition(getX() + v.x, getY() + v.y); setOriginCenter(); setRotation(velocity.angle()); } public Rectangle getBounds() { return bul; } public Rectangle getBounds1() { return this.getBoundingRectangle(); } } // This is the class where i load all the images from public class AssetLoader { public static Texture texture; public static TextureRegion bg, ball1, ball2; public static Animation bulletAnimation, ballAnimation; public static Sprite bullet1, bullet2; public static void load() { texture = new Texture(Gdx.files.internal("SpriteN1.png")); texture.setFilter(TextureFilter.Nearest, TextureFilter.Nearest); bg = new TextureRegion(texture, 80, 421, 395, 30); bg.flip(false, true); ball1 = new TextureRegion(texture, 0, 321, 32, 32); ball1.flip(false, true); ball2 = new TextureRegion(texture, 32, 321, 32, 32); ball2.flip(false, true); bullet1 = new Sprite(texture, 380, 350, 45, 20); bullet1.flip(false, true); bullet2 = new Sprite(texture, 425, 350, 45, 20); bullet2.flip(false, true); TextureRegion[] balls = { ball1, ball2 }; ballAnimation = new Animation(0.16f, balls); ballAnimation.setPlayMode(Animation.PlayMode.LOOP); } Sprite[] bullets = { bullet1, bullet2 }; bulletAnimation = new Animation(0.06f, aims); bulletAnimation.setPlayMode(Animation.PlayMode.LOOP); } public static void dispose() { texture.dispose(); } // This is for the rendering or drawing onto the screen/canvas. 
public class GameRenderer { private Bullet bullet; private Ball ball; public GameRenderer(GameWorld world) { myWorld = world; cam = new OrthographicCamera(); cam.setToOrtho(true, 480, 320); batcher = new SpriteBatch(); // Attach batcher to camera batcher.setProjectionMatrix(cam.combined); shapeRenderer = new ShapeRenderer(); shapeRenderer.setProjectionMatrix(cam.combined); // Call helper methods to initialize instance variables initGameObjects(); initAssets(); } private void initGameObjects() { ball = GameWorld.getBall(); bullet = myWorld.getBullet(); scroller = myWorld.getScroller(); } private void initAssets() { ballAnimation = AssetLoader.ballAnimation; bulletAnimation = AssetLoader.bulletAnimation; } public void render(float runTime) { Gdx.gl.glClearColor(0, 0, 0, 1); Gdx.gl.glClear(GL30.GL_COLOR_BUFFER_BIT); batcher.begin(); batcher.disableBlending(); batcher.enableBlending(); batcher.draw(AssetLoader.ballAnimation.getKeyFrame(runTime), ball.getX(), ball.getY(), ball.getWidth(), ball.getHeight()); batcher.draw(AssetLoader.bulletAnimation.getKeyFrame(runTime), bullet.getX(), bullet.getY(), bullet.getOriginX(), bullet.getOriginY(), bullet.getWidth(), bullet.getHeight(), 1.0f, 1.0f, bullet.getRotation()); // End SpriteBatch batcher.end(); } } // this is to load the image etc on the screen i guess public class GameWorld { public static Ball ball; private Bullet bullet; private ScrollHandler scroller; public GameWorld() { ball = new Ball(480, 273, 32, 32); bullet = new Bullet(10, 10); scroller = new ScrollHandler(0); } public void update(float delta) { ball.update(delta); bullet.update(delta); scroller.update(delta); } public static Ball getBall() { return ball; } public ScrollHandler getScroller() { return scroller; } public Bullet getBullet() { return bullet; } } //This is the input handler class public class InputHandler implements InputProcessor { private Ball myBall; private Bullet bullet; private GameRenderer aims; // Ask for a reference to the Soldier when InputHandler is created. public InputHandler(Ball ball) { myBall = ball; } @Override public boolean touchDown(int screenX, int screenY, int pointer, int button) { return false; } @Override public boolean keyDown(int keycode) { return false; } @Override public boolean keyUp(int keycode) { return false; } @Override public boolean keyTyped(char character) { return false; } @Override public boolean touchUp(int screenX, int screenY, int pointer, int button) { return false; } @Override public boolean touchDragged(int screenX, int screenY, int pointer) { return false; } @Override public boolean mouseMoved(int screenX, int screenY) { return false; } @Override public boolean scrolled(int amount) { return false; } } i am rendering all graphics in a GameRender class and a gameworld class if you need more info please let me know I am trying to make the array work but keep finding that when an array is initialized then the bullet fips back to the original and ends up being backwards???? and if I create an array I keep getting Exceptions throw??? Thank you for any help given.

    Read the article

  • AME : How to Diagnose Issues With the Default Approver List in Purchasing When Using Approvals Management

    - by Oracle_EBS
    Do you need help in understanding the concepts or how to set up the Approval Management Engine (AME) for requisition approvals? See the new diagnostic Note 1437183.1, 'AME : How to Diagnose Issues With the Default Approver List in Purchasing When Using Approvals Management'. AME is designed to generate the approval list according to the conditions and rules you define in the setup. This troubleshooting guide will help you understand how AME builds the default approval list for Purchasing and help users find solutions for scenarios where the approval list fails to be generated. Follow along with the logical steps for troubleshooting. The note first reviews how to generate the AME Setup report. For example, in the note we see a fragment of the setup report. Notice it has different sections for each of the setup categories, including attributes, conditions, rules, action types, approval groups, etc. How the default approval list is built in AME is then reviewed, followed by the logical steps for diagnosing issues. The diagnostic steps include how to run the Test Workbench, as well as how to obtain valuable debug and exception information. Then follow along using the steps to build a simple test case to sharpen your understanding.

    Read the article

  • wireless blocked after installing ubuntu 12.04

    - by Cornelia Frank
    I am using a lenovo S10-3 ideapad; had no problems with earlier version of ubuntu, only since installing 12.04. Have looked through many of the questions on the same issue and tried potential solutions but cannot seem to solve my problem. The hardware switch is in 'on' position and the wireless light comes on very briefly (2-3 sec) when the laptop starts up but then goes off and stays off. Pressing FN+F5 does nothing at all. I'd be grateful for any assistance. Cornelia Have received the following responses in Terminal: cf@cf-Lenovo:~$ rfkill list all 0: ideapad_wlan: Wireless LAN Soft blocked: no Hard blocked: no 1: ideapad_bluetooth: Bluetooth Soft blocked: no Hard blocked: no 2: phy0: Wireless LAN Soft blocked: no Hard blocked: yes cf@cf-Lenovo:~$ iwconfig lo no wireless extensions. wlan0 IEEE 802.11bgn ESSID:off/any Mode:Managed Access Point: Not-Associated Tx-Power=off Retry long limit:7 RTS thr:off Fragment thr:off Power Management:off eth0 no wireless extensions. cf@cf-Lenovo:~$ lshw -C network WARNING: you should run this program as super-user. *-network description: Ethernet interface product: RTL8101E/RTL8102E PCI Express Fast Ethernet controller vendor: Realtek Semiconductor Co., Ltd. physical id: 0 bus info: pci@0000:05:00.0 logical name: eth0 version: 02 serial: 00:26:9e:ee:7f:4c size: 100Mbit/s capacity: 100Mbit/s width: 64 bits clock: 33MHz capabilities: bus_master cap_list rom ethernet physical tp mii 10bt 10bt-fd 100bt 100bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=r8169 driverversion=2.3LK-NAPI duplex=full firmware=N/A ip=10.0.1.8 latency=0 multicast=yes port=MII speed=100Mbit/s resources: irq:43 ioport:2000(size=256) memory:f0520000-f0520fff memory:f0510000-f051ffff memory:f0540000-f055ffff *-network DISABLED description: Wireless interface product: AR9285 Wireless Network Adapter (PCI-Express) vendor: Atheros Communications Inc. physical id: 0 bus info: pci@0000:09:00.0 logical name: wlan0 version: 01 serial: c4:17:fe:f8:bc:d7 width: 64 bits clock: 33MHz capabilities: bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=ath9k driverversion=3.2.0-31-generic-pae firmware=N/A latency=0 multicast=yes wireless=IEEE 802.11bgn resources: irq:18 memory:f0100000-f010ffff WARNING: output may be incomplete or inaccurate, you should run this program as super-user.

    Read the article

  • Adding root bone in 3DS Max?

    - by carlturtle
    My animation artist has made me a nice first-person pair of arms, animated it, textured it, and given it to me. Then he went on vacation. I am programming my animations, and I am trying to test the model he has given me. Building my project gives me a warning: Multiple skeletons were found in the file. The first skeleton, named "frame l upperarm" has been moved to be a child of the scene root. The other, "frame r upperarm", will be ignored. Fragment identifier "frame r upperarm". Then an error: "Vertex is bound to bone "frame l forearm", but this bone is not present in the skeleton." I realize this means that there are two skeletons, as described in this question: Importing 3d model with multiple skeletons. I have 3DS Max, but I have no idea how to use it, and Google/CGTalk/Polycount turn up nothing relevant on how to add a root bone or combine skeletons. If anyone knows how, it would help me out greatly. Thanks.

    Read the article

  • Unity on Ubuntu 11.10 - The Dash Home button brings up the panel, but is empty

    - by David M. Coe
    The Dash home button brings up a panel that is greyed out but totally empty. It seems to be the very same issue as this: Dash home button brings up blank window, which is unanswered. /usr/lib/nux/unity_support_test -p returns OpenGL vendor string: X.Org R300 Project OpenGL renderer string: Gallium 0.4 on ATI RV370 OpenGL version string: 2.1 Mesa 7.11 Not software rendered: yes Not blacklisted: yes GLX fbconfig: yes GLX texture from pixmap: yes GL npot or rect textures: yes GL vertex program: yes GL fragment program: yes GL vertex buffer object: yes GL framebuffer object: yes GL version is 1.4+: yes Unity 3D supported: yes I've tried unity --reset but that doesn't seem to work. Unity seems to reset, but I get the following warning over and over: cs space validation failed unity What should I do next to try and fix this? Edit: Attempted fixes: I've reformatted; that did not work. I've done apt-get remove unity, then apt-get update, then apt-get install unity; that did not work. I've switched to Unity 2D and this seems to work. How can I get regular Unity working, or at least find the error?

    Read the article

  • Wireless keeps disabling or stays disconnected (Realtek RTL8191SEvB)

    - by jindrichm
    I have Realtek RTL8191SEvB wireless card on Ubuntu 10.10: $ lspci -v | grep Network 03:00.0 Network controller: Realtek Semiconductor Co., Ltd. RTL8191SEvB Wireless LAN Controller (rev 10) When I load its driver, according to the Network Manager it sometimes blinks with a list of available networks but it keeps disabling itself or it stays disconnected. So, I can't connect to any wi-fi network (which results in frustration). The driver is loaded: $ lsmod Module Size Used by r8192se_pci 509932 0 Looks normal: $ sudo lshw -C network *-network description: Wireless interface product: RTL8191SEvB Wireless LAN Controller vendor: Realtek Semiconductor Co., Ltd. physical id: 0 bus info: pci@0000:03:00.0 logical name: wlan0 version: 10 serial: 1c:65:9d:60:c7:7a width: 32 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=rtl819xSE driverversion=0019.1207.2010 firmware=63 latency=0 link=no multicast=yes wireless=802.11bgn resources: irq:17 ioport:2000(size=256) memory:f0500000-f0503fff Configured: $ sudo iwconfig wlan0 wlan0 802.11bgn Nickname:"rtl8191SEVA2" Mode:Managed Frequency=2.412 GHz Access Point: Not-Associated Bit Rate:130 Mb/s Retry:on RTS thr:off Fragment thr:off Encryption key:off Power Management:off Link Quality=10/100 Signal level=0 dBm Noise level=-100 dBm Rx invalid nwid:0 Rx invalid crypt:0 Rx invalid frag:0 Tx excessive retries:0 Invalid misc:0 Missed beacon:0 Is not blocked: $ rfkill list all 0: tpacpi_bluetooth_sw: Bluetooth Soft blocked: no Hard blocked: yes However something's happening with it: $ dmesg [ 6485.948668] InitializeAdapter8190(): ==++==> Turn off RF for RfOffReason(1073741824) ---------- [ 6486.062666] rtl8192_SetWirelessMode(), wireless_mode:10, bEnableHT = 1 [ 6486.062671] InitializeAdapter8192SE(): Set MRC settings on as default!! [ 6486.062675] HW_VAR_MRC: Turn on 1T1R MRC! [ 6486.064091] ADDRCONF(NETDEV_UP): wlan0: link is not ready [ 6486.248761] rtl8192_SetWirelessMode(), wireless_mode:10, bEnableHT = 1 [ 6486.248771] InitializeAdapter8192SE(): Set MRC settings on as default!! [ 6486.248776] HW_VAR_MRC: Turn on 1T1R MRC! [ 6486.580083] GPIOChangeRF - HW Radio OFF [ 6486.610085] ============>sync_scan_hurryup out [ 6486.623814] ================>r8192_wx_set_scan(): hwradio off [ 6486.830484] =========>r8192_wx_set_essid():hw radio off,or Rf state is eRfOff, return So, does anyone know where the problem might be?

    Read the article

  • Graphics hardware warning when updating to 14.04

    - by pacomet
    As I use Ubuntu at work I only update to LTS versions, but now I'm not sure if I can or should. My computer is now ten years old; I would replace it if it were mine, but as it is owned by my employer I have to work with it. It's not a bad one, and it runs fine (this was not true while it still had Windows on it ;-). When updating to 14.04, the upgrade warns about possible bad/slow performance with Unity 3D, so I stopped the update since I am at work and it is not my own computer. As I understand from http://askubuntu.com/a/438958/25305, the Nvidia GeForce FX 5500 graphics card is still supported in 14.04. Now, in 12.04, I have driver version 173 and Unity 2D runs fine for me. Output of /usr/lib/nux/unity_support_test -p OpenGL vendor string: NVIDIA Corporation OpenGL renderer string: GeForce FX 5500/AGP/SSE2 OpenGL version string: 2.1.2 NVIDIA 173.14.39 Not software rendered: yes Not blacklisted: no GLX fbconfig: yes GLX texture from pixmap: yes GL npot or rect textures: yes GL vertex program: yes GL fragment program: yes GL vertex buffer object: yes GL framebuffer object: yes GL version is 1.4+: yes Unity 3D supported: no Should I update? Is it better to stay with 12.04? Thanks

    Read the article

  • OpenGL depth texture wrong

    - by CoffeeandCode
    I have been writing a game engine for a while now and have decided to reconstruct my positions from depth... but how I read the depth seems to be wrong :/ What is wrong in my rendering? How I init my depth texture in the FBO gl::BindTexture(gl::TEXTURE_2D, this->textures[0]); // Depth gl::TexImage2D( gl::TEXTURE_2D, 0, gl::DEPTH32F_STENCIL8, width, height, 0, gl::DEPTH_STENCIL, gl::FLOAT_32_UNSIGNED_INT_24_8_REV, nullptr ); gl::TexParameterf(gl::TEXTURE_2D, gl::TEXTURE_MAG_FILTER, gl::NEAREST); gl::TexParameterf(gl::TEXTURE_2D, gl::TEXTURE_MIN_FILTER, gl::NEAREST); gl::TexParameterf(gl::TEXTURE_2D, gl::TEXTURE_WRAP_S, gl::CLAMP_TO_EDGE); gl::TexParameterf(gl::TEXTURE_2D, gl::TEXTURE_WRAP_T, gl::CLAMP_TO_EDGE); gl::FramebufferTexture2D( gl::FRAMEBUFFER, gl::DEPTH_STENCIL_ATTACHMENT, gl::TEXTURE_2D, this->textures[0], 0 ); Linear depth readings in my shader Vertex #version 150 layout(location = 0) in vec3 position; layout(location = 1) in vec2 uv; out vec2 uv_f; void main(){ uv_f = uv; gl_Position = vec4(position, 1.0); } Fragment (where the issue probably is) #version 150\n uniform sampler2D depth_texture; in vec2 uv_f; out vec4 Screen; void main(){ float n = 0.00001; float f = 100.0; float z = texture(depth_texture, uv_f).x; float linear_depth = (n * z)/(f - z * (f - n)); Screen = vec4(linear_depth); // It ISN'T because I don't separate alpha } When Rendered so gamedev.stackexchange, what's wrong with my rendering/glsl?
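
    For reference, a commonly used linearization (a sketch that reuses the question's names and assumes a standard perspective projection with depth stored in [0, 1]; the near/far uniforms are additions, not part of the original shader) puts n * f in the numerator, whereas the shader above multiplies by the sampled z instead, which squashes most of the range toward zero:

        #version 150
        uniform sampler2D depth_texture;
        uniform float u_near;   // assumed uniform, not in the original shader
        uniform float u_far;    // assumed uniform, not in the original shader
        in  vec2 uv_f;
        out vec4 Screen;

        void main()
        {
            float z = texture(depth_texture, uv_f).x;
            // n * f in the numerator is the key difference from the formula in the question.
            float eyeDepth = (u_near * u_far) / (u_far - z * (u_far - u_near));
            Screen = vec4(vec3(eyeDepth / u_far), 1.0);   // grey-scale linear depth for inspection
        }

    Note also that in a core-profile #version 150 shader the sampling call is texture() rather than texture2D().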

    Read the article

  • How to know my wireless card has injection enabled?

    - by shrimpy
    I am playing around with aircrack and was trying to see whether the wireless card in my laptop can pass the injection test, and I ended up seeing the following... Does it mean my wireless card is not able to run aircrack? root@myubuntu:/home/myubuntu# iwconfig lo no wireless extensions. eth0 no wireless extensions. eth1 IEEE 802.11bg ESSID:"" Nickname:"" Mode:Managed Frequency:2.437 GHz Access Point: Not-Associated Bit Rate:54 Mb/s Tx-Power:24 dBm Retry min limit:7 RTS thr:off Fragment thr:off Power Management:off Link Quality=5/5 Signal level=0 dBm Noise level=-57 dBm Rx invalid nwid:0 Rx invalid crypt:781 Rx invalid frag:0 Tx excessive retries:0 Invalid misc:0 Missed beacon:0 root@myubuntu:/home/myubuntu# aireplay-ng -9 eth1 ioctl(SIOCSIWMODE) failed: Invalid argument ARP linktype is set to 1 (Ethernet) - expected ARPHRD_IEEE80211, ARPHRD_IEEE80211_FULL or ARPHRD_IEEE80211_PRISM instead. Make sure RFMON is enabled: run 'airmon-ng start eth1 <#>' Sysfs injection support was not found either. root@myubuntu:/home/myubuntu#

    Read the article

  • Accessing localhost (xampp) from another computer over LAN network - how to?

    - by vr3690
    I have just set up a wi-fi network at home. I have all my files on my desktop computer (192.168.1.56) and want to access localhost over there from another computer (192.168.1.2). On my desktop I can access localhost through the normal http://localhost. Apache is running on port 80 as usual. Exactly what do I have to do to achieve this? There is documentation on the net, but it either doesn't work or is too fragmented and confusing to understand. I think I have to make changes to my Apache httpd.conf file and the hosts file. Any ideas as to what changes to make?

    Read the article

  • Lenovo v570 ubuntu 12.04 wireless hard blocked even when ext. switch is on

    - by user100987
    When I run iwconfig it say's lo and eth0 have no wireless extensions, but wlan0 it says IEEE 802.11bgn ESSID:off/any Mode:Managed Access Point: Not-Associated Tx-Power=off Retry long limit:7 RTS the:off Fragment the:off Power Management:off and I believe that's my problem, I just don't know how to turn it back on Any help? When I ran lspci | grep Network it gave me this 02:00.0 Network controller: Intell Corporation Centrino Wireless -N + WiMAX 6150 (rev 67) How I know that my wireless is hard blocked because when I run sudo rfkill list all I get 0: Ideapad_wlan: Wireless LAN Soft blocked: no Hard blocked: no 1: phy0: Wireless LAN Soft blocked: no Hard blocked: yes when I run lshw -c network I get. *-network DISABLED description: Wireless interface product: Centrino Wireless-N + WiMAX 6150 vendor: Intel Corporation physical id: 0 bus info: pci@0000:02:00.0 logical name: mon1 version: 67 serial: 40:25:c2:d2:96:2c width: 64 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list logical wireless ethernet physical configuration: broadcast=yes driver=iwlwifi driverversion=3.2.0-32-generic-pae firmware=41.28.5.1 build 33926 latency=0 link=no multicast=yes wireless=IEEE 802.11bgn resources: irq:43 memory:d0500000-d0501fff *-network description: Ethernet interface product: RTL8111/8168B PCI Express Gigabit Ethernet controller vendor: Realtek Semiconductor Co., Ltd. physical id: 0 bus info: pci@0000:03:00.0 logical name: eth0 version: 06 serial: f0:de:f1:d7:a0:4d size: 100Mbit/s capacity: 1Gbit/s width: 64 bits clock: 33MHz capabilities: pm msi pciexpress msix vpd bus_master cap_list ethernet physical tp mii 10bt 10bt-fd 100bt 100bt-fd 1000bt 1000bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=r8169 driverversion=2.3LK-NAPI duplex=full firmware=rtl_nic/rtl8168e-2.fw ip=192.168.0.65 latency=0 link=yes multicast=yes port=MII speed=100Mbit/s resources: irq:41 ioport:2000(size=256) memory:d0404000-d0404fff memory:d0400000-d0403fff

    Read the article

  • Shader compile log depending on hardware

    - by dreta
    I'm done with the core of my graphics engine and I'm testing it on every platform I can get my hands on. Now, what I noticed is that different drivers return different shader and program compile log content. For example, on my friend's laptop, if you successfully compile a shader then the log is simply empty. However, on my PC I get some useful information along with it. So if I compile a vertex shader, I'll get: Vertex shader was successfully compiled to run on hardware. Which isn't that impressive, but here is what happens when I link a program. On my friend's computer the log is empty, since the program links. However, on my own computer I get: Vertex shader(s) linked, fragment shader(s) linked. Which is awesome, because I'm attaching a geometry shader handle of 0 (I have a geometry shader file with trash in it, so it doesn't compile and the handle is set to 0), and the compiler just tells me which shaders linked. Now it got me thinking: if I were going to buy a graphics card, is there a way for me to find out in advance whether or not I'll get this "extended" compile information? Maybe it's vendor specific? Now, I don't expect an answer, TBH; this seems a bit obscure, but maybe somebody has experience with this and could post it.

    Read the article

  • How to open a server port outside of an OpenVPN tunnel with a pf firewall on OSX (BSD)

    - by Timbo
    I have a Mac mini that I use as a media server running XBMC and serves media from my NAS to my stereo and TV (which has been color calibrated with a Spyder3Express, happy). The Mac runs OSX 10.8.2 and the internet connection is tunneled for general privacy over OpenVPN through Tunnelblick. I believe my anonymous VPN provider pushes "redirect_gateway" to OpenVPN/Tunnelblick because when on it effectively tunnels all non-LAN traffic in- and outbound. As an unwanted side effect that also opens the boxes server ports unprotected to the outside world and bypasses my firewall-router (Netgear SRX5308). I have run nmap from outside the LAN on the VPN IP and the server ports on the mini are clearly visible and connectable. The mini has the following ports open: ssh/22, ARD/5900 and 8080+9090 for the XBMC iOS client Constellation. I also have Synology NAS which apart from LAN file serving over AFP and WebDAV only serves up an OpenVPN/1194 and a PPTP/1732 server. When outside of the LAN I connect to this from my laptop over OpenVPN and over PPTP from my iPhone. I only want to connect through AFP/548 from the mini to the NAS. The border firewall (SRX5308) just works excellently, stable and with a very high throughput when streaming from various VOD services. My connection is a 100/10 with a close to theoretical max throughput. The ruleset is as follows Inbound: PPTP/1723 Allow always to 10.0.0.40 (NAS/VPN server) from a restricted IP range >corresponding to possible cell provider range OpenVPN/1194 Allow always to 10.0.0.40 (NAS/VPN server) from any Outbound: Default outbound policy: Allow Always OpenVPN/1194 TCP Allow always from 10.0.0.40 (NAS) to a.b.8.1-a.b.8.254 (VPN provider) OpenVPN/1194 UDP Allow always to 10.0.0.40 (NAS) to a.b.8.1-a.b.8.254 (VPN provider) Block always from NAS to any On the Mini I have disabled the OSX Application Level Firewall because it throws popups which don't remember my choices from one time to another and that's annoying on a media server. Instead I run Little Snitch which controls outgoing connections nicely on an application level. I have configured the excellent OSX builtin firewall pf (from BSD) as follows pf.conf (Apple App firewall tie-ins removed) (# replaced with % to avoid formatting errors) ### macro name for external interface. eth_if = "en0" vpn_if = "tap0" ### wifi_if = "en1" ### %usb_if = "en3" ext_if = $eth_if LAN="{10.0.0.0/24}" ### General housekeeping rules ### ### Drop all blocked packets silently set block-policy drop ### all incoming traffic on external interface is normalized and fragmented ### packets are reassembled. scrub in on $ext_if all fragment reassemble scrub in on $vpn_if all fragment reassemble scrub out all ### exercise antispoofing on the external interface, but add the local ### loopback interface as an exception, to prevent services utilizing the ### local loop from being blocked accidentally. ### set skip on lo0 antispoof for $ext_if inet antispoof for $vpn_if inet ### spoofing protection for all interfaces block in quick from urpf-failed ############################# block all ### Access to the mini server over ssh/22 and remote desktop/5900 from LAN/en0 only pass in on $eth_if proto tcp from $LAN to any port {22, 5900, 8080, 9090} ### Allow all udp and icmp also, necessary for Constellation. Could be tightened. 
pass on $eth_if proto {udp, icmp} from $LAN to any ### Allow AFP to 10.0.0.40 (NAS) pass out on $eth_if proto tcp from any to 10.0.0.40 port 548 ### Allow OpenVPN tunnel setup over unprotected link (en0) only to VPN provider IPs ### and port ranges pass on $eth_if proto tcp from any to a.b.8.0/24 port 1194:1201 ### OpenVPN Tunnel rules. All traffic allowed out, only in to ports 4100-4110 ### Outgoing pings ok pass in on $vpn_if proto {tcp, udp} from any to any port 4100:4110 pass out on $vpn_if proto {tcp, udp, icmp} from any to any So what are my goals and what does the above setup achieve? (until you tell me otherwise :) 1) Full LAN access to the above ports on the mini/media server (including through my own VPN server) 2) All internet traffic from the mini/media server is anonymized and tunneled over VPN 3) If OpenVPN/Tunnelblick on the mini drops the connection, nothing is leaked both because of pf and the router outgoing ruleset. It can't even do a DNS lookup through the router. So what do I have to hide with all this? Nothing much really, I just got carried away trying to stop port scans through the VPN tunnel :) In any case this setup works perfectly and it is very stable. The Problem at last! I want to run a minecraft server and I installed that on a separate user account on the mini server (user=mc) to keep things partitioned. I don't want this server accessible through the anonymized VPN tunnel because there are lots more port scans and hacking attempts through that than over my regular IP and I don't trust java in general. So I added the following pf rule on the mini: ### Allow Minecraft public through user mc pass in on $eth_if proto {tcp,udp} from any to any port 24983 user mc pass out on $eth_if proto {tcp, udp} from any to any user mc And these additions on the border firewall: Inbound: Allow always TCP/UDP from any to 10.0.0.40 (NAS) Outbound: Allow always TCP port 80 from 10.0.0.40 to any (needed for online account checkups) This works fine but only when the OpenVPN/Tunnelblick tunnel is down. When up no connection is possbile to the minecraft server from outside of LAN. inside LAN is always OK. Everything else functions as intended. I believe the redirect_gateway push is close to the root of the problem, but I want to keep that specific VPN provider because of the fantastic throughput, price and service. The Solution? How can I open up the minecraft server port outside of the tunnel so it's only available over en0 not the VPN tunnel? Should I a static route? But I don't know which IPs will be connecting...stumbles How secure would to estimate this setup to be and do you have other improvements to share? I've searched extensively in the last few days to no avail...If you've read this far I bet you know the answer :)

    Read the article

  • Ubuntu Wireless not working on Lenovo t400

    - by VmaxBoss
    This problem started after upgrading to 12.04, an my system is 'up2date' Have tried most of the solution-proposals found on the net. lspci -nnk | grep -iA2 net 00:19.0 Ethernet controller [0200]: Intel Corporation 82567LF Gigabit Network Connection [8086:10bf] (rev 03) Subsystem: Lenovo Device [17aa:20ee] Kernel driver in use: e1000e 03:00.0 Network controller [0280]: Intel Corporation PRO/Wireless 5100 AGN [Shiloh] Network Connection [8086:4237] Subsystem: Intel Corporation WiFi Link 5100 AGN [8086:1211] Kernel driver in use: iwlagn iwconfig lo no wireless extensions. eth0 no wireless extensions. wlan0 IEEE 802.11abgn ESSID:off/any Mode:Managed Access Point: Not-Associated Tx-Power=15 dBm Retry long limit:7 RTS thr:off Fragment thr:off Encryption key:off Power Management:off sudo lshw -C network *-network description: Ethernet interface product: 82567LF Gigabit Network Connection vendor: Intel Corporation physical id: 19 bus info: pci@0000:00:19.0 logical name: eth0 version: 03 serial: 00:22:68:1a:c4:75 size: 100Mbit/s capacity: 1Gbit/s width: 32 bits clock: 33MHz capabilities: pm msi bus_master cap_list ethernet physical tp 10bt 10bt-fd 100bt 100bt-fd 1000bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=e1000e driverversion=1.0.2-k2 duplex=full firmware=1.8-3 ip=192.168.2.154 latency=0 link=yes multicast=yes port=twisted pair speed=100Mbit/s resources: irq:29 memory:fc000000-fc01ffff memory:fc024000-fc024fff ioport:1820(size=32) *-network DISABLED description: Wireless interface product: PRO/Wireless 5100 AGN [Shiloh] Network Connection vendor: Intel Corporation physical id: 0 bus info: pci@0000:03:00.0 logical name: wlan0 version: 00 serial: 00:26:c6:6c:2d:24 width: 64 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=iwlagn latency=0 multicast=yes wireless=IEEE 802.11abgn resources: irq:30 memory:f4300000-f4301fff Please help Br/VB

    Read the article

  • GLSL custom interpolation filter

    - by Cyan
    I'm currently building a fragment shader which uses several textures to render the final pixel color. The textures are not really textures; they are in fact "input data" to be used in the formula that generates the final color. The problem I've got is that the textures are getting bilinearly filtered, and therefore the input data is as well. This results in many unwanted side effects, especially when the final rendered texture is "zoomed" compared to the original resolution. Removing the side effects is a complex task and only results in "average" rendering. I was thinking: well, all my problems seem to come from the "default" bilinear filtering on this input data. I can't move to GL_NEAREST either, since it would create "blocky" rendering. So I guess the better way to proceed is to be fully in charge of the interpolation. For this to work, I would need the input data at its "natural" resolution (so that means 4 samples), and a relative position between the sampled points. Is that possible, and if yes, how? [EDIT] Since I started this question, I found this internet entry, which seems to (mostly) answer my needs. http://www.gamerendering.com/2008/10/05/bilinear-interpolation/ One aspect of the solution worries me though: the dimensions of the texture must be provided as an argument. It seems there is no way to "find this information transparently". Adding an argument to the rendering pipeline is unwelcome though, since it's not under my responsibility, and it translates into added complexity for others.
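
    A minimal sketch of that idea (GLSL 1.30 or later, assuming the texture's filter is set to GL_NEAREST so the hardware no longer interpolates; u_data and v_uv are illustrative names): textureSize() supplies the dimensions without threading an extra uniform through the pipeline, and the fractional texel position is exactly the "relative position between the sampled points" asked about:

        #version 130
        uniform sampler2D u_data;
        in  vec2 v_uv;
        out vec4 fragColor;

        void main()
        {
            vec2 size  = vec2(textureSize(u_data, 0));   // texture dimensions, no extra uniform needed
            vec2 texel = v_uv * size - 0.5;              // position in texel space
            vec2 base  = floor(texel);
            vec2 f     = texel - base;                   // relative position between the 4 samples
            vec2 uv00  = (base + 0.5) / size;            // center of the lower-left texel

            vec4 c00 = texture(u_data, uv00);
            vec4 c10 = texture(u_data, uv00 + vec2(1.0, 0.0) / size);
            vec4 c01 = texture(u_data, uv00 + vec2(0.0, 1.0) / size);
            vec4 c11 = texture(u_data, uv00 + vec2(1.0, 1.0) / size);

            // mix() reproduces plain bilinear filtering; replacing it with any custom
            // weighting of f gives full control over the interpolation.
            fragColor = mix(mix(c00, c10, f.x), mix(c01, c11, f.x), f.y);
        }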

    Read the article

  • How to set up my texture coordinates correctly in GLSL 150 and OpenGL 3.3?

    - by RubyKing
    I'm trying to do texture mapping in GLSL 150 and OpenGL 3.3. Here are my shaders; I've tried my best to get this as correct as possible, and hopefully it is :) I'm guessing you want to know what the problem is: my texture shows, but not in its fullest form -- just one section of it, not the full texture on the quad. All I can think of is that it's the texture coordinates in the main.cpp, which is at the bottom of this post. FRAGMENT SHADER #version 150 in vec2 Texcoord_VSPS; out vec4 color; // Values that stay constant for the whole mesh. uniform sampler2D myTextureSampler; //Main Entry Point void main() { // Output color = color of the texture at the specified UV color = texture2D( myTextureSampler, Texcoord_VSPS ); } VERTEX SHADER #version 150 //Position Container in vec3 position; //Container for TexCoords attribute vec2 Texcoord0; out vec2 Texcoord_VSPS; //out vec2 ex_texcoord; //TO USE A DIFFERENT COORDINATE SYSTEM JUST MULTIPLY THE MATRIX YOU WANT //Main Entry Point void main() { //Translations and w Cordinates stuff gl_Position = vec4(position.xyz, 1.0); Texcoord_VSPS = Texcoord0; } LINK TO MAIN.CPP http://pastebin.com/t7Vg9L0k
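
    One thing worth checking (offered as a hedged sketch rather than a confirmed fix): in a core-profile GLSL 150 shader the attribute keyword is no longer valid, and Texcoord0 above has no explicit location, so the texture coordinates may not end up bound to the slot the C++ side fills. Declaring both inputs as in with explicit locations keeps the vertex shader unambiguous:

        #version 150
        // Explicit locations need GL_ARB_explicit_attrib_location on plain GLSL 150;
        // alternatively, drop the layout qualifiers and call glBindAttribLocation before linking.
        #extension GL_ARB_explicit_attrib_location : require

        layout(location = 0) in vec3 position;
        layout(location = 1) in vec2 texcoord;

        out vec2 Texcoord_VSPS;

        void main()
        {
            Texcoord_VSPS = texcoord;
            gl_Position   = vec4(position, 1.0);
        }

    In the fragment shader, texture() rather than texture2D() is the core-profile call for the same reason.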

    Read the article

  • Wireless problems with Broadcom BCM4313 with windows7 (duplicate)

    - by user292394
    I have wireless connection problem can't connect my wireless adress lspci -vvnn | grep 14e4 03:00.0 Network controller [0280]: Broadcom Corporation BCM4313 802.11bgn Wireless Network Adapter [14e4:4727] (rev 01) iwconfig eth0 no wireless extensions. lo no wireless extensions. wlan0 IEEE 802.11abg ESSID:off/any Mode:Managed Access Point: Not-Associated Retry long limit:7 RTS thr:off Fragment thr:off Power Management:off rfkill list all 0: hci0: Bluetooth Soft blocked: no Hard blocked: no 1: phy0: Wireless LAN Soft blocked: no Hard blocked: no 2: brcmwl-0: Wireless LAN Soft blocked: no Hard blocked: no lsmod Module Size Used by bnep 19624 2 rfcomm 69160 8 ip6t_REJECT 12939 1 xt_hl 12521 6 ip6t_rt 13537 3 nf_conntrack_ipv6 18894 8 nf_defrag_ipv6 34768 1 nf_conntrack_ipv6 ipt_REJECT 12541 1 xt_LOG 17717 10 xt_limit 12711 13 xt_tcpudp 12884 18 xt_addrtype 12635 4 nf_conntrack_ipv4 15012 8 nf_defrag_ipv4 12758 1 nf_conntrack_ipv4 xt_conntrack 12760 16 ip6table_filter 12815 1 ip6_tables 27025 1 ip6table_filter nf_conntrack_netbios_ns 12665 0 nf_conntrack_broadcast 12589 1 nf_conntrack_netbios_ns nf_nat_ftp 12770 0 nf_nat 21798 1 nf_nat_ftp nf_conntrack_ftp 18638 1 nf_nat_ftp nf_conntrack 96976 8 nf_nat_ftp,nf_conntrack_netbios_ns,nf_nat,xt_conntrack,nf_conntrack_broadcast,nf_conntrack_ftp,nf_conntrack_ipv4,nf_conntrack_ipv6 iptable_filter 12810 1 ip_tables 27239 1 iptable_filter vx_tables 34059 13 ip6table_filter,xt_hl,ip_tables,xt_tcpudp,xt_limit,xt_conntrack,xt_LOG,iptable_filter,ip6t_rt,ipt_REJECT,ip6_tables,xt_addrtype,ip6t_REJECT uvcvideo 80885 0 videobuf2_vmalloc 13216 1 uvcvideo videobuf2_memops 13362 1 videobuf2_vmalloc snd_hda_codec_hdmi 46207 1 videobuf2_core 40664 1 uvcvideo videodev 134688 2 uvcvideo,videobuf2_core snd_hda_codec_conexant 57441 1 snd_hda_intel 52355 8 snd_hda_codec 192906 3 snd_hda_codec_hdmi,snd_hda_codec_conexant,snd_hda_intel snd_hwdep 13602 1 snd_hda_codec snd_pcm 102099 4 snd_hda_codec_hdmi,snd_hda_codec,snd_hda_intel snd_page_alloc 18710 2 snd_pcm,snd_hda_intel snd_seq_midi 13324 0 dm_multipath 22873 0 snd_seq_midi_event 14899 1 snd_seq_midi scsi_dh 14882 1 dm_multipath lib80211_crypt_tkip 17619 0 snd_rawmidi 30144 1 snd_seq_midi intel_powerclamp 14705 0 coretemp 13435 0 kvm_intel 143060 0 kvm 451511 1 kvm_intel joydev 17381 0 serio_raw 13462 0 snd_seq 61560 2 snd_seq_midi_event,snd_seq_midi intel_ips 18664 0 wl 4207846 0 snd_seq_device 14497 3 snd_seq,snd_rawmidi,snd_seq_midi btusb 32412 0 bluetooth 395423 22 bnep,btusb,rfcomm snd_timer 29482 2 snd_pcm,snd_seq snd 69238 26 snd_hwdep,snd_timer,snd_hda_codec_hdmi,snd_hda_codec_conexant,snd_pcm,snd_seq,snd_rawmidi,snd_hda_codec,snd_hda_intel,snd_seq_device,snd_seq_midi lib80211 14381 2 wl,lib80211_crypt_tkip toshiba_bluetooth 12852 0 cfg80211 484040 1 wl lpc_ich 21080 0 fglrx 8085343 190 soundcore 12680 1 snd toshiba_acpi 22901 0 sparse_keymap 13948 1 toshiba_acpi wmi 19177 1 toshiba_acpi amd_iommu_v2 19054 1 fglrx video 19476 0 mei_me 18627 0 mei 82276 1 mei_me mac_hid 13205 0 parport_pc 32701 0 ppdev 17671 0 lp 17759 0 parport 42348 3 lp,ppdev,parport_pc hid_generic 12548 0 usbhid 52570 0 hid 106148 2 hid_generic,usbhid psmouse 102222 0 ahci 25819 3 libahci 32168 1 ahci atl1c 46086 0 dm_mirror 22135 0 dm_region_hash 20862 1 dm_mirror dm_log 18411 2 dm_region_hash,dm_mirror Can someone give any ideas on how I can fix my wireless

    Read the article

  • File filtering_Python 3.2 [closed]

    - by user71261
    I'm trying to write a short file-filtering script in Python that will find my desired string. I've got it worked out logically, but my command feed is sending me an error message for the print statement. This is how it works as of now: filename = input('give file name: ') n = input('give desired string: ') f = open line = f.readline() while line: if n in line: print line line = f.readline() Error Statement: Traceback (most recent call last): File "<string>", line 7, in <fragment> Syntax Error: print line: <string>, line 718 I know this is a simple problem, but the answer is not obvious to me. Please help.

    Read the article

  • Issue allowing custom Xml Serialization/Deserialization on certain types of field

    - by sw1sh
    I've been working with Xml Serialization/Deserialization in .net and wanted a method where the serialization/deserialization process would only be applied to certain parts of an Xml fragment. This is so I can keep certain parts of the fragment in Xml after the deserialization process. To do this I thought it would be best to create a new class (XmlLiteral) that implemented IXmlSerializable and then wrote specific code for handling the IXmlSerializable.ReadXml and IXmlSerializable.WriteXml methods. In my example below this works for Serializing, however during the Deserialization process it fails to run for multiple uses of my XmlLiteral class. In my example below sTest1 gets populated correctly, but sTest2 and sTest3 are empty. I'm guessing I must be going wrong with the following lines but can't figure out why.. Any ideas at all? Private Sub ReadXml(ByVal reader As System.Xml.XmlReader) Implements IXmlSerializable.ReadXml Dim StringType As String = "" If reader.IsEmptyElement OrElse reader.Read() = False Then Exit Sub End If _src = reader.ReadOuterXml() End Sub Full listing: Imports System Imports System.Xml.Serialization Imports System.Xml Imports System.IO Imports System.Text Public Class XmlLiteralExample Inherits System.Web.UI.Page Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load Dim MyObjectInstance As New MyObject MyObjectInstance.aProperty = "MyValue" MyObjectInstance.XmlLiteral1 = New XmlLiteral("<test1>Some Value</test1>") MyObjectInstance.XmlLiteral2 = New XmlLiteral("<test2>Some Value</test2>") MyObjectInstance.XmlLiteral3 = New XmlLiteral("<test3>Some Value</test3>") ' quickly serialize the object to Xml Dim sw As New StringWriter(New StringBuilder()) Dim s As New XmlSerializer(MyObjectInstance.[GetType]()), xmlnsEmpty As New XmlSerializerNamespaces xmlnsEmpty.Add("", "") s.Serialize(sw, MyObjectInstance, xmlnsEmpty) Dim XElement As XElement = XElement.Parse(sw.ToString()) ' XElement reads as the following, so serialization works OK '<MyObject> ' <aProperty>MyValue</aProperty> ' <XmlLiteral1> ' <test1>Some Value</test1> ' </XmlLiteral1> ' <XmlLiteral2> ' <test2>Some Value</test2> ' </XmlLiteral2> ' <XmlLiteral3> ' <test3>Some Value</test3> ' </XmlLiteral3> '</MyObject> ' quickly deserialize the object back to an instance of MyObjectInstance2 Dim MyObjectInstance2 As New MyObject Dim xmlReader As XmlReader, x As XmlSerializer xmlReader = XElement.CreateReader x = New XmlSerializer(MyObjectInstance2.GetType()) MyObjectInstance2 = x.Deserialize(xmlReader) Dim sProperty As String = MyObjectInstance2.aProperty ' equal to "MyValue" Dim sTest1 As String = MyObjectInstance2.XmlLiteral1.Text ' contains <test1>Some Value</test1> Dim sTest2 As String = MyObjectInstance2.XmlLiteral2.Text ' is empty Dim sTest3 As String = MyObjectInstance2.XmlLiteral3.Text ' is empty ' sTest3 and sTest3 should be populated but are not? 
xmlReader = Nothing End Sub Public Class MyObject Private _aProperty As String Private _XmlLiteral1 As XmlLiteral Private _XmlLiteral2 As XmlLiteral Private _XmlLiteral3 As XmlLiteral Public Property aProperty As String Get Return _aProperty End Get Set(ByVal value As String) _aProperty = value End Set End Property Public Property XmlLiteral1 As XmlLiteral Get Return _XmlLiteral1 End Get Set(ByVal value As XmlLiteral) _XmlLiteral1 = value End Set End Property Public Property XmlLiteral2 As XmlLiteral Get Return _XmlLiteral2 End Get Set(ByVal value As XmlLiteral) _XmlLiteral2 = value End Set End Property Public Property XmlLiteral3 As XmlLiteral Get Return _XmlLiteral3 End Get Set(ByVal value As XmlLiteral) _XmlLiteral3 = value End Set End Property Public Sub New() _XmlLiteral1 = New XmlLiteral _XmlLiteral2 = New XmlLiteral _XmlLiteral3 = New XmlLiteral End Sub End Class <System.Xml.Serialization.XmlRootAttribute(Namespace:="", IsNullable:=False)> _ Public Class XmlLiteral Implements IXmlSerializable Private _src As String Public Property Text() As String Get Return _src End Get Set(ByVal value As String) _src = value End Set End Property Public Sub New() _src = "" End Sub Public Sub New(ByVal Text As String) _src = Text End Sub #Region "IXmlSerializable Members" Private Function GetSchema() As System.Xml.Schema.XmlSchema Implements IXmlSerializable.GetSchema Return Nothing End Function Private Sub ReadXml(ByVal reader As System.Xml.XmlReader) Implements IXmlSerializable.ReadXml Dim StringType As String = "" If reader.IsEmptyElement OrElse reader.Read() = False Then Exit Sub End If _src = reader.ReadOuterXml() End Sub Private Sub WriteXml(ByVal writer As System.Xml.XmlWriter) Implements IXmlSerializable.WriteXml writer.WriteRaw(_src) End Sub #End Region End Class End Class

    Read the article

  • Problem when trying to use simple Shaders + VBOs

    - by Mr.Gando
    Hello I'm trying to convert the following functions to a VBO based function for learning purposes, it displays a static texture on screen. I'm using OpenGL ES 2.0 with shaders on the iPhone (should be almost the same than regular OpenGL in this case), this is what I got working: //Works! - (void) drawAtPoint:(CGPoint)point depth:(CGFloat)depth { GLfloat coordinates[] = { 0, 1, 1, 1, 0, 0, 1, 0 }; GLfloat width = (GLfloat)_width * _maxS, height = (GLfloat)_height * _maxT; GLfloat vertices[] = { -width / 2 + point.x, -height / 2 + point.y, width / 2 + point.x, -height / 2 + point.y, -width / 2 + point.x, height / 2 + point.y, width / 2 + point.x, height / 2 + point.y, }; glBindTexture(GL_TEXTURE_2D, _name); //Attrib position and attrib_tex coord are handles for the shader attributes glVertexAttribPointer(ATTRIB_POSITION, 2, GL_FLOAT, GL_FALSE, 0, vertices); glEnableVertexAttribArray(ATTRIB_POSITION); glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, GL_FALSE, 0, coordinates); glEnableVertexAttribArray(ATTRIB_TEXCOORD); glDrawArrays(GL_TRIANGLE_STRIP, 0, 4); } I tried to do this to convert to a VBO however I don't see anything displaying on-screen with this version: //Doesn't display anything - (void) drawAtPoint:(CGPoint)point depth:(CGFloat)depth { GLfloat width = (GLfloat)_width * _maxS, height = (GLfloat)_height * _maxT; GLfloat position[] = { -width / 2 + point.x, -height / 2 + point.y, width / 2 + point.x, -height / 2 + point.y, -width / 2 + point.x, height / 2 + point.y, width / 2 + point.x, height / 2 + point.y, }; //Texture on-screen position ( each vertex is x,y in on-screen coords ) GLfloat coordinates[] = { 0, 1, 1, 1, 0, 0, 1, 0 }; // Texture coords from 0 to 1 glBindVertexArrayOES(vao); glGenVertexArraysOES(1, &vao); glGenBuffers(2, vbo); //Buffer 1 glBindBuffer(GL_ARRAY_BUFFER, vbo[0]); glBufferData(GL_ARRAY_BUFFER, 8 * sizeof(GLfloat), position, GL_STATIC_DRAW); glEnableVertexAttribArray(ATTRIB_POSITION); glVertexAttribPointer(ATTRIB_POSITION, 2, GL_FLOAT, GL_FALSE, 0, position); //Buffer 2 glBindBuffer(GL_ARRAY_BUFFER, vbo[1]); glBufferData(GL_ARRAY_BUFFER, 8 * sizeof(GLfloat), coordinates, GL_DYNAMIC_DRAW); glEnableVertexAttribArray(ATTRIB_TEXCOORD); glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, GL_FALSE, 0, coordinates); //Draw glBindVertexArrayOES(vao); glBindTexture(GL_TEXTURE_2D, _name); glDrawArrays(GL_TRIANGLE_STRIP, 0, 4); } In both cases I'm using this simple Vertex Shader //Vertex Shader attribute vec2 position;//Bound to ATTRIB_POSITION attribute vec4 color; attribute vec2 texcoord;//Bound to ATTRIB_TEXCOORD varying vec2 texcoordVarying; uniform mat4 mvp; void main() { //You CAN'T use transpose before in glUniformMatrix4fv so... here it goes. gl_Position = mvp * vec4(position.x, position.y, 0.0, 1.0); texcoordVarying = texcoord; } The gl_Position is equal to product of mvp * vec4 because I'm simulating glOrthof in 2D with that mvp And this Fragment Shader //Fragment Shader uniform sampler2D sampler; varying mediump vec2 texcoordVarying; void main() { gl_FragColor = texture2D(sampler, texcoordVarying); } I really need help with this, maybe my shaders are wrong for the second case ? thanks in advance.

    Read the article

< Previous Page | 29 30 31 32 33 34 35 36 37 38 39 40  | Next Page >