Search Results

Search found 7346 results on 294 pages for 'touch flo 3d'.


  • Perf: Viewing thousands of images in Silverlight 3 on a 3D Wall

    - by Bob Holland
    I currently work on a very cool Silverlight app that displays photos on a 3D wall, like the Wall3D demo that ships with Blend 3. The problem I am currently facing is performance. The app works like this: as you scroll right or left, the 3D photo wall rotates, and with each movement the next column of photos is downloaded, decoded into a BitmapImage, and attached to a 3D Wall Node. As you can imagine, users (if you let them) will want to flip through the photos really quickly, but I cannot display the photos quickly enough. In most cases it's a beautiful app that works really well, but when an album contains over 300 photos you can imagine the memory taken up by all the BitmapImage instances, and moving the slider can jump from photo 20 to photo 120 in a second. Of course we have logic in place to avoid downloading every photo in between, but I still can't work out a fast way to get the photos displayed. It may be that we need to throw away the 'great for show' 3D wall and go to a flat, DeepZoom-like wall, similar to the Playboy archive one that Vertigo did. Still not sure; let me know your thoughts. P.S. We are using Kit3D for all the 3D work, with PerspectiveCamera, Model3DGroup, ModelVisual3D, RotateTransform3D and TranslateTransform3D. Cheers, Bob.

    Read the article

  • Passing touch events on to subviews

    - by Egil Jansson
    I have a view within a UIScrollView that loads an additional subview when the user presses a certain area. When this additional subview is visible, I want all touch events to be handled by it - and not by the scroll view. It seems like the first couple of events are handled by the subview, but then touchesCancelled is called and the scroll view takes over the touch detection. How can I make sure that the subview gets all the events as long as the movement is being performed on this view? This is my implementation of touchesMoved, which I thought would do the job:

        -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[touches allObjects] objectAtIndex:0];
            CGPoint touchPt = [touch locationInView:self];
            UIView *hitView = [self hitTest:touchPt withEvent:event];
            UIView *mySubView = subviewCtrl.view;
            if (hitView == mySubView) {
                [subviewCtrl.view touchesMoved:touches withEvent:event];
            } else {
                NSLog(@"Outside of view...");
            }
        }

    Read the article

  • Generate 2D cross-section polygon from 3D mesh

    - by nornagon
    I'm writing a game which uses 3D models to draw a scene (top-down orthographic projection), but a 2D physics engine to calculate response to collisions, etc. I have a few 3D assets for which I'd like to be able to automatically generate a hitbox by 'slicing' the 3D mesh with the X-Y plane and creating a polygon from the resultant edges. Google is failing me on this one (and not much helpful material on SO either). Suggestions?
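    One way to attack this without a library, sketched here rather than prescribed: walk the triangle list, intersect each triangle's edges with the z = 0 plane, and stitch the resulting segments into a loop. A minimal Python sketch of the first step follows; it assumes the mesh is already available as a plain list of triangles, and all names are illustrative:

        def plane_cross_segments(triangles):
            """triangles: iterable of triangles, each a tuple of three (x, y, z) vertices.
            Returns the 2D segments where each triangle crosses the slicing plane z = 0."""
            segments = []
            for tri in triangles:
                points = []
                for i in range(3):
                    (x1, y1, z1), (x2, y2, z2) = tri[i], tri[(i + 1) % 3]
                    if (z1 > 0) != (z2 > 0):       # this edge straddles z = 0
                        t = z1 / (z1 - z2)         # parameter where the edge meets the plane
                        points.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
                if len(points) == 2:               # triangle genuinely crosses the plane
                    segments.append((points[0], points[1]))
            return segments

    Chaining segments that share endpoints (within a small tolerance) then yields the closed cross-section polygon to hand to the 2D physics engine.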

    Read the article

  • Problems with CGPoint/touches event

    - by Jason
    I'm having some problems with storing variables from my touch events. The warning I get when I run this is that coord and icoord are unused, even though I use them in the viewDidLoad implementation. Is there a reason why this does not work? Any suggestions?

        #import "iGameViewController.h"

        @implementation iGameViewController
        @synthesize player;

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint icoord = [touch locationInView:touch.view];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint coord = [touch locationInView:touch.view];
        }

        - (void)viewDidLoad {
            if (coord.x > icoord.x) {
                player.center = CGPointMake(player.center.x + 5, player.center.y);
            }
            if (coord.x < icoord.x) {
                player.center = CGPointMake(player.center.x - 5, player.center.y);
            }
            if (coord.y > icoord.y) {
                player.center = CGPointMake(player.center.x, player.center.y - 5);
            }
            if (coord.y < icoord.y) {
                player.center = CGPointMake(player.center.x, player.center.y + 5);
            }
        }

    Thanks.

    Read the article

  • How do I match the movement of an object from 2D video into a 3D package?

    - by George Profenza
    I'm trying to add objects to a 3D package (Blender) using recorded footage. I've played with Icarus and it's great for capturing the camera movement, and its Blender 2.41 importer script works in Blender 2.49 as well. The problem is I can't seem to get 3D coordinates for objects. I have tried Autodesk (RealViz) MatchMover 2011 and gone through the tutorials. Tutorial 3 shows how to link a vertex of a 3D mesh to a 2D track point, but that setup is for camera movement. Tutorial 4 goes into motion capture, but it uses two videos of the same motion taken with two cameras from different viewpoints. I've tried to bypass that by using the same footage twice, but that failed: the 3D coordinate system ends up messed up. What software do you recommend for this (mapping 3D coordinates to 2D tracked points and importing them into a 3D package)? What is the recommended technique? Any good examples out there? Thanks, George

    Read the article

  • 3D plotting in Ubuntu

    - by Bakhtiyor
    I have Ubuntu 10.10 installed and need to plot a 3D graphic. I have installed several free applications available in the repository, like QtiPlot and GNU Octave, and managed to create the following graphic. Now I have to show, in the same graphic, the positions of my experiment results, which consist of elements with three parameters: X, Y and Z coordinates that were calculated with the same function as the graphic above. Any idea how to do that? It would be better if you could propose a solution using free apps, because the alternatives (such as Maple or MATLAB) are proprietary. Thank you very much. UPDATE 1: The final result should look more or less like the example image in the original post.
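    Not a QtiPlot or Octave recipe, but since the question allows any free tool, here is a minimal sketch of the general pattern in Python with matplotlib: draw the surface, then overlay the measured points as a 3D scatter. The surface function and the experiment values below are placeholders, not the ones from the question:

        import numpy as np
        import matplotlib.pyplot as plt
        from mpl_toolkits.mplot3d import Axes3D  # registers the '3d' projection

        # Placeholder surface; substitute the function used for the existing graphic.
        X, Y = np.meshgrid(np.linspace(-3, 3, 60), np.linspace(-3, 3, 60))
        Z = np.sin(np.sqrt(X**2 + Y**2))

        # Hypothetical experiment results: X, Y and Z coordinates of each element.
        exp_x, exp_y, exp_z = [0.5, 1.2, -2.0], [1.0, -0.4, 2.1], [0.9, 0.99, -0.2]

        fig = plt.figure()
        ax = fig.add_subplot(111, projection='3d')
        ax.plot_surface(X, Y, Z, alpha=0.5)             # the computed surface
        ax.scatter(exp_x, exp_y, exp_z, color='red')    # measured points on top
        plt.show()

    The same overlay idea works in Octave as well: draw the surface with mesh or surf, then hold on and add the points with plot3.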

    Read the article

  • Render a 3D image as a 2D vector image

    - by Clinton Blackmore
    Is there any software that will take a 3D model (in any format) and allow you to render it as a 2D vector image (preferably as either an .SVG or .PDF)? My intention is to render LEGO building instructions this way. While there are many tools that let you view them or generate nice, rasterized output, I'd really like to be able to generate vectorized output. Textures are not needed, and hidden line removal may not be needed. I can use a tool that works on any platform (although my preference is OS X, then Linux, then Windows). Open source is preferred. If no one knows of a tool that does this, does anyone have a good recommendation of something to hack on and add a feature to output via Cairo?

    Read the article

  • Restore iPod Touch

    - by Jason
    OK, so I'm part of the iOS developer program and downloaded the iOS 6.0 beta to my iPod Touch. But I was having trouble getting the Xcode beta to run, so I tried to downgrade back to the current 5.1.1. Halfway through the downgrade the process encountered an error, and now my iPod won't boot. I tried restoring it in iTunes, but I think it's trying to restore to version 6.0, not finding that on Apple's servers since it's not out yet, and failing. Any ideas on how to restore my iPod?

    Read the article

  • Google I/O 2010 - The SketchUp 3D API

    Google I/O 2010 - The SketchUp 3D API: Working with 3D geospatial data (Geo 201, Matt Lowrie). The world is a three-dimensional space; your geospatial applications should be showing it that way. This session shows how to create 3D data in Building Maker and then use the SketchUp API to customize that data to fit your needs. For all I/O 2010 sessions, please go to code.google.com. (Video: 58:28, from GoogleDevelopers.)

    Read the article

  • Is there a standard way to store 3D meshes to easily communicate between libraries?

    - by awiebe
    In a 3D game lots of different systems need to know about geometry data, but the only representation they all seem to agree on is an array of triangles. Can anyone recommend a good geometry manipulation library that will let me easily integrate the drawing library (OpenGL), the physics engine (Bullet), serialization (several 3D file formats) and my own code (Objective-C++)? The focus is on a representation shared between the drawing library and the physics engine. Also, if the library can triangulate a mesh definition, that would be very helpful. My code can work around whatever already exists.

    Read the article

  • What can I use as a 3D tile map editor?

    - by alfa64
    I need to make grid-based levels with 3D models for a dungeon crawler (a recent example would be Legend of Grimrock), and I need to have several layers and to place entities with properties such as position, angle, etc. I was considering Tiled, using its layers as height levels, but it's very hard to work with and visualize. What can I use for this purpose? The output format needs to be JSON, XML, or something else I can use in my engine. Ideally I'd want something like Tiled with a 3D visualization/edit mode and support for loading models, or at least some visual representation of them.

    Read the article

  • How to enable 3D acceleration under VMware workstation 8?

    - by Yan Zhou
    I saw there are similar questions, but I don't think they answer mine exactly. Has anyone managed to get 3D acceleration working under VMware Workstation 8? I have VMware 8.0.1 installed on Ubuntu 11.10, and the guest I am trying to run is also Ubuntu 11.10. I manually installed vmware-tools and it went well, except that the X-config part was skipped because it said the distribution's driver is being used. The guest runs well but seems to fall back to 2D mode. Does anyone have any idea how to enable 3D acceleration under VMware Workstation with a Linux guest?
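    One thing worth double-checking, offered as a hint rather than a confirmed fix: 3D also has to be enabled on the host side for that particular VM. That is the "Accelerate 3D graphics" option in the VM's display settings, which corresponds to this line in the VM's .vmx file:

        mks.enable3d = "TRUE"

    Beyond that, the guest still needs a Mesa/driver stack that can use the virtual GPU, which is where Linux guests most often fall back to software rendering.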

    Read the article

  • Why can't I use Unity 3D on a ATI Mobility Radeon HD 4200 Series with FGLRX drivers?

    - by user88257
    With a brand new install of 12.04 on my Dell Inspiron M5030, Unity 3D appears to load everything except the icons in the top left, and I am unable to click on anything. Unity 2D seems to work fine, however. I have done nothing except install the Synaptic package manager and follow this guide to install the FGLRX drivers. Under Settings → Additional Drivers, the driver shows as installed and functioning. Also, after running /usr/lib/nux/unity_support_test -p, I get this:

        OpenGL vendor string:   ATI Technologies Inc.
        OpenGL renderer string: ATI Mobility Radeon HD 4200 Series
        OpenGL version string:  3.3.11653 Compatibility Profile Context

        Not software rendered:    yes
        Not blacklisted:          yes
        GLX fbconfig:             yes
        GLX texture from pixmap:  yes
        GL npot or rect textures: yes
        GL vertex program:        yes
        GL fragment program:      yes
        GL vertex buffer object:  yes
        GL framebuffer object:    yes
        GL version is 1.4+:       yes

        Unity 3D supported:       yes

    I am unsure as to what the problem is. I have tried the non-proprietary drivers and could not get them to work either, but I would be willing to try again.

    Read the article

  • How can I create a 3D model in Java without using modeling software?

    - by Galen Nare
    I am a lightly experienced game developer and this is my first time trying 3D objects in Java. I have recently been creating and updating games using AWT, Swing, and Graphics, but I want to delve further into Java. I have looked into Java3D, but it's not what I want. I want to use images, crop them, and place the resulting textures in their respective places. I already know how to do the cropping and 2D image editing, but how do I go 3D?

    Read the article

  • Is the AOC e2239fwt supported for multi-touch on any Ubuntu distro?

    - by HybriDPjT
    As the title says, I have the e2239fwt monitor and I've tried Ubuntu 10.04, 10.10, 11.04, 12.04 and now 13.04, and I can't get it to work. I should state that single-point touch seems to work OK, but that's all. I've already tried looking and found no answers, so here I am asking the people in the know. :) I am currently running 13.04 and will possibly go back to 10.04 if I can't get it to work, or if I find that this monitor is in fact not supported.

        hybridpjt@Unicorn:~$ lsusb
        Bus 002 Device 002: ID 05e3:0610 Genesys Logic, Inc. 4-port hub
        Bus 003 Device 002: ID 045e:0780 Microsoft Corp.
        Bus 003 Device 003: ID 06a3:0cc3 Saitek PLC
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 002 Device 003: ID 0408:3001 Quanta Computer, Inc. Optical Touch Screen

    Read the article

  • How to create a 3D world with 2D sprites, similar to Ragnarok Online?

    - by Romoku
    As far as I know, Ragnarok Online is a 3D game world with 2D sprites overlaid. I would like to use this style in a game I am making in Unity, and I would like the player to be able to select little square tiles on the terrain. There are a couple of routes I could take, such as using a bunch of cubic polygons and linking them together, or using one big map. The former approach doesn't seem to make sense if the world is not flat, as polygons wouldn't be reused often. The goal is to break a 3D polygon down into tiles, which is hard to wrap my head around. I believe something like an interval tree or an array would be appropriate to store the rectangular grid, but how would I display a rectangle around the tile the player has his mouse over on the polygon terrain itself? Here is a screenshot. Here is a gameplay video. Here is the camera usage.
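    Whatever the engine (in Unity this would be C#), tile selection usually comes down to casting a ray from the mouse against the terrain and snapping the hit point to grid coordinates, then drawing a highlight quad at that cell. A small illustrative sketch of just the snapping half, in Python, with a made-up tile size:

        import math

        TILE_SIZE = 1.0  # world units per tile (illustrative)

        def world_to_tile(x, z):
            """Return the (column, row) of the tile containing the world point (x, z)."""
            return math.floor(x / TILE_SIZE), math.floor(z / TILE_SIZE)

        def tile_to_world(col, row):
            """Return the minimum-corner (x, z) of a tile, e.g. to place a highlight quad."""
            return col * TILE_SIZE, row * TILE_SIZE

        col, row = world_to_tile(12.7, -3.2)        # -> (12, -4)
        print(col, row, tile_to_world(col, row))

    The highlight itself can then be a thin quad (or projected decal) positioned at tile_to_world and conformed to the terrain height under that cell.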

    Read the article

  • Love2D engine for Lua; What about 3D?

    - by shadowprotocol
    Lua has been really awesome to learn, it's so simple. I really enjoy scripting languages, and I had an equally enjoyable time learning Python. The Love engine, http://love2d.org/, is really awesome, but I'm looking for something that can handle 3D as well. Is there anything that accommodates 3D in Lua? I'm still intrigued by the particle system of LOVE anyway and may just turn my idea into a 2D project with Particle lighting :) EDIT: I removed comments about Python - I want this to be a Lua topic. Thanks

    Read the article

  • Projective geometry: how do I turn a 3D projection of a rectangle into a 2D view?

    - by bonomo
    So the problem is that I have a 3D projection of a rectangle that I want to turn into 2D. That is, I have a photo of a sheet of paper lying on a table, and I want to transform it into a 2D view of that sheet. What I need is an undistorted 2D image, obtained by reverting all of the 3D/projection transformations and getting a plain view of the sheet from the top. I happened to find some directions on the subject, but they don't give concrete instructions on how to achieve this. It would be really helpful to get a step-by-step description of what needs to be done, or alternatively a link to a resource that breaks it down into the details. Thank you
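    For reference, the standard planar-homography recipe is: pick the four corners of the sheet in the photo, map them to the corners of a flat output rectangle, and warp the image with the resulting 3x3 homography. A sketch with Python and OpenCV; the file name and corner coordinates below are made up, and in practice you would click or detect the real corners:

        import cv2
        import numpy as np

        img = cv2.imread("photo_of_sheet.jpg")            # hypothetical input photo

        # Corners of the sheet as seen in the photo, in a consistent order
        # (top-left, top-right, bottom-right, bottom-left). Made-up pixel values.
        src = np.float32([[120, 85], [930, 140], [870, 690], [70, 610]])

        # Where those corners should land: a flat, paper-proportioned rectangle.
        w, h = 840, 1188
        dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])

        H = cv2.getPerspectiveTransform(src, dst)         # 3x3 perspective transform
        flat = cv2.warpPerspective(img, H, (w, h))        # undistorted top-down view
        cv2.imwrite("sheet_flat.png", flat)

    The output aspect ratio is a choice; if the true proportions of the sheet are known (e.g. A4), use them for w and h so the result is not stretched.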

    Read the article

  • How to render retro-like pixel graphics from 3D models?

    - by momijigari
    I was wondering if there's a way to render retro-pixel-like graphics from a 3D model in real time? I'm talking about Starfarer-like graphics. I know that game is hand-drawn and 2D, but what if I need 3D objects with the same aesthetics? I'm currently working with Flash, but I don't need any ready-made solutions; I just want to understand the principle, from any other platform if there is one. So if anybody has met anything like this, I would appreciate your help. (If it's not possible to do in real time, I could at least pre-render a sequence of sprites. It would be much better than creating hundreds of hand-drawn ones.)
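    The usual principle, platform aside: render the 3D scene into a small offscreen target (say 160x120), then scale that target up to the window with nearest-neighbour (non-smoothed) filtering, optionally quantising colours to a small palette. A toy sketch of just the upscaling half in Python/pygame, where a filled circle stands in for the rendered 3D frame:

        import pygame

        LOW_RES, WINDOW = (160, 120), (640, 480)

        pygame.init()
        screen = pygame.display.set_mode(WINDOW)
        target = pygame.Surface(LOW_RES)        # stands in for the 3D render target

        running = True
        while running:
            for event in pygame.event.get():
                if event.type == pygame.QUIT:
                    running = False

            target.fill((20, 20, 40))
            pygame.draw.circle(target, (220, 120, 60), (80, 60), 30)   # "the 3D frame"

            # pygame.transform.scale does no smoothing, so every low-res pixel
            # becomes a crisp block on screen.
            pygame.transform.scale(target, WINDOW, screen)
            pygame.display.flip()

        pygame.quit()

    In a real 3D engine the same idea is a low-resolution render target (framebuffer object) drawn to the screen as a single quad with nearest-neighbour texture filtering.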

    Read the article

  • XNA: draw a sprite in 3D, is that possible?

    - by Heisenbug
    Until now I have always used sprites to draw in 2D:

        spriteBatch.Draw(myTexture, rectangle, color);

    (I suppose the texture is bound internally to two triangles and then scaled.) Now I'm porting my game to 3D and I have to draw several planes (walls, floor, roof, ...). Do I need to manually bind a texture to geometry (for example using VertexPositionColorTexture with a VertexBuffer and an IndexBuffer), or is there a simpler way to do it? I'm looking for something like spriteBatch.Draw with the clip rectangle specified in 3D space:

        spriteBatch.Draw(myTexture, rectangleIn3D, color);

    Read the article

  • Retrieve the coordinates of the *occluding* (closest/drawn) pixels during 3D overlap, using OpenGL?

    - by Big Rich
    Hi, sorry if the question is not worded well; I'm new to both 3D and OpenGL. How could I go about obtaining the 3D coordinates of the occluding object at the point where the occlusion is happening (i.e. the 'intersection' on the object that is in front, closest to the screen)? To offer a [very] rudimentary visual example: if you were to form an index-finger cross, with your right hand closest to your face, I'd like to know the coordinates of the part of your right finger which obscures the other finger (obviously back within the OpenGL context - no jokers ;-) ). If there is a way to find out about both the occluder (hider) and the occluded (hidden) objects in OpenGL, that would be of great use also. Cheers, Rich
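    Assuming the classic fixed-function pipeline, the depth buffer already answers the occluder half: at any window pixel it stores the depth of whichever fragment won the depth test, i.e. the closest drawn surface. Reading it back and unprojecting gives that surface's 3D position. A sketch with PyOpenGL, to be called after the scene has been rendered; the exact shape of the array returned by glReadPixels can vary between PyOpenGL versions:

        from OpenGL.GL import (glReadPixels, glGetDoublev, glGetIntegerv,
                               GL_DEPTH_COMPONENT, GL_FLOAT,
                               GL_MODELVIEW_MATRIX, GL_PROJECTION_MATRIX, GL_VIEWPORT)
        from OpenGL.GLU import gluUnProject

        def occluder_position(win_x, win_y):
            """3D (object-space) position of the front-most fragment at a window pixel.
            win_y follows OpenGL's convention: origin at the bottom of the window."""
            depth = float(glReadPixels(win_x, win_y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT)[0][0])
            if depth == 1.0:
                return None                        # nothing was drawn at this pixel
            modelview = glGetDoublev(GL_MODELVIEW_MATRIX)
            projection = glGetDoublev(GL_PROJECTION_MATRIX)
            viewport = glGetIntegerv(GL_VIEWPORT)
            return gluUnProject(win_x, win_y, depth, modelview, projection, viewport)

    The occluded (hidden) surface is no longer in the depth buffer, so getting its coordinates needs something extra - for example a second pass with the occluder hidden, or a geometric ray cast - which is beyond this sketch.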

    Read the article

  • Low-level 10-finger multi-touch data on the Nexus 7?

    - by Croad Langshan
    I'm considering getting a Nexus 7 to do some multi-touch development on Ubuntu in the run-up to 13.04 (i.e., now :-). What APIs, /dev files, or protocols are available, or could be made available without too much work on my part? What data is available from the device? The data I want to get my hands on is, if I can, the same as I get from /dev/input/event* for an Apple Magic Trackpad, viz:

    - positions of all touches (could be as many as 10 simultaneous touches, but much more typically 6 or fewer)
    - their size/pressure (in both x and y directions)
    - their angle
    - their identity - i.e. an integer that is somewhat reliably preserved across touch events, for as long as a finger doesn't lift off the surface

    Not all of this data is essential - but the more of it there is, the merrier.
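    For what it's worth, on the Linux side this kind of data is exposed through the kernel's multi-touch evdev protocol (type B, slot-based), provided the touchscreen driver reports it; whether pressure and orientation axes exist at all depends on the hardware. A sketch using the python-evdev package; the device path below is a guess, so pick the touchscreen's actual /dev/input/event* node (evtest helps find it):

        from evdev import InputDevice, ecodes

        dev = InputDevice('/dev/input/event4')    # hypothetical node; yours will differ

        slots = {}       # slot number -> properties of the contact currently in it
        current = 0      # slot that subsequent ABS_MT_* events refer to

        for event in dev.read_loop():
            if event.type != ecodes.EV_ABS:
                continue
            if event.code == ecodes.ABS_MT_SLOT:
                current = event.value
            elif event.code == ecodes.ABS_MT_TRACKING_ID:
                if event.value == -1:
                    slots.pop(current, None)                  # finger lifted
                else:
                    slots[current] = {'id': event.value}      # stable identity per contact
            elif event.code == ecodes.ABS_MT_POSITION_X:
                slots.setdefault(current, {})['x'] = event.value
            elif event.code == ecodes.ABS_MT_POSITION_Y:
                slots.setdefault(current, {})['y'] = event.value
            elif event.code == ecodes.ABS_MT_TOUCH_MAJOR:
                slots.setdefault(current, {})['size'] = event.value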

    Read the article
