Search Results

Search found 7346 results on 294 pages for 'touch flo 3d'.


  • Reverse-projection 2D points into 3D

    - by ehsan baghaki
    Suppose we have a 3D space containing a plane with an arbitrary equation: ax + by + cz + d = 0. Now suppose we pick three random points on that plane: (x0,y0,z0), (x1,y1,z1), (x2,y2,z2). I also have a different point of view (a camera) looking at this plane, and from that camera's point of view these points have different locations: (x0,y0,z0) becomes (x0',y0'), (x1,y1,z1) becomes (x1',y1'), and (x2,y2,z2) becomes (x2',y2'). So here is my slightly hard question: I want to pick a point, say (X,Y), in the new camera view and tell where it lies on that plane. All I know is those three points: their locations in 3D space and their projected locations in the new camera view.

    Do you know the coefficients of the plane equation and the camera position (along with the projection), or do you only have the six points? - Nils

    I know the locations of the first three points, so we can calculate the coefficients of the plane; we know exactly where the plane is relative to the origin. The camera, however, can only see the points: the only thing it sees is the three points, although it also knows their locations in 3D space (and, of course, their locations on its 2D view plane). Given all that, I want to look at the camera view, pick a point (for example (x1,y1)), and tell where that point lies on the plane (this (X,Y,Z) point must satisfy the plane equation). I know nothing about the camera location.
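
    One way to attack this with only the three correspondences, sketched in Java (class and method names are mine, not from the question): express the picked point (X,Y) in the affine coordinates of the three projected points, then apply the same coordinates to the 3D triangle. This is exact when the camera projection is affine (orthographic-like); under real perspective it is only an approximation, and recovering the exact map would require a homography, hence a fourth correspondence.

        /** Affine back-projection sketch: maps a 2D camera-view point onto the
         *  3D plane via the three known correspondences. */
        public class PlaneBackProjection {

            /** p0..p2 are the 2D projections, P[0..2] the matching 3D points
             *  on the plane, q the picked 2D point. */
            public static double[] backProject(double[] p0, double[] p1, double[] p2,
                                               double[][] P, double[] q) {
                // Solve q = p0 + s*(p1 - p0) + t*(p2 - p0) for (s, t) by Cramer's rule
                double ax = p1[0] - p0[0], ay = p1[1] - p0[1];
                double bx = p2[0] - p0[0], by = p2[1] - p0[1];
                double qx = q[0] - p0[0], qy = q[1] - p0[1];
                double det = ax * by - ay * bx;   // zero if the projections are collinear
                double s = (qx * by - qy * bx) / det;
                double t = (ax * qy - ay * qx) / det;
                // Apply the same affine coordinates to the 3D triangle
                double[] X = new double[3];
                for (int i = 0; i < 3; i++) {
                    X[i] = P[0][i] + s * (P[1][i] - P[0][i]) + t * (P[2][i] - P[0][i]);
                }
                return X;   // satisfies ax + by + cz + d = 0 by construction
            }
        }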

    Read the article

  • Which way to go in Linux 3D programming?

    - by Tek
    I'm looking for some answers for a project I'm thinking of. From what I've searched and understood (correct me if I'm wrong), the only way the program I want to make will work is as a 3D application. Let me explain. I plan to make a studio production program, but what makes it unique is that I want it to feel fluid. Imagine Microsoft's Surface, where you can touch and drag pictures across the screen; instead of pictures, I want sound samples (WAVs, MP3s, etc.). The input will be the mouse for now, but if I ever finish the project I would definitely add touch-screen support! Anyway, I'm guessing there's "physics" involved, which is why I'm thinking that even though it will be a 2D application, I'll need to code it in a 3D environment. Assuming my approach is right, where can I start learning about 3D programming? I come from PHP, which should make C++ easier for me to learn, but I don't even know where to start. If I'm not wrong, OpenGL is the most up-to-date API as far as I know. Please give me your insights, guys; I could really use some guidance here, since I could be totally wrong about everything I wrote :)

    Read the article

  • Get the touch position inside the ImageView in Android

    - by Manikandan
    I have an ImageView in my activity, and through an OnTouchListener I can get the position where the user touches it; I place another image wherever the user touches. I need to store the touch position (x, y) and use it in another activity to show the tags. I stored the touch position in the first activity, where my ImageView is at the top of the screen; in the second activity it is at the bottom of the screen. If I use the position stored from the first activity, the tag image is placed at the top of the screen, not on the ImageView where I previously clicked. Is there any way to get the position relative to the ImageView itself?

    FirstActivity:

        cp.setOnTouchListener(new OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                Log.v("touched x val of cap img >>", event.getX() + "");
                Log.v("touched y val of cap img >>", event.getY() + "");
                x = (int) event.getX();
                y = (int) event.getY();
                tag.setVisibility(View.VISIBLE);

                int[] viewCoords = new int[2];
                cp.getLocationOnScreen(viewCoords);
                int imageX = x - viewCoords[0]; // viewCoords[0] is the x coordinate
                int imageY = y - viewCoords[1]; // viewCoords[1] is the y coordinate
                Log.v("Real x >>>", imageX + "");
                Log.v("Real y >>>", imageY + "");

                RelativeLayout rl = (RelativeLayout) findViewById(R.id.lay_lin);
                ImageView iv = new ImageView(Capture_Image.this);
                Bitmap bm = BitmapFactory.decodeResource(getResources(),
                        R.drawable.tag_icon_32);
                iv.setImageBitmap(bm);
                RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(
                        RelativeLayout.LayoutParams.WRAP_CONTENT,
                        RelativeLayout.LayoutParams.WRAP_CONTENT);
                params.leftMargin = x;
                params.topMargin = y;
                rl.addView(iv, params);

                Intent intent = new Intent(Capture_Image.this, Tag_Image.class);
                Bundle b = new Bundle();
                b.putInt("xval", imageX);
                b.putInt("yval", imageY);
                intent.putExtras(b);
                startActivity(intent);
                return false;
            }
        });

    In Tag_Image.java I used the following:

        im = (ImageView) findViewById(R.id.img_cam22);
        b = getIntent().getExtras();
        xx = b.getInt("xval");
        yy = b.getInt("yval");
        im.setOnTouchListener(new OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                int[] viewCoords = new int[2];
                im.getLocationOnScreen(viewCoords);
                int imageX = xx + viewCoords[0]; // viewCoords[0] is the x coordinate
                int imageY = yy + viewCoords[1]; // viewCoords[1] is the y coordinate
                Log.v("Real x >>>", imageX + "");
                Log.v("Real y >>>", imageY + "");

                RelativeLayout rl = (RelativeLayout) findViewById(R.id.lay_lin);
                ImageView iv = new ImageView(Tag_Image.this);
                Bitmap bm = BitmapFactory.decodeResource(getResources(),
                        R.drawable.tag_icon_32);
                iv.setImageBitmap(bm);
                RelativeLayout.LayoutParams params =
                        new RelativeLayout.LayoutParams(30, 40);
                params.leftMargin = imageX;
                params.topMargin = imageY;
                rl.addView(iv, params);
                return true;
            }
        });
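
    For what it's worth, the coordinates from event.getX()/getY() inside an OnTouchListener are already relative to the touched view, so the getLocationOnScreen() subtraction above corrects twice. One way to make the stored tag position survive a different layout is to save it as a fraction of the source view's size instead of in pixels, then rescale against the destination view once it has been laid out. A minimal sketch in Java (the helper class is hypothetical, not from the post):

        import android.view.View;

        /** Layout-independent tag position: store the touch as a fraction of the
         *  source view's size, then rescale against the destination view. */
        public final class TagPosition {
            private TagPosition() {}

            /** Convert a view-local touch (x, y) into size-independent fractions. */
            public static float[] toFraction(View source, float x, float y) {
                return new float[] { x / source.getWidth(), y / source.getHeight() };
            }

            /** Map stored fractions back to pixel coordinates inside another view.
             *  Call after layout (e.g. from target.post(...)), when sizes are nonzero. */
            public static int[] toPixels(View target, float[] frac) {
                return new int[] {
                    (int) (frac[0] * target.getWidth()),
                    (int) (frac[1] * target.getHeight())
                };
            }
        }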

    Read the article

  • Slow JavaScript touch events on Android

    - by oneself
    I'm trying to write a simple HTML-based drawing application (standalone simplified code attached below). I've tested it on the following devices:

    iPad 1 and 2: works great.
    ASUS T101 running Windows: works great.
    Samsung Galaxy Tab: extremely slow and patchy; unusable.
    Lenovo IdeaPad K1: extremely slow and patchy; unusable.
    Asus Transformer Prime: noticeable lag compared with the iPad; close to usable.

    The Asus tablet is running ICS; the other Android tablets are running 3.1 and 3.2. I tested using the stock Android browser, and I also tried the Android Chrome Beta, but that was even worse. My question is: why are the Android tablets so slow? Am I doing something wrong, is it an inherent problem with the Android OS or browser, or is there anything I can do about it in my code?

    multi.html:

        <html>
        <body>
        <style media="screen">
          canvas { border: 1px solid #CCC; }
        </style>
        <canvas id="draw" height="450" width="922"></canvas>
        <script class="jsbin" src="jquery.js"></script>
        <script src="multi.js"></script>
        </body>
        </html>

    multi.js:

        var CanvasDrawr = function(options) {
          // grab canvas element
          var canvas = document.getElementById(options.id),
              ctxt = canvas.getContext("2d");
          canvas.style.width = '100%';
          canvas.width = canvas.offsetWidth;
          canvas.style.width = '';
          // set props from options, but the defaults are for the cool kids
          ctxt.lineWidth = options.size || Math.ceil(Math.random() * 35);
          ctxt.lineCap = options.lineCap || "round";
          ctxt.pX = undefined;
          ctxt.pY = undefined;

          var lines = [,,];
          var offset = $(canvas).offset();
          var eventCount = 0;

          var self = {
            // Bind touch events
            init: function() {
              // Set pX and pY from first touch
              canvas.addEventListener('touchstart', self.preDraw, false);
              canvas.addEventListener('touchmove', self.draw, false);
            },
            preDraw: function(event) {
              $.each(event.touches, function(i, touch) {
                var id = touch.identifier;
                lines[id] = {
                  x: this.pageX - offset.left,
                  y: this.pageY - offset.top,
                  color: 'black'
                };
              });
              event.preventDefault();
            },
            draw: function(event) {
              eventCount += 1;
              $.each(event.touches, function(i, touch) {
                var id = touch.identifier,
                    moveX = this.pageX - offset.left - lines[id].x,
                    moveY = this.pageY - offset.top - lines[id].y;
                var ret = self.move(id, moveX, moveY);
                lines[id].x = ret.x;
                lines[id].y = ret.y;
              });
              event.preventDefault();
            },
            move: function(i, changeX, changeY) {
              ctxt.strokeStyle = lines[i].color;
              ctxt.beginPath();
              ctxt.moveTo(lines[i].x, lines[i].y);
              ctxt.lineTo(lines[i].x + changeX, lines[i].y + changeY);
              ctxt.stroke();
              ctxt.closePath();
              return { x: lines[i].x + changeX, y: lines[i].y + changeY };
            }
          };
          return self.init();
        };

        $(function() {
          var drawr = new CanvasDrawr({ id: "draw", size: 5 });
        });

    Read the article

  • Does the Wacom Bamboo Pen & Touch work out of the box?

    - by Emilien
    Is there any tweaking involved in Ubuntu 10.10 to make the Wacom Bamboo Pen & Touch work? And is this hardware getting some love from the new multitouch framework? If there's no multitouch support for it, I'd fall back on the simpler (and cheaper) Wacom Bamboo Pen (to draw, no multitouch)... ENAC's general list of Linux multitouch devices states the following regarding Wacom: "The 'wacom' kernel driver handles these, and is undergoing work to make it compliant with the kernel multitouch protocol." But is this also compatible with Ubuntu's multitouch protocol (which I understand is a separate effort from the kernel's)?

    Read the article

  • How can I enable Unity 3D after installing Bumblebee? GLX problems

    - by ashley
    I'm new to Ubuntu; I'm running 12.04 64-bit on a Dell XPS L207x with an NVIDIA GT 555M card. From what I could understand online, I needed to install Bumblebee to get the most out of the Optimus system and get better battery life. I can confirm that Bumblebee is working by running optirun glxgears, for example, but if I run plain glxgears I get the following error: "Error: couldn't get an RGB, Double-buffered visual." I'm also unable to run Unity 3D, which I would very much like to use. I'd greatly appreciate any and all help; please be gentle.

    Read the article

  • 3D Display Issue When Using Latest Java Runtime Versions - Patch now available...

    - by [email protected]
    Typically I focus my blog posts on Support process topics and reserve most of the technical topics for the Support newsletter. This topic, however, warrants a quick mention in the blog, since I know it's been affecting many users recently. For customers using the Client/Server Deployment of AutoVue, users who had upgraded their client Java Runtime Environment (JRE) to version 1.6.0_19 or later suddenly noticed that their 3D files were opening blank in AutoVue. This issue was due to a change in JRE version 1.6.0_19, and the AutoVue team now offers a patch to address it in AutoVue version 20.0.0. The patch number is 10268316; it is available through the My Oracle Support portal and is described further in KM Note 1104821.1. We'll mention it again in our next Support newsletter, and the AutoVue team plans to roll the same fix into the next available release of the product.

    Read the article

  • Trying to install Ubuntu Touch to Nexus 4 - can't connect device (14.04 in VirtualBox on Mac)

    - by Fiona Cox
    I'm trying to install Ubuntu Touch on a Nexus 4. I've followed all the steps so far: downloaded the packages and gotten the phone ready. But now I've reached the step of connecting it to the computer (under 'Enable USB Debugging'), and the device isn't listed when I run $ adb devices (I tried the adb kill-server command first too, but nothing is listed). I'm sure it's something simple I'm forgetting; can anyone help, please? I have debugging enabled, and I'm running Trusty in VirtualBox on a MacBook. Thanks!

    Read the article

  • No Microphone error on iPod Touch

    - by Bob Vork
    I've built an iPhone app that should work on an iPod Touch as well, but I'm getting reports that it is not working on iPod Touches: it displays an error message saying there's no mic available on the device. The thing is, the app does nothing whatsoever with audio, and I can't find anything related in the project settings. The other problem is that I don't have an iPod Touch available to test this myself. Are some people running an old firmware version? Am I compiling for the wrong firmware version? To my surprise I couldn't find anything about this on SO or Google. Any help is appreciated.

    Read the article

  • iPhone: CALayer + rotate in 3D + antialias?

    - by Colin
    Hi all, an iPhone SDK question: I'm drawing a UIImageView on the screen. I've rotated it in 3D and added a bit of perspective, so the image looks like it's pointing into the screen at an angle. That all works fine. The problem is that the edges of the resulting picture don't seem to be antialiased at all. Does anybody know how to make them so? Essentially, I'm implementing my own version of Cover Flow (yeah yeah, design patent, blah blah) using Quartz 3D transformations to do everything. It works fine, except that each cover isn't antialiased, and Apple's version is. I've tried messing around with the edgeAntialiasingMask of the CALayer, but that didn't help; the default is that every edge should be antialiased... thanks!

    Read the article

  • touchesEnded not being called??? or randomly being called

    - by Rob
    If I lift my finger up off the first touch, the next touch is recognized just fine. It's only when I hold the first touch down continuously and then try to touch a different area with a different finger at the same time: it then incorrectly registers that second touch as coming from the first touch again.

    Update: it has something to do with touchesEnded not being called until the very LAST touch has ended (it doesn't care if five other touches ended before you finally let go of the last one; it reports them all as ending once the very last touch ends).

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            NSString *filename = [listOfStuff objectAtIndex:[touch view].tag];
            // do something with the filename now
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            NSString *buttonPressed = [listOfStuff objectAtIndex:[touch view].tag];
            // do something with this info now
        }

    Read the article

  • iPhone: Activate UISlider and set its value to the location of the current touch programmatically

    - by carloe
    Is it possible to make a UISlider first responder and set its current value to the location of the current touch programmatically? The way my app is set up, I have a UIView container that takes up the whole screen. Inside the container I have another UIView offscreen at the bottom edge (I'll call this bottomBar), and inside the bottomBar there is a UISlider element. Right now, when the user swipes along the bottom edge of the screen, the bottomBar and the slider it contains slide up. What I am trying to achieve is to activate the UISlider and set the position of the slider (its value) to the position of the user's touch. Is this possible? Could someone please point me in the right direction?

    Read the article

  • How to get a continuous Touch Event?

    - by daliz
    My class extends View and I need to get continuous touch events on it. If I use:

        public boolean onTouchEvent(MotionEvent me) {
            if (me.getAction() == MotionEvent.ACTION_DOWN) {
                myAction();
            }
            return true;
        }

    ...the touch event is captured once. What if I need to get continuous touches without moving the finger? Please tell me I don't need to use threads or timers; my app is already heavy enough. Thanks.
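
    One common answer, sketched below with an assumed 50 ms repeat interval: let the main thread's Handler re-post a Runnable between ACTION_DOWN and ACTION_UP. No extra thread is created, and nothing runs while the finger is up, so it stays cheap.

        import android.content.Context;
        import android.os.Handler;
        import android.os.Looper;
        import android.view.MotionEvent;
        import android.view.View;

        /** Fires myAction() repeatedly while the finger stays down. */
        public class HoldView extends View {
            private static final long REPEAT_MS = 50;   // tune to taste
            private final Handler handler = new Handler(Looper.getMainLooper());
            private final Runnable repeater = new Runnable() {
                @Override public void run() {
                    myAction();
                    handler.postDelayed(this, REPEAT_MS);   // reschedule until cancelled
                }
            };

            public HoldView(Context context) { super(context); }

            @Override public boolean onTouchEvent(MotionEvent me) {
                switch (me.getAction()) {
                    case MotionEvent.ACTION_DOWN:
                        handler.post(repeater);              // start repeating immediately
                        return true;
                    case MotionEvent.ACTION_UP:
                    case MotionEvent.ACTION_CANCEL:
                        handler.removeCallbacks(repeater);   // stop when the finger lifts
                        return true;
                }
                return super.onTouchEvent(me);
            }

            private void myAction() { /* per-tick work goes here */ }
        }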

    Read the article

  • Translation/rotation of a HUD against a camera using vectors in Euclidean 3D space

    - by Jakob
    I've got two points in 3D space: the camera position and the camera lookAt. The camera movement is restricted as in typical first-person shooters: you can move the camera freely, tilt horizontally and up to 90 degrees vertically, but not roll. Now I want to draw a HUD on the screen, on which I can move the mouse freely, with the position of the cursor correctly translating into 3D space. The easy part was drawing something directly in front of the camera:

        V0 = camPos
        V1 = lookAt
        V2 = lookAt - camPos
        normalize V2
        multiply V2 according to the camera frustum
        V3 = V0 + V2
        draw something at V3

    Now the part I don't get: I could take V3 and add to it the rotations of the camera combined with the x/y of the mouse cursor, somehow, right? That's what I want.
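
    A sketch of the missing step in Java (names mine, not from the question): derive the camera's right and up axes from the forward vector and a world-up reference, which is legitimate here precisely because the camera cannot roll, then offset the point in front of the camera along those two axes by the mouse coordinates.

        /** Places a HUD cursor in world space from camera position/lookAt plus
         *  mouse coordinates. Assumes a no-roll camera and a y-up world. */
        public class HudMath {
            static double[] sub(double[] a, double[] b) {
                return new double[] { a[0]-b[0], a[1]-b[1], a[2]-b[2] };
            }
            static double[] cross(double[] a, double[] b) {
                return new double[] { a[1]*b[2]-a[2]*b[1],
                                      a[2]*b[0]-a[0]*b[2],
                                      a[0]*b[1]-a[1]*b[0] };
            }
            static double[] norm(double[] v) {
                double l = Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
                return new double[] { v[0]/l, v[1]/l, v[2]/l };
            }

            /** mouseX/mouseY scaled to the frustum's half-extents at 'distance'. */
            static double[] cursorWorldPos(double[] camPos, double[] lookAt,
                                           double mouseX, double mouseY, double distance) {
                double[] forward = norm(sub(lookAt, camPos));
                double[] worldUp = { 0, 1, 0 };     // degenerates if pitch hits exactly 90 degrees
                double[] right = norm(cross(worldUp, forward)); // flip if your x axis mirrors
                double[] up = cross(forward, right);            // unit length by construction
                double[] p = new double[3];
                for (int i = 0; i < 3; i++) {
                    p[i] = camPos[i] + forward[i]*distance + right[i]*mouseX + up[i]*mouseY;
                }
                return p;
            }
        }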

    Read the article

  • Limit to area that receives touch events

    - by Typeoneerror
    Is there a bounding box on an application that receives touch events? I created a few sample round rect buttons and placed them in different places in my view. The ones in the center of the view receive touch events (and show the highlighted blue color), but if I place a button near the edges of the view, only parts of it are clickable in the simulator. Is this because of Apple's style guidelines? I placed a button exactly where a UITabNavigationItem would appear, and only the bottom half of it is clickable.

    Read the article

  • More complex view matrix calculation required to composite 3d models with 2d video

    - by lzcd
    I'm utilising some 2D/3D tracking data (provided by pfHoe) to help integrate 3D models into the playback of 2D video. Things are working... okay... but there's still some visible 'slipping' of the models against the video background, and I suspect this may be because the XNA CreatePerspective helper method isn't taking into account some of the additional data supplied by pfHoe, such as independent horizontal/vertical field-of-view angles and focal length. Would anyone be able to point me towards some examples of constructing view matrices that include such details?
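
    For illustration, a sketch (in Java, using OpenGL-style conventions, so adapt row order and handedness to XNA) of a perspective projection built from independent horizontal and vertical fields of view; XNA's Matrix.CreatePerspectiveFieldOfView takes a single vertical FOV plus an aspect ratio, so it cannot express the tracker's two angles independently, whereas building the matrix by hand can.

        /** Perspective projection from separate horizontal/vertical FOVs (radians). */
        public class AsymmetricFov {
            public static double[][] perspective(double fovX, double fovY,
                                                 double zNear, double zFar) {
                double fx = 1.0 / Math.tan(fovX / 2.0);   // horizontal focal scale
                double fy = 1.0 / Math.tan(fovY / 2.0);   // vertical focal scale
                return new double[][] {
                    { fx, 0,   0,                                0 },
                    { 0,  fy,  0,                                0 },
                    { 0,  0,   (zFar + zNear) / (zNear - zFar),
                               (2 * zFar * zNear) / (zNear - zFar) },
                    { 0,  0,  -1,                                0 }
                };
            }
        }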

    Read the article

  • Prevent status bar from receiving touch events

    - by Typeoneerror
    Edit: after further testing, it appears that the parts of my buttons that are not clickable are where the status bar used to be. I'm hiding the status bar with:

        // -- Override point for customization after app launch
        [[UIApplication sharedApplication] setStatusBarHidden:YES];

    But that area is still receiving the touches. Any idea how to disable this?

    Original question: is there a bounding box on an application that receives touch events? I created a few sample round rect buttons and placed them in different places in my view. The ones in the center of the view receive touch events (and show the highlighted blue color), but if I place a button near the edges of the view, only parts of it are clickable in the simulator. Is this because of Apple's style guidelines? I placed a button exactly where a UITabNavigationItem would appear, and only the bottom half of it is clickable.

    Read the article

  • Level of Detail for 3D terrains/models in Mobile Devices (Android / XNA )

    - by afriza
    I am planning to develop for WP7 and Android. What is the better way to display (and traverse) a 3D scene/model in terms of LoD? The data is planned to be island-wide (Singapore).

        1) Real-Time Dynamic Level of Detail Terrain Rendering
        2) Discrete LoD
        3) Others?

    Please advise on considerations/algorithms/resources/source code; something like the LoD book is also okay. Side note: I am a beginner in this area but pretty well-versed in C/C++, and I haven't read the LoD book. Related post: Distant 3D object rendering [games]
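
    As a concrete starting point, a minimal sketch of option 2 (discrete LoD) in Java; the distance thresholds are assumptions, and chunked-LoD quadtrees or geomipmapping refine the same per-chunk idea:

        /** Picks a mesh resolution per terrain chunk from camera distance. */
        public class DiscreteLod {
            // Distance thresholds (world units) for levels 0 (finest) upward
            private static final double[] THRESHOLDS = { 100, 400, 1600 };

            public static int levelFor(double[] camPos, double[] chunkCenter) {
                double dx = camPos[0] - chunkCenter[0];
                double dy = camPos[1] - chunkCenter[1];
                double dz = camPos[2] - chunkCenter[2];
                double dist = Math.sqrt(dx*dx + dy*dy + dz*dz);
                for (int level = 0; level < THRESHOLDS.length; level++) {
                    if (dist < THRESHOLDS[level]) return level;
                }
                return THRESHOLDS.length;   // beyond the last threshold: coarsest mesh
            }
        }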

    Read the article
