Search Results

Search found 4528 results on 182 pages for 'touch'.


  • Wakanda 3: new UI editor for touch and mobile, the JavaScript development platform demoed at JS.everywhere()

    4D launches the preview of Wakanda: a platform for developing and deploying 100% JavaScript applications. The French development-tools vendor 4D has just released a developer preview of its Wakanda solution. Wakanda is the first end-to-end platform dedicated to building web applications entirely in JavaScript, on the client side as well as the server side. These applications are then accessible from any desktop or mobile browser. Wakanda is therefore an integrated environment for developing and deploying business applications on the internet using 100% JavaScript...

    Read the article

  • Every week, a bit of code to go further with Windows 7: multi-touch

    As the year draws to a close, the Developpez.com community has teamed up with Microsoft France to relay a series of questions and answers on Windows 7 development. Starting today, we will post a question every Monday about a feature specific to Windows 7 application development. The correct answer to the week's question will then be revealed the following week, along with a practical example. Are you ready to take up the challenge? Do you think you know the possibilities offered by the Windows 7 APIs? That's what we're about to find out, starting today; we look forward to your answers! This week's answer: Windows 7 arrived with...

    Read the article

  • JavaFX 2.2.4 Documentation

    - by user12610255
    JavaFX 2.2.4 and JDK 7u10 were released on Tuesday. In addition to the release documentation, the following new information is provided: A new document, Using the Image Ops API, describes how to read and write raw pixel data to and from JavaFX images. The Handling JavaFX Events document has been updated with more information on touch events. The Working with Touch Events chapter and Touch Events sample provide information about handling individual touch points to provide sophisticated responses to touch actions. The Implementing Best Practices document has been updated to include information about running tasks on background threads. The Troubleshooting section of Deploying JavaFX Applications now includes a section about disabling the automatic proxy configuration in your application code. Other documents were updated to reflect minor bug fixes. You can download JavaFX 2.2.4 from OTN. For all tutorials and API documentation, see http://docs.oracle.com/javafx.

    Read the article

  • How to fix flicker when using Webkit transforms & transitions

    - by gargantaun
    I have a very simple demo working that uses WebKit transforms and transitions for smooth horizontal scrolling between 'panels' (divs). The reason I want to go this route, as opposed to a JavaScript-driven system, is that it's for the iPad and JavaScript performance is quite poor, but the CSS transforms and transitions are smooth as silk. Sadly though, I'm getting a lot of flicker on the iPad with my demo. You can see the demo here; you'll need Safari or an iPad to see it in action. I've never seen this happening in any of the demos for transforms and transitions, so I'm hopeful that this is fixable. Anyway, here's the code that powers the thing. The HTML looks like this:

        <html>
        <head>
            <title>Swipe Demo</title>
            <link href="test.css" rel="stylesheet" />
            <link href="styles.css" rel="stylesheet" />
            <script type="text/javascript" src="jquery.js"></script>
            <script type="text/javascript" src="functions.js"></script>
            <script type="text/javascript" src="swiping.js"></script>
        </head>
        <body>
            <div id="wrapper">
                <div class='panel one'>
                    <h1>This is panel 1</h1>
                </div>
                <div class='panel two'>
                    <h1>This is panel 2</h1>
                </div>
                <div class='panel three'>
                    <h1>This is panel 3</h1>
                </div>
                <div class='panel four'>
                    <h1>This is panel 4</h1>
                </div>
            </div>
        </body>
        </html>

    The CSS looks like this:

        body, html {
            padding: 0;
            margin: 0;
            background: #000;
        }

        #wrapper {
            width: 10000px;
            -webkit-transform: translateX(0px);
        }

        .panel {
            width: 1024px;
            height: 300px;
            background: #fff;
            display: block;
            float: left;
            position: relative;
        }

    and the JavaScript looks like this:

        // Mouse / iPad Touch
        var touchSupport = (typeof Touch == "object"),
            touchstart = touchSupport ? 'touchstart' : 'mousedown',
            touchmove = touchSupport ? 'touchmove' : 'mousemove',
            touchend = touchSupport ? 'touchend' : 'mouseup';

        $(document).ready(function(){

            // set top and left to zero
            $("#wrapper").css("top", 0);
            $("#wrapper").css("left", 0);

            // get total number of panels
            var panelTotal;
            $(".panel").each(function(){ panelTotal += 1 });

            // Touch Start
            // ------------------------------------------------------------------
            var touchStartX;
            var touchStartY;
            var currentX;
            var currentY;
            var shouldMove = false;

            document.addEventListener(touchstart, swipeStart, false);

            function swipeStart(event){
                touch = realEventType(event);
                touchStartX = touch.pageX;
                touchStartY = touch.pageY;
                var pos = $("#wrapper").position();
                currentX = parseInt(pos.left);
                currentY = parseInt(pos.top);
                shouldMove = true;
            }

            // Touch Move
            // ------------------------------------------------------------------
            var touchMoveX;
            var touchMoveY;
            var distanceX;
            var distanceY;

            document.addEventListener(touchmove, swipeMove, false);

            function swipeMove(event){
                if(shouldMove){
                    touch = realEventType(event);
                    event.preventDefault();
                    touchMoveX = touch.pageX;
                    touchMoveY = touch.pageY;
                    distanceX = touchMoveX - touchStartX;
                    distanceY = touchMoveY - touchStartY;
                    movePanels(distanceX);
                }
            }

            function movePanels(distance){
                newX = currentX + (distance/4);
                $("#wrapper").css("left", newX);
            }

            // Touch End
            // ------------------------------------------------------------------
            var cutOff = 100;
            var panelIndex = 0;

            document.addEventListener(touchend, swipeEnd, false);

            function swipeEnd(event){
                touch = (touchSupport) ? event.changedTouches[0] : event;
                var touchEndX = touch.pageX;
                var touchEndY = touch.pageY;
                updatePanelIndex(distanceX);
                gotToPanel();
                shouldMove = false;
            }

            // ------------------------------------------------------------------
            function updatePanelIndex(distance){
                if(distanceX > cutOff) panelIndex -= 1;
                if(distanceX < (cutOff * -1)){
                    panelIndex += 1;
                }
                if(panelIndex < 0){
                    panelIndex = 0;
                }
                if(panelIndex >= panelTotal) panelIndex = panelTotal - 1;
                console.log(panelIndex);
            }

            // ------------------------------------------------------------------
            function gotToPanel(){
                var panelPos = getTotalWidthOfElement($(".panel")) * panelIndex * -1;
                $("#wrapper").css("-webkit-transition-property", "translateX");
                $("#wrapper").css("-webkit-transition-duration", "1s");
                $("#wrapper").css("-webkit-transform", "translateX("+panelPos+"px)");
            }

        });

        function realEventType(event){
            e = (touchSupport) ? event.targetTouches[0] : event;
            return e;
        }

    Read the article

  • What are some useful SQL statements that should be known by all developers who may touch the back end?

    - by Jian Lin
    What are some useful SQL statements that every developer who may touch the back-end side of a project should know? (Update: just as in algorithms we know there are sorting problems and shuffling problems, and we know some solutions to them, this question is aiming at the same thing.) For example, the ones I can think of are: (1) get a list of employees and their boss, or employees whose salary is greater than their boss's (self-join); (2) get a list of the most popular classes registered by students, from the greatest count to the smallest (COUNT, GROUP BY, ORDER BY); (3) get a list of classes that no student has registered for (an outer join with a check for NULL matches, or selecting from the Classes table all ClassIDs that are NOT IN a subquery over the Registrations table). Are there some SQL statements that every developer who might touch back-end data should have up their sleeve?

    Read the article

  • iPad as a programming platform: What future do touch screens have in programming?

    - by user94154
    I read this question a few weeks ago, and I thought about it when I first saw the iPad. Do you think it would be possible to set up a development environment on the iPad? I think it would be awesome if there were an InstantRails app or a Django app; maybe even 280 North's Atlas could run on it :). Would you develop using an on-screen keyboard and a 10-inch screen? Steve Jobs seems to think touch screens are the future of web browsing. What future does touch have in programming?

    Read the article

  • Are there Vi/Vim users who aren't touch typists?

    - by michael
    I'm trying to write a Vim tutorial and I'd like to start by dismissing a few misconceptions, as well as giving some recommendations. I don't know if I should dismiss touch-typing as a misconception, or include it as a recommended prerequisite. At the time I learned the editor, I had already been touch typing for a couple of years, so I have absolutely no idea what the experience of a two-fingered typist in Vim would be. Are you a two-fingered Vim typist? What has your experience been like?

    Read the article

  • Cocoa (Touch) for Swing Developers #1: Where Are the Layouts?

    - by yar
    My iPhone SDK and Objective-C learning is moving ahead quickly, thanks to several great books and online help (including this one). But I do have some basic questions, prompted by what I already know from Swing, that will be answered eventually; I'd rather get a heads-up now if possible :) Are there equivalents of LayoutManagers in Cocoa Touch? Are they used, or is absolute positioning used instead? I have seen some of the layout stuff in IB, but I'm not sure what to look at in code. Aside from using IB, are UIControls added directly to UIView instances using addSubview (like add in Swing)? These are just two concrete questions that I've thought of just now, but I would love to see any translation of Swing concepts to Cocoa Touch.
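    A minimal sketch of the frame-based approach this question asks about, written in Swift rather than the Objective-C of the era; the view controller and label here are illustrative, not from the original post. At the time there was no Swing-style LayoutManager: controls were positioned with explicit frames plus coarse autoresizing masks, and added straight to a parent view with addSubview.

        import UIKit

        // Illustrative Swift sketch; the question's era would have used Objective-C.
        final class PanelViewController: UIViewController {
            override func viewDidLoad() {
                super.viewDidLoad()

                // Absolute positioning: the frame does the job a Swing layout manager would.
                let label = UILabel(frame: CGRect(x: 20, y: 40, width: 200, height: 30))
                label.text = "Hello from Cocoa Touch"

                // Autoresizing masks give coarse "struts and springs" resizing rules.
                label.autoresizingMask = [.flexibleWidth, .flexibleBottomMargin]

                // Controls are added directly to a parent view, much like Swing's add(...).
                view.addSubview(label)
            }
        }

    Interface Builder drives the same mechanism; it simply sets the frames and autoresizing masks for you.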

    Read the article

  • How does the trashcan utility work on a Tru64 UNIX server, or is there any other utility?

    - by RBA
    Hi, I used the mktrashcan command (mktrashcan deleteMe1 trashcan/) and then deleted all the contents of the deleteMe1 directory (rm -rf *). What happened is that only the two text files directly inside deleteMe1 (deleteMe2.txt, deleteMe3.txt) were moved into the trashcan folder; the subdirectories and the files inside them were not found! Is there any way to make everything that is deleted move to the trashcan directory in the same way? Or is there any other utility that can perform the same task in a more advanced way? The test tree was created like this:

        mkdir deleteMe1
        mkdir deleteMe1/deleteMe2
        mkdir deleteMe1/deleteMe3
        touch ./deleteMe1/deleteMe2/deleteMe4.txt
        touch ./deleteMe1/deleteMe2/deleteMe5.txt
        touch ./deleteMe1/deleteMe3/deleteMe6.txt
        touch ./deleteMe1/deleteMe3/deleteMe7.txt
        touch ./deleteMe1/deleteMe2.txt
        touch ./deleteMe1/deleteMe3.txt

    Thanks.

    Read the article

  • XNA Windows Phone 7 Sprite movement

    - by Darren Gaughan
    I'm working on a Windows Phone game and I'm having difficulty with the sprite movement. What I want to do is make the sprite gradually move to the position that is touched on screen, when there is only one quick touch and release. At the moment, all I can do is either make the sprite jump instantly to the touch location, or move along towards the touch location only while the touch is held down. Code for jumping to the touch location:

        TouchCollection touchCollection = TouchPanel.GetState();
        foreach (TouchLocation tl in touchCollection)
        {
            if ((tl.State == TouchLocationState.Pressed) || (tl.State == TouchLocationState.Moved))
            {
                Vector2 newPos = new Vector2(tl.Position.X, tl.Position.Y);
                if (position != newPos)
                {
                    while (position.X < newPos.X)
                    {
                        position.X += (float)theGameTime.ElapsedGameTime.Milliseconds / 10.0f * spriteDirectionRight;
                    }
                }
            }
        }

    Code to gradually move along while the touch is held:

        TouchCollection touchCollection = TouchPanel.GetState();
        foreach (TouchLocation tl in touchCollection)
        {
            if ((tl.State == TouchLocationState.Pressed) || (tl.State == TouchLocationState.Moved))
            {
                Vector2 newPos = new Vector2(tl.Position.X, tl.Position.Y);
                if (position != newPos)
                {
                    position.X += (float)theGameTime.ElapsedGameTime.Milliseconds / 10.0f * spriteDirectionRight;
                }
            }
        }

    These are in the Update() method of the Sprite class.
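    A common way to get the "tap once, glide there" behaviour is to store the tapped point as a target and then advance a bounded step toward it on every update, scaled by elapsed time. The sketch below illustrates the idea in Swift rather than the project's C#/XNA; the type name and speed value are made up for illustration. In XNA terms, handleTap would run on TouchLocationState.Released and update inside the sprite's Update(gameTime).

        import CoreGraphics

        // Illustrative sketch: keep a target, advance toward it a little each frame.
        struct MovingSprite {
            var position: CGPoint
            var target: CGPoint?
            let speed: CGFloat = 300          // points per second (illustrative value)

            mutating func handleTap(at point: CGPoint) {
                target = point                // remember where to go; don't move yet
            }

            mutating func update(elapsedSeconds: CGFloat) {
                guard let target = target else { return }
                let dx = target.x - position.x
                let dy = target.y - position.y
                let distance = (dx * dx + dy * dy).squareRoot()
                let step = speed * elapsedSeconds
                if distance <= step {
                    position = target         // close enough: snap to the target and stop
                    self.target = nil
                } else {
                    // Move a frame-sized step along the direction toward the target.
                    position.x += dx / distance * step
                    position.y += dy / distance * step
                }
            }
        }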

    Read the article

  • Can iPad/iPhone Touch Points be Wrong Due to Calibration?

    - by Kristopher Johnson
    I have an iPad application that uses the whole screen (that is, UIStatusBarHidden is set true in the Info.plist file). The main window's frame is set to (0, 0, 768, 1024), as is the main view in that frame. The main view has multitouch enabled. The view has code to handle touches:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                CGPoint location = [touch locationInView:nil];
                NSLog(@"touchesMoved at location %@", NSStringFromCGPoint(location));
            }
        }

    When I run the app in the simulator, it works pretty much as expected. As I move the mouse from one edge of the screen to the other, reported X values go from 0 to 767. Reported Y values go from 20 to 1023, but it is a known issue that the simulator doesn't report touches in the top 20 pixels of the screen, even when there is no status bar. Here's what's weird: When I run the app on an actual iPad, the X values go from 0 to 767 as expected, but reported Y values go from -6 to 1017. The fact that it seems to work properly on the simulator leads me to suspect that real devices' touchscreens are not perfectly calibrated, and mine is simply reporting values six pixels too low. Can anyone verify that this is the case? Otherwise, is there anything else that could account for the Y values being six pixels off from what I expect? (In a few days, I should have a second iPad, so I can test this with another device and compare the results.)
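    One inexpensive check before blaming the panel itself, sketched here in Swift although the question's code is Objective-C: log the same touch in both the window's coordinate space and the view's own space, which helps rule out a stray frame offset or transform on the view.

        import UIKit

        // Diagnostic sketch: compare window-space and view-space coordinates for each touch.
        final class CalibrationProbeView: UIView {
            override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
                for touch in touches {
                    let inWindow = touch.location(in: nil)   // window coordinates
                    let inView = touch.location(in: self)    // this view's coordinates
                    print("window: \(inWindow)  view: \(inView)")
                }
            }
        }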

    Read the article

  • Understanding MotionEvent to implement a virtual DPad and Buttons on Android (Multitouch)

    - by Fabio Gomes
    I once implemented a DPad in XNA and now I'm trying to port it to Android, but I still don't get how the touch events work in Android; the more I read, the more confused I get. Here is the code I wrote so far; it works, but I guess it will only handle one touch point.

        public boolean onTouchEvent(MotionEvent event) {
            if (event.getPointerCount() == 0)
                return true;

            int touchX = -1;
            int touchY = -1;
            pressedDirection = DPadDirection.None;

            int actionCode = event.getAction() & MotionEvent.ACTION_MASK;
            if (actionCode == MotionEvent.ACTION_UP) {
                if (event.getPointerId(0) == idDPad) {
                    pressedDirection = DPadDirection.None;
                    idDPad = -1;
                }
            } else if (actionCode == MotionEvent.ACTION_DOWN || actionCode == MotionEvent.ACTION_MOVE) {
                touchX = (int) event.getX();
                touchY = (int) event.getY();

                if (rightRect.contains(touchX, touchY))
                    pressedDirection = DPadDirection.Right;
                else if (leftRect.contains(touchX, touchY))
                    pressedDirection = DPadDirection.Left;
                else if (upRect.contains(touchX, touchY))
                    pressedDirection = DPadDirection.Up;
                else if (downRect.contains(touchX, touchY))
                    pressedDirection = DPadDirection.Down;

                if (pressedDirection != DPadDirection.None)
                    idDPad = event.getPointerId(0);
            }

            return true;
        }

    The logic is: test whether there is a "DOWN" or "MOVED" event; then, if one of these events collides with one of the 4 rectangles of my DPad, set the pressedDirection variable to that side of the pad. I then read the DPad's actual pressed direction in my Update() method in another class. The thing I'm not sure about is how to keep track of the touch points. I store the ID of the touch point which generated the stored direction (the last one), so when that ID is released I set the direction to None, but I'm really confused about how to handle this on Android. Here is the code I had in XNA:

        public override void Update(GameTime gameTime)
        {
            PressedDirection = DpadDirection.None;
            foreach (TouchLocation _touchLocation in TouchPanel.GetState())
            {
                if (_touchLocation.State == TouchLocationState.Released)
                {
                    if (_touchLocation.Id == _idDPad)
                    {
                        PressedDirection = DpadDirection.None;
                        _idDPad = -1;
                    }
                }
                else if (_touchLocation.State == TouchLocationState.Pressed || _touchLocation.State == TouchLocationState.Moved)
                {
                    _intersectRect.X = (int)_touchLocation.Position.X;
                    _intersectRect.Y = (int)_touchLocation.Position.Y;
                    _intersectRect.Width = 1;
                    _intersectRect.Height = 1;

                    if (_intersectRect.Intersects(_rightRect))
                        PressedDirection = DpadDirection.Right;
                    else if (_intersectRect.Intersects(_leftRect))
                        PressedDirection = DpadDirection.Left;
                    else if (_intersectRect.Intersects(_upRect))
                        PressedDirection = DpadDirection.Up;
                    else if (_intersectRect.Intersects(_downRect))
                        PressedDirection = DpadDirection.Down;

                    if (PressedDirection != DpadDirection.None)
                    {
                        _idDPad = _touchLocation.Id;
                        continue;
                    }
                }
            }
            base.Update(gameTime);
        }

    So, first of all: am I doing this correctly? If not, why not? I don't want my DPad to handle multiple directions, but I still haven't understood how to handle the multiple touch points: is the event called for every touch point, or do all touch points come in a single call? I still don't get it.

    Read the article

  • Question for Vim search and peck typists

    - by mike
    I'm trying to write a Vim tutorial and I'd like to start by dismissing a few misconceptions, as well as giving some recommendations. I don't know if I should dismiss touch-typing as a misconception, or include it as a recommended prerequisite. At the time I learned the editor, I had already been touch typing for a couple of years, so I have absolutely no idea what the experience of a two-fingered typist in Vim would be. Are you a two-fingered Vim typist? What has your experience been like? EDIT: I'm not sure if my question was clear enough. Maybe it's my fault, I don't know. I get mixed replies and other questions (why do you write this? what does one have to do with the other?), instead of empirical info (I don't touch type and it's been (fine|hell)). Some programmers touch-type; others search and peck. In the middle there's Vim, which requires a certain affinity with keys to do various operations. I am a touch typist and I have no clue what my experience would have been like with the editor if I wasn't. I can't honestly picture myself pecking some of these combos. But like I said, I don't know what it is like. Before telling someone to start using Vim, I'd like to know if I should dismiss touch-typing as a misconceived requirement. So, I'll rephrase the question: have you felt that not being a touch typist has impeded your experience with Vim?

    Read the article

  • iPhone multitouch - Some touches dispatch touchesBegan: but not touchesMoved:

    - by zkarcher
    I'm developing a multitouch application. One touch is expected to move, and I need to track its position. For all other touches, I need to track their beginnings and endings, but their movement is less critical. Sometimes, when 3 or more touches are active, my UIView does not receive touchesMoved: events for the moving touch. This problem is intermittent, but can always be reproduced after a few attempts: touch the screen with 2 fingers, then touch the screen with another finger and move this finger around. The moving finger always dispatches touchesBegan: and touchesEnded:, but sometimes does not dispatch any touchesMoved: events. Whenever the moving touch does not dispatch touchesMoved: events, I can force it to dispatch touchesMoved: if I move one of the other touches. This seems to "force" every touch to recheck its position, and I successfully receive a touchesMoved: event. However, this is clumsy. This bug is reproducible on both the iPhone 2G and 3GS models. My question is: how do I ensure that my moving touch dispatches touchesMoved: events? Does anyone have any experience with this issue? I've spent a few fruitless days searching the web for answers. I found a post describing how to sync touch events with the VBL (http://www.71squared.com/2009/04/maingameloop-changes/), but this has not solved the problem. I really don't know how to proceed. Any help is appreciated!
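    One workaround worth trying, shown here as a Swift sketch rather than the Objective-C the question implies, and not a guaranteed fix for the missing events described above: instead of relying on every finger producing its own touchesMoved: call, re-read the tracked touch's current location from event.allTouches whenever any touch moves.

        import UIKit

        // Sketch: track one touch weakly and refresh its position on every move event.
        final class TrackingView: UIView {
            private weak var trackedTouch: UITouch?   // the finger whose motion matters

            override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
                // Adopt the first touch that arrives while nothing is being tracked.
                // (isMultipleTouchEnabled is assumed to be set on this view.)
                if trackedTouch == nil { trackedTouch = touches.first }
            }

            override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
                // event.allTouches contains every active touch, not just the ones that
                // moved, so the tracked finger's latest position is available here even
                // when another finger triggered the callback.
                guard let tracked = trackedTouch,
                      let all = event?.allTouches, all.contains(tracked) else { return }
                print("tracked touch at \(tracked.location(in: self))")
            }

            override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
                if let tracked = trackedTouch, touches.contains(tracked) { trackedTouch = nil }
            }

            override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
                if let tracked = trackedTouch, touches.contains(tracked) { trackedTouch = nil }
            }
        }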

    Read the article

  • Trouble moving a UIView

    - by Joshua
    I have been trying to move a UIView by following a user's touch. I have almost got it to work except for one thing: the UIView keeps flicking between two places. Here's the code I have been using:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"touchDown");
            UITouch *touch = [touches anyObject];
            firstTouch = [touch locationInView:self.view];
            lastTouch = [touch locationInView:self.view];
            [self.view setNeedsDisplay];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            InSightViewController *contentView = [[InSightViewController alloc] initWithNibName:@"SubView" bundle:[NSBundle mainBundle]];
            [contentView loadView];

            UITouch *touch = [touches anyObject];
            currentTouch = [touch locationInView:self.view];

            if (CGRectContainsPoint(contentView.view.bounds, firstTouch)) {
                NSLog(@"touch in subView/contentView");
                sub.frame = CGRectMake(currentTouch.x - 50.0, currentTouch.y, 130.0, 21.0);
            }

            NSLog(@"touch moved");
            lastTouch = currentTouch;
            [self.view setNeedsDisplay];
        }

    And here's what's been happening: http://cl.ly/Sjx
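    For comparison, the usual drag pattern (a minimal Swift sketch; the question's code is Objective-C and the class name here is illustrative) records the offset between the touch and the view's origin once in touchesBegan, then repositions with that fixed offset in touchesMoved, rather than snapping the frame to the raw touch point.

        import UIKit

        // Sketch of offset-based dragging: the view follows the finger without jumping.
        final class DraggableView: UIView {
            private var dragOffset = CGPoint.zero

            override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
                guard let touch = touches.first else { return }
                let point = touch.location(in: superview)
                // Remember where inside the view the finger landed.
                dragOffset = CGPoint(x: point.x - frame.origin.x, y: point.y - frame.origin.y)
            }

            override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
                guard let touch = touches.first else { return }
                let point = touch.location(in: superview)
                // Reapply the stored offset instead of snapping the origin to the touch.
                frame.origin = CGPoint(x: point.x - dragOffset.x, y: point.y - dragOffset.y)
            }
        }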

    Read the article

  • Where can I find an iPhone OpenGL ES Example that responds to touch?

    - by Jamey McElveen
    I would like to find an iPhone OpenGL ES example that responds to touch. Ideally it would meet these requirements: it displays a 3D object, such as a cube, in the center of the screen; maps a texture to the cube's surfaces; moves the camera around the cube as you drag your finger; zooms the camera in and out on the cube when pinching; and, optionally, has a background behind the cube that wraps around behind the camera (for example, this could create the effect of the cube being in space). Has anyone seen one or more examples that do these, or at least render the cube with the texture?
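    Not a full OpenGL ES sample, but a hedged Swift sketch of the input side only (the class, angles, and tuning constants are illustrative): a pan gesture drives two orbit angles and a pinch drives the camera distance, values a render loop (not shown) would feed into its view and projection matrices.

        import UIKit

        // Sketch: translate pan and pinch gestures into orbit-camera parameters.
        final class OrbitControls: NSObject {
            private(set) var yaw: CGFloat = 0        // horizontal orbit angle, radians
            private(set) var pitch: CGFloat = 0      // vertical orbit angle, radians
            private(set) var distance: CGFloat = 5   // camera distance from the cube

            func attach(to view: UIView) {
                view.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(pan(_:))))
                view.addGestureRecognizer(UIPinchGestureRecognizer(target: self, action: #selector(pinch(_:))))
            }

            @objc private func pan(_ gesture: UIPanGestureRecognizer) {
                let delta = gesture.translation(in: gesture.view)
                yaw += delta.x * 0.01                // pixels-to-radians scale (tuning value)
                pitch += delta.y * 0.01
                gesture.setTranslation(.zero, in: gesture.view)
            }

            @objc private func pinch(_ gesture: UIPinchGestureRecognizer) {
                distance /= gesture.scale            // pinch out to zoom in
                gesture.scale = 1
            }
        }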

    Read the article

  • Can an iPhone/iPod Touch application open a port for remote communication without jailbreaking?

    - by Derrick
    I'm researching remote control testing for an app that'll be installed on the new iPod Touch, and I can't tell for certain from everything I've read whether an installed app can open any ports for remote test instructions (that's a mouthful :). We created something like this for Android using adb port forwarding and telnet, and it worked really well. Is there any chance something similar could be done on an iPhone or iPod without jailbreaking?
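    For what it's worth, a sandboxed app can open a listening socket without jailbreaking. Below is a minimal sketch using the modern Network framework, which postdates this question; the port number and reply text are illustrative, and on recent iOS versions the user must also grant local-network access when prompted.

        import Network

        // Sketch of a tiny TCP listener that receives one command and acknowledges it.
        final class CommandListener {
            private var listener: NWListener?

            func start() throws {
                let listener = try NWListener(using: .tcp, on: 9000)
                listener.newConnectionHandler = { connection in
                    connection.start(queue: .main)
                    // Read one chunk of test instructions and reply.
                    connection.receive(minimumIncompleteLength: 1, maximumLength: 1024) { data, _, _, _ in
                        if let data = data, let command = String(data: data, encoding: .utf8) {
                            print("received command: \(command)")
                        }
                        connection.send(content: Data("OK\n".utf8), completion: .contentProcessed { _ in
                            connection.cancel()
                        })
                    }
                }
                listener.start(queue: .main)
                self.listener = listener
            }
        }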

    Read the article
