Search Results

Search found 11662 results on 467 pages for 'android gesture'.

  • Mobile threats grew 614% between 2012 and 2013; businesses are increasingly exposed

    Mobile threats grew 614% between 2012 and 2013, and businesses are increasingly exposed. The third annual mobile threats report from Juniper Networks, a US company specializing in telecommunications equipment, documents exponential growth in mobile malware: a 614% jump between March 2012 and March 2013 according to its MTC (Mobile Threat Center), from 38,689 to 276,259 known malicious programs. Unsurprisingly, hackers concentrate their attacks on Android, which is the most ...

    Read the article

  • Application isn't showing up in launcher

    - by CaldwellYSR
    So I downloaded Eclipse earlier today so I can start learning the Android SDK. I typically write Java programs with vim from the command line, but I was having trouble making the SDK work from the command line. Anyway, Eclipse was being weird, so I uninstalled it with Synaptic, then reinstalled. Now it seems to be working fine, except I can't get Eclipse to show up in the launcher... Is there anything I may be missing that would make it show up in the launcher?
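
    A common Ubuntu fix is to give Eclipse a .desktop launcher entry of your own. Below is a minimal sketch, assuming Eclipse was unpacked to /opt/eclipse (both paths are assumptions; point them at your actual install), saved as ~/.local/share/applications/eclipse.desktop:

        [Desktop Entry]
        Version=1.0
        Type=Application
        Name=Eclipse
        Comment=Eclipse IDE
        # Paths below are assumptions; point them at your Eclipse install
        Exec=/opt/eclipse/eclipse
        Icon=/opt/eclipse/icon.xpm
        Terminal=false
        Categories=Development;IDE;

    After saving, it should appear in the launcher search; if not, log out and back in so the menu cache refreshes.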

    Read the article

  • Oracle Develop Online

    - by OTN-J Master
    [The Japanese body of this announcement was lost to a character-encoding failure. The recoverable details: Oracle Develop was held in April alongside Oracle OpenWorld Tokyo 2012, with three tracks (Database, Fusion Middleware, and Server & Solaris), and Oracle Develop Online offers session content, including material from Oracle OpenWorld 2012 SF, viewable on iPhone and Android.]

    Read the article

  • How to find (in javascript) the current "scroll" offset in mobile safari / iphone

    - by mintywalker
    I'd like to know the x/y offset of how far the user has scrolled within the viewport in mobile Safari on the iPhone. Put another way: if I reloaded the current page through JavaScript, I'd like to find the values I'd need to pass into window.scrollTo(...) to reposition the document/viewport as it is now. window.pageXOffset always reports 0, and jQuery's $('body').scrollTop() always reports 0. Touch events have a pageX, but that won't account for the scrolling that happens after you release your finger when the gesture is a "flick" up/down; it gives me a point when the finger leaves the screen, which doesn't always match where the page ends up once it has finished scrolling. Any pointers?
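
    One hedged workaround sketch, in plain JavaScript: on mobile Safari the standard offset properties tend to settle only once momentum scrolling finishes, so poll them after touchend until they stop changing, then hand the final values to window.scrollTo(x, y):

        // Poll until the offsets stop changing, then report them.
        function captureScrollOffset(callback) {
          var lastX = -1, lastY = -1;
          var timer = setInterval(function () {
            var x = window.pageXOffset || document.body.scrollLeft || 0;
            var y = window.pageYOffset || document.body.scrollTop || 0;
            if (x === lastX && y === lastY) { // momentum scroll has settled
              clearInterval(timer);
              callback(x, y);
            }
            lastX = x;
            lastY = y;
          }, 100);
        }

        window.addEventListener('touchend', function () {
          captureScrollOffset(function (x, y) {
            // these are the values to feed back into window.scrollTo(x, y)
            console.log('scroll offset:', x, y);
          });
        }, false);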

    Read the article

  • does the accelerometer work for the iphone/ipad simulator?

    - by Mark
    From what I can tell, my app should be firing accelerometer events while I'm using the iPad simulator in Xcode, but it's not. I've googled around and it seems the accelerometer is not implemented in the simulator; is this correct? If so, why on earth would they have a "Hardware > Shake Gesture" menu option? My code is as follows. The .h file:

        @interface MyViewController : UIViewController <UIPickerViewDataSource, UIPickerViewDelegate, UIAccelerometerDelegate> {
            UIAccelerometer *accelerometer;
            // ...other stuff
        }
        @property (nonatomic, retain) UIAccelerometer *accelerometer;
        @end

    Then the .m file:

        @implementation MyViewController
        @synthesize accelerometer;

        - (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
            NSLog(@"X: %f", acceleration.x);
            NSLog(@"Y: %f", acceleration.y);
            NSLog(@"Z: %f", acceleration.z);
        }

        - (void)viewDidLoad {
            [super viewDidLoad];
            self.accelerometer = [UIAccelerometer sharedAccelerometer];
            self.accelerometer.updateInterval = .1;
            self.accelerometer.delegate = self;
        }
        @end

    Does this look right?
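
    For what it's worth, the simulator's Hardware > Shake Gesture menu item drives UIKit's motion-event path, not UIAccelerometerDelegate, which would explain the menu option existing while the delegate stays silent. A sketch of catching it in the same view controller (this does fire in the simulator):

        - (BOOL)canBecomeFirstResponder {
            return YES; // must be first responder to receive motion events
        }

        - (void)viewDidAppear:(BOOL)animated {
            [super viewDidAppear:animated];
            [self becomeFirstResponder];
        }

        - (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
            if (motion == UIEventSubtypeMotionShake) {
                NSLog(@"Shake gesture received");
            }
        }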

    Read the article

  • Differentiating Between UITouch Objects On The iPhone

    - by Jasarien
    Hey guys, I'm trying to differentiate between two (or more) UITouch objects on the iPhone. Specifically, I'd like to know the order in which the touches occurred. For instance, in my -touchesBegan:withEvent: method I get an NSSet of UITouch objects. I can find out how many touches there are, but which object represents which finger? I notice the timestamp property on UITouch; is this what I'm looking for? I see how that would be useful for obtaining the last or first touch, providing the touches don't mutate... and therein lies my problem. I can use the timestamp to single out the latest touch, but then the touch that occurred first moves, and it becomes the latest touch... At the end of this exercise, I'd like to be able to implement the "pinch" gesture to zoom in or out, etc. Any help would be greatly appreciated, thanks.
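
    For the pinch at the end, one sketch that sidesteps touch ordering entirely: while two touches are down, zoom from the change in distance between them. A view-controller context is assumed, with the starting distance captured in -touchesBegan:withEvent::

        - (CGFloat)distanceBetweenTouches:(NSSet *)touches {
            NSArray *pair = [touches allObjects];
            CGPoint a = [[pair objectAtIndex:0] locationInView:self.view];
            CGPoint b = [[pair objectAtIndex:1] locationInView:self.view];
            return sqrtf((b.x - a.x) * (b.x - a.x) + (b.y - a.y) * (b.y - a.y));
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            NSSet *allTouches = [event allTouches];
            if ([allTouches count] == 2) {
                CGFloat current = [self distanceBetweenTouches:allTouches];
                // compare against the distance saved in -touchesBegan:withEvent:
                // growing distance means zoom in, shrinking means zoom out
                NSLog(@"pinch distance: %f", current);
            }
        }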

    Read the article

  • Using SketchFlow to prototype MS Surface Applications

    - by Isak Savo
    Hi, we're doing mockups/prototyping on an MS Surface device, and I wonder if anyone has succeeded in using SketchFlow for this. The problem I see is that the code generated by the tool uses normal WPF controls (Button, etc.) instead of the contact-aware Surface counterparts (SurfaceButton), which means they won't work nicely on the Surface unless you also use a mouse. Additionally, it would be nice to hook into other gesture events to trigger SketchFlow transitions, like the pinch gesture to switch pages. Has anyone had any success prototyping for Surface using SketchFlow? If so, how did you do it?

    Read the article

  • How to do iPad Photos app pinch to expand

    - by Macatomy
    I don't think this has been asked before on this site, but I might be wrong. Does anyone know the basics of how to get the effect from the iPad Photos app? Pinching a stack of photos lets you "peek" at the photos in that stack, which expands based on the distance between your two fingers in the pinch; fully completing the outward pinch gesture then opens the stack's photos in a new view. See this video to get what I mean. I know of at least one third-party app that uses the same method as the iPad Photos app, so I know it's possible to do. I'm guessing I would do something with UIPinchGestureRecognizer, but I'm not sure exactly how to proceed.
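
    UIPinchGestureRecognizer is a reasonable starting point; here is a sketch of the general shape, where stackView and its -setSpread: method are assumptions standing in for whatever draws the fanned-out stack, and the 2.0 threshold is arbitrary:

        // when setting up the stack view:
        UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc]
            initWithTarget:self action:@selector(handlePinch:)];
        [stackView addGestureRecognizer:pinch];
        [pinch release];

        - (void)handlePinch:(UIPinchGestureRecognizer *)pinch {
            // scale starts at 1.0 for every new pinch
            if (pinch.state == UIGestureRecognizerStateChanged) {
                CGFloat spread = MIN(MAX(pinch.scale - 1.0, 0.0), 1.0);
                [stackView setSpread:spread]; // hypothetical: fans photos out proportionally
            } else if (pinch.state == UIGestureRecognizerStateEnded) {
                if (pinch.scale > 2.0) {
                    // pinch opened wide enough: push the full album view
                } else {
                    // otherwise animate the photos back into the stack
                }
            }
        }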

    Read the article

  • Tracking graphics tablet input in C#

    - by Martin
    I'm writing an XNA application in C#, and I want to be able to control the application with a graphics tablet. All I need is to track the position and pressure of the pen on the tablet; built-in gesture recognition would be nice, but I'm willing to build that myself if needed. My first attempt was the vbtablet library. However, when I sent this to a friend it failed to work; the underlying technology seems quite old and not supported by some tablets. My second attempt is to play with the Microsoft.Ink system. This looks promising, but I know very little about the subject and I'm struggling to make it work properly. For example, I can't find how to read the raw tablet input and disable mouse-pointer mode on the tablet when using an InkOverlay. What's the best system to use, and what good documentation is there for a beginner in this field?
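
    On the Microsoft.Ink side, here is a heavily hedged C# sketch of reading raw packets (position plus pressure) from an InkOverlay; the API details are from memory of the Tablet PC SDK, so verify the names and packet layout against its documentation, and note that tablets without pressure support may return fewer values per packet:

        using System;
        using Microsoft.Ink;

        class TabletReader
        {
            private InkOverlay overlay;

            public void Attach(IntPtr windowHandle)
            {
                overlay = new InkOverlay(windowHandle);
                // ask the digitizer for pressure in addition to x/y
                overlay.DesiredPacketDescription = new Guid[] {
                    PacketProperty.X, PacketProperty.Y, PacketProperty.NormalPressure
                };
                overlay.NewPackets += OnNewPackets;
                overlay.Enabled = true;
            }

            private void OnNewPackets(object sender, InkCollectorNewPacketsEventArgs e)
            {
                if (e.PacketCount == 0) return;
                // PacketData is flat: one group of values per packet, in the
                // order requested in DesiredPacketDescription
                int stride = e.PacketData.Length / e.PacketCount;
                if (stride < 3) return; // pressure not reported by this device
                for (int i = 0; i + 2 < e.PacketData.Length; i += stride)
                {
                    Console.WriteLine("x={0} y={1} pressure={2}",
                        e.PacketData[i], e.PacketData[i + 1], e.PacketData[i + 2]);
                }
            }
        }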

    Read the article

  • How to interpret trackpad pinch gestures to zoom IKImageBrowserView

    - by Fraser Speirs
    I have an IKImageBrowserView that I want to be able to pinch-zoom using a multi-touch trackpad on a recent Mac laptop. The Cocoa Event Handling Guide, in the section Handling Gesture Events, says: "The magnification accessor method returns a floating-point (CGFloat) value representing a factor of magnification," and goes on to show code that adjusts the size of the view by multiplying height and width by magnification + 1.0. This doesn't seem to be the right approach for zooming IKImageBrowserView, whose zoomValue property is clamped between 0.0 and 1.0. So, does anyone know how to interpret the event in -[NSResponder magnifyWithEvent:] to zoom an IKImageBrowserView?
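
    A minimal sketch, assuming imageBrowser is the IKImageBrowserView outlet: treat the magnification as a relative delta on zoomValue and clamp it into the documented 0.0 to 1.0 range, rather than scaling the view's frame:

        - (void)magnifyWithEvent:(NSEvent *)event {
            CGFloat zoom = [imageBrowser zoomValue] + [event magnification];
            zoom = MAX(0.0, MIN(1.0, zoom)); // keep within zoomValue's range
            [imageBrowser setZoomValue:zoom];
        }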

    Read the article

  • Setting properties of auto-generated listboxitem

    - by DerKlaus
    I am trying to set the InputBindings of the auto-generated ListBoxItems of a data-bound ListBox. The code below does not work; the compiler complains that "The Property Setter 'InputBindings' cannot be set because it does not have an accessible set accessor." What is the correct syntax to set the InputBindings?

        <ListBox.ItemContainerStyle>
            <Style TargetType="{x:Type ListBoxItem}">
                <Setter Property="ListBoxItem.InputBindings">
                    <Setter.Value>
                        <MouseBinding Command="{Binding OpenCommand}" Gesture="LeftDoubleClick"/>
                    </Setter.Value>
                </Setter>
            </Style>
        </ListBox.ItemContainerStyle>

    PS: Posting does not work with Opera 10.51
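
    InputBindings is a read-only collection property, so it really can't appear in a Setter. One common workaround is a small attached property that copies the binding onto each generated container; the class and property names below are mine, a sketch rather than a library API:

        using System.Windows;
        using System.Windows.Input;

        public static class Attach
        {
            public static readonly DependencyProperty InputBindingProperty =
                DependencyProperty.RegisterAttached(
                    "InputBinding", typeof(InputBinding), typeof(Attach),
                    new PropertyMetadata(null, OnInputBindingChanged));

            public static void SetInputBinding(UIElement element, InputBinding value)
            {
                element.SetValue(InputBindingProperty, value);
            }

            public static InputBinding GetInputBinding(UIElement element)
            {
                return (InputBinding)element.GetValue(InputBindingProperty);
            }

            private static void OnInputBindingChanged(
                DependencyObject d, DependencyPropertyChangedEventArgs e)
            {
                UIElement element = d as UIElement;
                InputBinding binding = e.NewValue as InputBinding;
                if (element != null && binding != null)
                    element.InputBindings.Add(binding); // the collection itself is writable
            }
        }

    The style setter then targets local:Attach.InputBinding with the MouseBinding as its value.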

    Read the article

  • CAAnimation rotation did not give last stage?

    - by Mikhail Naimy
    Hi, I am using the following code to rotate a UIView with a swipe gesture. It works fine, but after the rotation the view snaps back to the starting angle (the fromValue). Can anyone help me make the view stay at its final angle, so that if I rotate again it continues from there?

        rotationAnimation = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
        rotationAnimation.fromValue = [NSNumber numberWithFloat:0.0 * M_PI];
        rotationAnimation.toValue = [NSNumber numberWithFloat:0.5 * M_PI];
        // rotationAnimation.toValue = [NSNumber numberWithFloat:2.5 * 3.15];
        rotationAnimation.duration = 1.5;
        // rotationAnimation.cumulative = YES;
        rotationAnimation.removedOnCompletion = NO;
        rotationAnimation.repeatCount = 0.0;
        rotationAnimation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
        [self.view.layer addAnimation:rotationAnimation forKey:@"rotationAnimate"];
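
    The snap-back happens because a CABasicAnimation only animates the presentation layer; the layer's model value never changes. Two common fixes, sketched against the code above: keep the final pose with a fill mode, or (usually cleaner) set the layer's real transform so later rotations continue from where this one ended:

        // option 1: freeze the animation at its end state
        rotationAnimation.fillMode = kCAFillModeForwards;
        rotationAnimation.removedOnCompletion = NO;

        // option 2: also update the model layer so the end state is real
        self.view.layer.transform =
            CATransform3DRotate(self.view.layer.transform, 0.5 * M_PI, 0.0, 0.0, 1.0);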

    Read the article

  • How to use UISwipeGestureRecognizer on UIButton?

    - by Ashutosh
    I have a UIButton which I want to work as a joystick, so I am trying to add some gesture recognizers to that button. I have this in my code right now:

        UISwipeGestureRecognizer *recognizer;
        recognizer = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeFrom:)];
        [recognizer setDirection:UISwipeGestureRecognizerDirectionUp];
        [self.gestureRecieverButton addGestureRecognizer:recognizer];
        [recognizer release];

        recognizer = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeFrom:)];
        [recognizer setDirection:UISwipeGestureRecognizerDirectionDown];
        [self.gestureRecieverButton addGestureRecognizer:recognizer];
        [recognizer release];

        - (void)handleSwipeFrom:(UISwipeGestureRecognizer *)recognizer {
            NSLog(@"Swipe received. %@", recognizer);
        }

    This is the error I am getting now:

        -[CUETutorialSixteenClusterRootController handleSwipeFrom:]: unrecognized selector sent to instance 0x79b71b0
        2012-03-28 13:25:55.724 CUETrainer[1788:11f03] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[CUETutorialSixteenClusterRootController handleSwipeFrom:]: unrecognized selector sent to instance 0x79b71b0'

    It's not actually doing anything. Please help!
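
    That exception says the recognizer's target (self, here an instance of CUETutorialSixteenClusterRootController) has no handleSwipeFrom: method at runtime. The handler must be compiled into whatever class is passed to initWithTarget:, so as a sketch, make sure the method body lives in that class's @implementation rather than in some other controller:

        // in CUETutorialSixteenClusterRootController.m, the same class whose
        // instance is the recognizer's target:
        - (void)handleSwipeFrom:(UISwipeGestureRecognizer *)recognizer {
            NSLog(@"Swipe received: %@", recognizer);
        }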

    Read the article

  • Displaying menus on image

    - by Snehal
    Hi all, here is what I am doing: I have a view controller, screen1, to which I have added a long-press gesture. On long press, the user gets a draw menu. On selecting that menu, I create another UIViewController whose view contains a rectangle image, and add that view as a subview of screen1. Now, once the image is displayed on screen1, I want to display the UIMenuItems Done and Cancel above the drawn image, so that tapping Done saves the image. I tried making the new view controller first responder after adding the image to screen1, but the menus are not visible. Can anyone please help?
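
    A sketch of the usual UIMenuController recipe, assuming imageView is the image added to screen1 and doneTapped:/cancelTapped: are hypothetical handlers of yours: the responder showing the menu must return YES from canBecomeFirstResponder, actually become first responder, and be given a target rect:

        - (BOOL)canBecomeFirstResponder {
            return YES;
        }

        - (void)showMenuOverImage:(UIView *)imageView {
            [self becomeFirstResponder];
            UIMenuController *menu = [UIMenuController sharedMenuController];
            menu.menuItems = [NSArray arrayWithObjects:
                [[[UIMenuItem alloc] initWithTitle:@"Done"
                                            action:@selector(doneTapped:)] autorelease],
                [[[UIMenuItem alloc] initWithTitle:@"Cancel"
                                            action:@selector(cancelTapped:)] autorelease],
                nil];
            [menu setTargetRect:imageView.frame inView:self.view];
            [menu setMenuVisible:YES animated:YES];
        }

        - (BOOL)canPerformAction:(SEL)action withSender:(id)sender {
            return action == @selector(doneTapped:) || action == @selector(cancelTapped:);
        }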

    Read the article

  • UIView animation does not animate at first try?

    - by Bacalso Vincent
    Considering that my _palette's frame is like this:

        _palette.frame = CGRectMake(0, 480, 320, 200);

    I have this code to slide a UIView up/down:

        if (![_pallete superview]) {
            [self.view addSubview:_pallete];
            [self.view insertSubview:_tempViewPaletteListener belowSubview:_pallete];
            [UIView animateWithDuration:0.3 animations:^{
                _pallete.top -= kPaletteHeight;
            } completion:^(BOOL isFinished) {
            }];
        } else {
            [UIView animateWithDuration:0.3 animations:^{
                _pallete.top += kPaletteHeight;
            } completion:^(BOOL isFinished) {
                [_tempViewPaletteListener removeFromSuperview];
                [_pallete removeFromSuperview];
            }];
        }

    (_tempViewPaletteListener is just a view with a tap gesture used to dismiss the palette.) The problem is that on the first run, the _palette view just appears in place immediately instead of sliding up as expected. It works fine after the first try.
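
    One possible explanation, offered as a guess: the subview is added and animated in the same run-loop pass, so the first animation starts from a frame that was never committed on screen. Forcing a layout pass after adding the subview, before the animation block, is a common fix. A sketch against the code above:

        [self.view addSubview:_pallete];
        [self.view insertSubview:_tempViewPaletteListener belowSubview:_pallete];
        [self.view layoutIfNeeded]; // commit the off-screen starting frame first
        [UIView animateWithDuration:0.3 animations:^{
            _pallete.top -= kPaletteHeight;
        }];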

    Read the article

  • Can I enable PreviewClick using InputBindings in WPF?

    - by No hay Problema
    I want to detect when a user clicks an item in a ListView, without using events, since I do command binding and don't like all the ceremony of behaviors. I have tried this:

        <ListView x:Name="MainList" Margin="2,8,6,8" Background="Black"
                  ItemsSource="{Binding Path=AssetsVM.Data, Mode=OneWay}" BorderBrush="{x:Null}">
            <ListView.InputBindings>
                <MouseBinding Command="{Binding Path=AssetsVM.SelectActivo}"
                              CommandParameter="{Binding ElementName=MainList, Path=SelectedItem}"
                              MouseAction="LeftClick" />
            </ListView.InputBindings>
        </ListView>

    This works fine if I click on the ListView itself, but it does not work on the items. What I need is either a way to enable "Preview" behavior, or a MouseAction/Gesture that behaves like a preview. Is that possible? Thanks
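
    InputBindings can live on the root element of the ItemTemplate instead, which gives every row its own LeftClick binding. A hedged sketch: the row's DataContext is the item itself, so the command is reached back up on the ListView's DataContext, and CommandParameter="{Binding}" passes the clicked item (the template contents are placeholders):

        <ListView x:Name="MainList" ItemsSource="{Binding Path=AssetsVM.Data, Mode=OneWay}">
            <ListView.ItemTemplate>
                <DataTemplate>
                    <!-- Transparent background so empty space in the row is hit-testable -->
                    <Grid Background="Transparent">
                        <Grid.InputBindings>
                            <MouseBinding MouseAction="LeftClick"
                                          Command="{Binding DataContext.AssetsVM.SelectActivo,
                                                    RelativeSource={RelativeSource FindAncestor,
                                                    AncestorType={x:Type ListView}}}"
                                          CommandParameter="{Binding}" />
                        </Grid.InputBindings>
                        <TextBlock Text="{Binding}" />
                    </Grid>
                </DataTemplate>
            </ListView.ItemTemplate>
        </ListView>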

    Read the article

  • iphone dev - loading table content asynchronously

    - by Brian
    My app has a navigation controller which pushes and pops a series of views. One of the table views loads an .xml file from a URL, which takes 4-5 seconds. If I click the back button on the navigation bar, it only responds after the table's content finishes loading. Is there an easy way to load the content asynchronously, so that the app still responds to my gestures on the navigation bar? P.S. I searched the Internet and people talk about multithreading; I don't know a lot about threads, so please be specific. Thanks in advance =)
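
    A minimal sketch of one common pattern for this: do the slow fetch on a background thread, then hop back to the main thread to reload the table. xmlURL and the parsing step are assumptions standing in for your own loading code; UIKit must only be touched from the main thread:

        - (void)viewDidLoad {
            [super viewDidLoad];
            [self performSelectorInBackground:@selector(loadData) withObject:nil];
        }

        - (void)loadData {
            // background threads need their own autorelease pool
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            NSData *xml = [NSData dataWithContentsOfURL:xmlURL]; // the slow 4-5 s part
            [self performSelectorOnMainThread:@selector(dataLoaded:)
                                   withObject:xml
                                waitUntilDone:NO];
            [pool release];
        }

        - (void)dataLoaded:(NSData *)xml {
            // parse, store into the table's data source, then:
            [self.tableView reloadData];
        }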

    Read the article

  • How do I cancel a text selection after the initial mousedown event?

    - by cwillu-gmail
    I'm trying to implement a pop-up menu based on a click-and-hold, positioned so that a (really) slow click will still trigger the default action, and with the delay set so that a text-selection gesture won't usually trigger the menu. What I can't seem to do is cancel the text selection in a way that doesn't prevent text selection in the first place: returning false from the event handler (or calling preventDefault() on the event) prevents the user from selecting at all, and the obvious $().trigger('mouseup') doesn't do anything with the selection. This is in the general context of a page, not particular to a textarea or other text field, and e.stopPropagation() doesn't cancel text selection either. I'm not looking to prevent text selections, but rather to veto them after some short period of time, if certain conditions are met.
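
    A sketch of the veto-after-a-delay approach: let the selection begin normally on mousedown, and if the hold timer fires (i.e., the menu conditions are met), clear whatever has been selected so far. The 500 ms threshold is an arbitrary placeholder:

        var holdTimer = null;

        document.addEventListener('mousedown', function () {
          holdTimer = setTimeout(function () {
            // veto the in-progress selection
            if (window.getSelection) {
              window.getSelection().removeAllRanges();
            } else if (document.selection) { // older IE
              document.selection.empty();
            }
            // ...then show the pop-up menu here
          }, 500);
        }, false);

        document.addEventListener('mouseup', function () {
          clearTimeout(holdTimer); // released early: normal click or selection
        }, false);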

    Read the article

  • Video/ Speech Development of Applications

    - by idea_
    Why do we continue to type and click away in IDEs when we could theoretically use hand gestures and speech to develop applications? Think about it: developing a class by standing in front of your computer, making some gesture, and yelling "CAR!". This doesn't have to apply strictly to OOP, either. We have sufficient speech and image acquisition/processing and analysis tools available to us, don't we? This seems plausible to me, but I may be overly ambitious. From a conceptual point of view, do you see any problems with the implementation?

    Read the article

  • How can I stream audio signals from various devices/computers to my home server?

    - by Breakthrough
    I currently have a headless home server (running Ubuntu 12.04 Server Edition) running a simple Apache HTTP server. The server is near an audio receiver, which controls a set of indoor and outdoor speakers in my home. Recently, my father purchased a Bluetooth adapter which our various laptops and cellphones can connect to, outputting their music to the speakers. I was hoping to find a solution that works over Wi-Fi, namely because it won't cost anything (I already have a server with an audio card) and it doesn't depend on Bluetooth. Is there any cross-platform (preferably free and open-source) solution I can use to stream audio to my home server, over my home network, from a wide variety of devices (laptops running Windows/Linux, or cellphones running Android/BB/iOS)? I need something that works at least with Windows and Android. Also, just to clarify, I want something that simply allows devices to connect to my server and output an audio signal without any action on the server end (since it's a server hidden away near my receiver). Any subsequent connection attempt should be dropped, so only one device can be in control of the stereo at once.
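
    One free and open-source angle, offered as a sketch: PulseAudio on the server can accept audio over the network via its native TCP protocol (Linux clients connect natively; Windows and Android need a PulseAudio-capable client, so treat this as partial coverage). These are lines for /etc/pulse/default.pa on the server; the subnet is an assumption, match it to your LAN:

        # accept PulseAudio streams from localhost and the home subnet
        load-module module-native-protocol-tcp auth-ip-acl=127.0.0.1;192.168.1.0/24
        # advertise the sink on the LAN via Avahi/Zeroconf
        load-module module-zeroconf-publish

    Note that this does not enforce the one-device-at-a-time rule; that would need extra scripting on the server.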

    Read the article

  • Thin client - cloud machine - to run via iPad, iPhone, most Androids etc

    - by Carl Lindberg
    I'm tired of having a MacBook that breaks down, and of files that constantly need syncing via Dropbox etc. across machines with different OS installations. It sucks. I want a thin client where I can log in from any machine (my iPhone, PC desktop, iPad, etc.) to one running machine. I would like to replace a modern, powerful desktop iMac with a thin client run from my iPad; I will connect the iPad to a keyboard/mouse too, so you get the idea. But I also want to be able to use some Android phones (I guess most Android phones today have good enough performance/resolution to run a thin client). Of course it has to handle sound input/output. Printing can be solved by PDF/emailing etc., so no direct communication with printer or USB ports is necessary. Is there such a service today? It should cost somewhere under $40/month. I will run things like the CPU-heavy Ableton for music production, Xcode for making iOS apps, some games, etc., and on the thin client also run virtual machines: VMs of Ubuntu and Windows.

    Read the article
