Search Results

Search found 723 results on 29 pages for 'touches canceled'.

  • How can I click a button behind a transparent UIView?

    - by Sean Clark Hess
    Let's say we have a view controller with one subview. The subview takes up the center of the screen with 100 px margins on all sides. We then add a bunch of little stuff to click on inside that subview. We are only using the subview to take advantage of the new frame (x=0, y=0 inside the subview is actually 100,100 in the parent view). Then, imagine that we have something behind the subview, like a menu. I want the user to be able to select any of the "little stuff" in the subview, but if there is nothing there, I want touches to pass through it (since the background is clear anyway) to the buttons behind it. How can I do this? It looks like touchesBegan passes through, but the buttons behind don't work.
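
    One common approach (a sketch, not from the post: PassThroughView is a hypothetical subclass used for the transparent container) is to override hitTest:withEvent: so that touches landing on the container itself, rather than on its "little stuff", fall through to whatever sits behind it:

      @interface PassThroughView : UIView
      @end

      @implementation PassThroughView
      - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
          // Let UIKit find the deepest subview under the touch as usual.
          UIView *hit = [super hitTest:point withEvent:event];
          // If only the (clear) container was hit, report no hit at all so the
          // buttons behind the container receive the touch instead.
          return (hit == self) ? nil : hit;
      }
      @end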

  • change fillColor of selected CAShapeLayer

    - by Frank
    I'm trying to change the fillColor of a CAShapeLayer when the layer it's contained in is touched. I'm able to change the background color of the tapped layer like this:

      - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
          CALayer *layer = [(CALayer *)self.view.layer.presentationLayer hitTest:point];
          layer = layer.modelLayer;
          layer.backgroundColor = [UIColor blueColor].CGColor;
      }

    This turns the background of "layer" blue as expected. My problem is: how do I change the color of the CAShapeLayer inside "layer"? Thanks!
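
    Taking the layer variable from the snippet above, one possible next step (a sketch, assuming the CAShapeLayer is either the tapped layer itself or one of its direct sublayers) is to test the class and set fillColor on whichever shape layer is found:

      if ([layer isKindOfClass:[CAShapeLayer class]]) {
          // The tapped layer is the shape layer itself.
          ((CAShapeLayer *)layer).fillColor = [UIColor blueColor].CGColor;
      } else {
          // Otherwise look for a shape layer among its sublayers.
          for (CALayer *sublayer in layer.sublayers) {
              if ([sublayer isKindOfClass:[CAShapeLayer class]]) {
                  ((CAShapeLayer *)sublayer).fillColor = [UIColor blueColor].CGColor;
              }
          }
      }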

  • Phonegap: Slow response with vibrate notification

    - by jjei
    I created a simple Android application with jQuery and PhoneGap. When testing the app on a phone, I noticed that the vibration effect, which I use to indicate that the user touched a button, comes after a delay of maybe 0.5 seconds. This is way too long a delay and just confuses the user. Is this just a downside of using PhoneGap? Or is there any configuration or additional framework that could be used to make the app respond and produce the vibration more quickly? I installed the vibration plugin like this:

      phonegap local plugin add https://git-wip-us.apache.org/repos/asf/cordova-plugin-vibration.git

    I use the code below to create the vibration effect:

      navigator.notification.vibrate(200);

    My PhoneGap version is 3.0.0-0.14.3.

  • Integrating virtual keyboard on a HP TouchSmart with an Adobe AIR app

    - by Alan
    Hi, Does anyone know if it's possible to integrate the TouchSmart's virtual keyboard with an Adobe AIR application? In most programs (Internet Explorer, Firefox, etc.), when a user touches a text field a little keyboard icon automatically pops up which, when pressed, will bring up the virtual keyboard. However, this doesn't happen when clicking on text input fields in Adobe AIR applications. Has anyone had any experience working with AIR/Flash and touchscreens? Is there any API that can tell Windows (or the HP virtual keyboard specifically) that the user has clicked in a text field and that the virtual keyboard should be shown? The text fields are the standard kind (fl.controls.TextInput). Any suggestions would be greatly appreciated. Thanks in advance!

  • How to re-focus to a text field when focus is lost on a HTML form?

    - by Horace Ho
    There is only one text field on an HTML form. Users input some text, press Enter, the form is submitted, and the page reloads. The main use is barcode reading. I use the following code to set the focus to the text field:

      <script language="javascript">
      <!--
        document.getElementById("#{id}").focus()
      //-->
      </script>

    It works most of the time (if nobody touches the screen/mouse/keyboard). However, when the user clicks somewhere outside the field within the browser window (the white empty space), the cursor is gone. On a single-field HTML form, how can I prevent the cursor from getting lost? Or, how can I re-focus the cursor inside the field after it is lost? Thanks!

  • VS2010 always relinks the project

    - by Rob Walker
    I am migrating a complex mixed C++/.NET solution from VS2008 to VS2010. The upgraded solution works in VS2010, but the build system is always refreshing one C++/CLI assembly. It doesn't recompile anything, but the linker touches the file. This causes a ripple effect downstream in the build, as a whole bunch of dependent projects then get rebuilt. Any ideas on how to find out why it thinks it needs to relink the file? I've turned on verbose build logging, but nothing stands out.

  • How to implement Android Pull-to-Refresh

    - by yuku
    In Android applications such as Twitter (official app), when you encounter a ListView, you can pull it down (and it will bounce back when released) to refresh the content. I wonder what is the best way, in your opinion, to implement that? Some possibilities I could think of: An item on top of the ListView - however I don't think scrolling back to item position 1 (0-based) with animation on the ListView is an easy task. Another view outside the ListView - but I need to take care of moving the ListView position down when it is pulled, and I'm not sure if we can detect if the drag-touches to the ListView still really scroll the items on the ListView. Any recommendations?

  • View Animation (Resizing a Ball)

    - by user270811
    Hi, I am trying to do this: 1) the user long-touches the screen, 2) a circle/ball pops up (centered around the user's finger) and grows in size as long as the user is touching the screen, 3) once the user lets go, the ball (now at its final size) will bounce around. I think I have the bouncing figured out from the DivideAndConquer example, but I am not sure how to animate the ball's growth. I looked at various ViewFlipper examples such as this: http://www.inter-fuser.com/2009/08/android-animations-3d-flip.html but it seems like ViewFlipper is best for swapping two static pictures. I wasn't able to find a good view animator example other than the flippers. Also, I would prefer to use images as opposed to just a circle. Can someone point me in the right direction? Thanks.

  • Prevent status bar from receiving touch events

    - by Typeoneerror
    Edit: After further testing, it appears that the parts of my buttons that are not clickable are where the status bar used to be. I'm hiding the status bar with:

      // -- Override point for customization after app launch
      [[UIApplication sharedApplication] setStatusBarHidden:YES];

    But that area is still receiving touches. Any idea how to disable this? Is there a bounding box on an application that receives touch events? I created a few sample round rect buttons and placed them in different places in my view. The ones in the center of the view receive touch events (and show the highlighted blue color), but if I place a button near the edges of the view, only parts of it are clickable in the simulator. Is this because of Apple's style guidelines? I placed a button exactly where a UITabNavigationItem would appear and only the bottom half of it is clickable.
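
    One thing worth checking (an assumption, not stated in the post): if the window or root view was sized to the application frame, that frame excludes the 20 pt status bar strip, so the strip never belongs to the view even after the bar is hidden. A rough sketch of sizing to the full screen instead, with window and viewController as the usual app delegate ivars:

      - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
          [[UIApplication sharedApplication] setStatusBarHidden:YES];
          // Use the full screen bounds, not the application frame, so the
          // former status-bar strip belongs to the view and gets touches.
          window.frame = [[UIScreen mainScreen] bounds];
          viewController.view.frame = [[UIScreen mainScreen] bounds];
          [window addSubview:viewController.view];
          [window makeKeyAndVisible];
          return YES;
      }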

  • Choosing http status code for unknown command reply

    - by w0rldart
    So, I'm writing a small test that I have been required to complete and I just want to give it some final touches by adding some header status code responses and some other stuff. Right now, my dilemma is what HTTP status code to choose for my "Unknown command" response after the $_GET['cmd'] has been compared to the existing commands list.

      case 404: $text = 'Not Found'; break;
      case 405: $text = 'Method Not Allowed'; break;
      case 406: $text = 'Not Acceptable'; break;

    For which one of the above should I go? And if none, which other?

  • NHibernate Lazy="Extra"

    - by Adam Rackis
    Is there a good explanation out there on what exactly lazy="extra" is capable of? All the posts I've seen all just repeat the fact that it turns references to MyObject.ItsCollection.Count into select count(*) queries (assuming they're not loaded already). I'd like to know if it's capable of more robust things, like turning MyObject.ItsCollection.Any(o => o.Whatever == 5) into a SELECT ...EXISTS query. Section 18.1 of the docs only touches on it. I'm not an NH developer, so I can't really experiment with it and watch SQL Profiler without doing a bit of work getting everything set up; I'm just looking for some sort of reference describing what this feature is capable of. Thank you!

  • Swipe the more than 2 Views?

    - by Silent
      - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
          if (dirString) {
              CATransition *animation = [self getAnimation:dirString];
              [[self superview] exchangeSubviewAtIndex:0 withSubviewAtIndex:1];
              [[[self superview] layer] addAnimation:animation forKey:kAnimationKey];
          }
      }

    Hello all, I'm trying to work with the code above; it may look familiar, it's from the iPhone Developer's Cookbook by Erica Sadun. What I'm trying to implement are 5 different views using her swipe method. The code above transitions between 2 views only; how would I change the code so I can swipe through all five views? Example: the view starts on view 1, the user swipes right, it changes to view 2, and so forth, and backward. Your help is much appreciated.
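
    One way to extend this to five views (a sketch, not from the book: it assumes all five views are subviews of the same superview and that dirString distinguishes a right swipe from a left one) is to rotate the subview order instead of exchanging a fixed pair of indices:

      - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
          if (!dirString) return;
          CATransition *animation = [self getAnimation:dirString];
          UIView *container = [self superview];
          if ([dirString isEqualToString:@"right"]) {
              // Send the visible (topmost) view to the back; the next one appears.
              [container sendSubviewToBack:[[container subviews] lastObject]];
          } else {
              // Bring the rearmost view forward to step back through the stack.
              [container bringSubviewToFront:[[container subviews] objectAtIndex:0]];
          }
          [[container layer] addAnimation:animation forKey:kAnimationKey];
      }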

  • set different volumes inside app

    - by blacksheep
    I'd like to set the volume inside the touchesBegan action to half the volume of the IBAction.

      - (void)awakeFromNib {
          [super awakeFromNib];
          engine = [[Finch alloc] init];
          E = [[RevolverSound alloc] initWithFile:PATH(@"E.wav") rounds:9];
      }

      - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
          UITouch *touch = [[event allTouches] anyObject];
          CGPoint location = [touch locationInView:self.view];
          if (CGRectContainsPoint(Edrop.frame, location)) {
              [E play];
          }
      }

      - (IBAction)bass:(id)sender {
          if (CGRectIntersectsRect(finga.frame, e.frame)) {
              if (finga.center.y <= e.center.y)
                  [E play];
          }
      }

    Thanks, blacksheep

  • Surface Detection in 2d Game?

    - by GamiShini
    I'm working on a 2D platform game, and I was wondering what's the best (performance-wise) way to implement surface (collision) detection. So far I'm thinking of constructing a list of level objects, each built from a list of lines, and drawing tiles along the lines (http://img375.imageshack.us/img375/1704/lines.png). I'm thinking every object holds the ID of the surface that it walks on, in order to easily manipulate its y position while walking up/downhill. Something like this:

      // Player/MovableObject class
      MoveLeft()
      {
          this.Position.Y = Helper.GetSurfaceById(this.SurfaceId).GetYWhenXIs(this.Position.X);
      }

    So the logic I use to detect "dropping on / walking on a surface" is a simple point (player's lower legs)-touches-line (surface) check (with some safety approximation - let's say 1-2 pixels over the line). Is this approach OK? I've been having difficulty trying to find reading material for this problem, so feel free to drop links/advice.

  • Is there a high-level gestures library for iPhone development?

    - by n8gray
    The iPhone platform has a number of common gesture idioms. For example, there are taps, pinches, and swipes, each with varying number of fingers. But when you're developing an app, it's up to you to implement these things based on low-level information about the number and locations of touches. It seems like this is a prime candidate for a library. You would register a delegate, set some parameters like multi-tap interval and swipe threshold, and get calls like swipeStarted/Ended, pinchStarted/Ended, multiTap, etc. Does such a library exist?
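
    UIKit's gesture recognizers (UITapGestureRecognizer, UIPinchGestureRecognizer, UISwipeGestureRecognizer and friends, available from iPhone OS 3.2 onward) provide exactly this kind of high-level API. A minimal sketch, assuming a view controller with hypothetical handleSwipe: and handlePinch: actions:

      UISwipeGestureRecognizer *swipe =
          [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
      swipe.direction = UISwipeGestureRecognizerDirectionLeft;
      swipe.numberOfTouchesRequired = 2;   // two-finger swipe
      [self.view addGestureRecognizer:swipe];
      [swipe release];

      UIPinchGestureRecognizer *pinch =
          [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinch:)];
      [self.view addGestureRecognizer:pinch];
      [pinch release];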

  • iPhone:How to find the control interaction under Touch?

    - by user187532
    Hello friends, I have a UILabel control in a view. I want to detect the touch event that occurs when this label is touched. I added the following code, which runs whenever a touch happens anywhere on the view:

      - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
          // This should be called only when the label is touched, not all the time.
          [[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"http://www.mywebsite.com"]];
      }

    My code should only proceed when that particular label is touched, not when the touch happens anywhere else in the view. How do I find out whether the particular label (or any other control) was touched inside touchesEnded? Could someone guide me on this? Thank you.
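
    A minimal sketch of one common check (assuming a hypothetical myLabel outlet whose frame is expressed in self.view's coordinate system): compare the touch location against the label's frame.

      - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
          UITouch *touch = [touches anyObject];
          CGPoint location = [touch locationInView:self.view];
          // React only if the touch ended inside the label's frame.
          if (CGRectContainsPoint(myLabel.frame, location)) {
              [[UIApplication sharedApplication]
                  openURL:[NSURL URLWithString:@"http://www.mywebsite.com"]];
          }
      }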

  • Moving a Ball on iPhone

    - by Chandan Shetty SP
    I am using the formula below to move the ball in a circle, where accelX and accelY are the values from the accelerometer; it is working fine. But the problem in this code is mRadius (I fixed its value to 50): I need to change mRadius according to the accelerometer values, and I also need a bouncing effect when the ball touches other circles. Please send your answers ASAP... I am waiting.

      float degrees = -atan2(accelX, accelY) * 180 / 3.14159;
      int x = cCentrePoint.x + mRadius * cos(degreesToRadians(degrees));
      int y = cCentrePoint.y + mRadius * sin(degreesToRadians(degrees));

    Here is a snap of the game I want to develop: http://iphront.com/wp-content/uploads/2009/12/bdece528ea334033.jpg.jpg
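
    For the radius, one sketch (an assumption about the intended behaviour: stronger tilt pushes the ball further out, clamped between a minimum and maximum radius) is to scale mRadius by the tilt magnitude before computing the position:

      #define kMinRadius  10.0f   // hypothetical limits - tune to taste
      #define kMaxRadius 150.0f

      float tilt = sqrtf(accelX * accelX + accelY * accelY);  // roughly 0..1 g when tilting
      if (tilt > 1.0f) tilt = 1.0f;                           // ignore hard shakes
      float mRadius = kMinRadius + tilt * (kMaxRadius - kMinRadius);

      float degrees = -atan2(accelX, accelY) * 180.0f / M_PI;
      int x = cCentrePoint.x + mRadius * cos(degreesToRadians(degrees));
      int y = cCentrePoint.y + mRadius * sin(degreesToRadians(degrees));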

  • Iphone resize doubt

    - by dragon
    Hi, I have created a view-based application and it loads correctly what I designed in the xib files. My doubt is: when I designed the UIView it had the property of resizing its frame (autoresize), but when I load the application onto the iPhone the UIView does not resize its frame automatically. Is it possible to have a UIView resize automatically on the iPhone (when the application loads)? Or should I change the frame size of the UIView on every touchesMoved event? Can anyone help me? Thanks in advance.
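
    For the automatic case, a minimal sketch (subview and superview are hypothetical names; the flags correspond to the autosizing springs set in Interface Builder):

      subview.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                                 UIViewAutoresizingFlexibleHeight;
      superview.autoresizesSubviews = YES;  // resize children when the superview's bounds change
      [superview addSubview:subview];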

  • setNeedsDisplayInRect: paints a white rectangle only

    - by Ion Tichy
    Hi, I'm still a little fresh to Core Graphics programming, so please bear with me. I'm trying to write an application which allows the user to rub stuff off an image with a finger. I have the basic functionality nailed down, but the result is sluggish since the screen is redrawn completely every time a touch is rendered. I did some research and found out that I can refresh only a portion of the screen using UIView's setNeedsDisplayInRect: method. This does call drawRect: as expected; however, everything I draw in drawRect: following the setNeedsDisplayInRect: is ignored. Instead, the area in the rect parameter is simply filled with white. No matter what I draw inside, all I end up with is a white rectangle. In essence, this is what I do: 1) when the user touches the screen, the touch is rendered into a mask; 2) when drawRect: is called, the image is masked with that mask. There must be something simple I'm overlooking, surely?
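
    One thing worth checking (an assumption, not a diagnosis from the post): drawRect: receives a context that is already clipped to the dirty rect, so the drawing code has to actually cover that rect on every call; drawing in view coordinates and letting UIKit discard everything outside the clip is usually enough. A rough sketch, with maskedImage, location and brushRadius as hypothetical names:

      - (void)drawRect:(CGRect)rect {
          // The context is clipped to 'rect'; anything outside the dirty area
          // is thrown away, so redrawing the whole masked image stays cheap.
          [maskedImage drawInRect:self.bounds];
      }

      // In the touch handler, invalidate only the area around the finger:
      CGRect dirty = CGRectMake(location.x - brushRadius,
                                location.y - brushRadius,
                                brushRadius * 2.0f,
                                brushRadius * 2.0f);
      [self setNeedsDisplayInRect:dirty];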

  • Drag and drop + custom drawing in Android

    - by Rich
    I am working on something that needed custom drag-and-drop functionality, so I have been subclassing View, doing a bunch of math in response to touch events, and then rendering everything manually through code on the canvas in onDraw. Now, the more functionality I add, the more the code is growing out of control and I find myself writing a ton more code than I would expect to write in a high level environment like Android. Is this how it's done, or am I missing something? If I'm not doing anything fancy in the UI, the framework handles the majority of my interactions. Built-in controls handle the touches and drags, and my code is pretty much limited to business logic and data. Is there a way to leverage the power of some of the UI controls and things like animations while also doing some of it manually in the onDraw canvas? Is there an accepted standard of when to use one or the other (if indeed the two approaches can be mixed)?

  • How to make an scrollable UITextField Inside UItableViewCell?

    - by user333624
    Hello everyone. I created a bunch of editable UITableViewCells by embedding a UITextField inside, but I have seen some apps that allow you to scroll the UITableView by scrolling inside an inactive editable cell. How can I do that? And how can I also dismiss the keyboard when tapping somewhere else? I know about this method:

      - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
          UITouch *touch = [[event allTouches] anyObject];
          if (touch.tapCount == 1) {
              [self resignFirstResponder];
          }
      }

    I put it inside my custom table view controller, but the method doesn't seem to be called on a tap, and I don't know whether, even if it gets called, it will dismiss the keyboard. Any help will be appreciated.
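
    Two assumptions that may explain the behaviour (not stated in the post): the table view handles the touch itself, so the controller's touchesEnded: never fires, and resigning the controller's own first-responder status would not dismiss the keyboard anyway - the active text field has to resign. One sketch, with activeTextField as a hypothetical ivar tracked through the text field delegate:

      // UITextFieldDelegate
      - (void)textFieldDidBeginEditing:(UITextField *)textField {
          activeTextField = textField;
      }

      // Dismiss the keyboard when the user taps another row.
      - (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath {
          [activeTextField resignFirstResponder];
          activeTextField = nil;
      }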

  • linq to sql report tables in query

    - by luke
    Here's the method I want to write:

      public static IEnumerable<string> GetTableNames<T>(this IQueryable<T> query)
      {
          // ...
      }

    where the IQueryable is a LINQ to SQL query (is there a more specific interface I should use?). Then if I had a query like this:

      var q = from c in db.Customers
              from p in db.Products
              where c.ID == 3
              select new { p.Name, p.Version };

      q.GetTableNames(); // returns ["Customers", "Products"]

    Basically it would show all the tables that this query touches in the db. Is it OK to execute the query to figure this out too (since that is going to happen anyway)? Any ideas?

  • Uniquely identify two instances of one browser that share Session state?

    - by jdk
    I want to make sure that a user isn't editing the same form data in two different browser tabs or windows (of the same web browser instance). It's to stop the user from stupidly overwriting their own data as they continue through a very long form process. On the server, ongoing data input through the screens is collected into the Session. Assume that for any browser, all tabs and windows run in the same instance of it (i.e. not each in a separate process). Obviously the browser tabs and windows share the same cookies in this scenario, so cookie modification seems out of the question as a viable solution. This is also the reason they share the same session. Considering that the form is already created and this is one of the final touches, how can I use ASP.NET, hopefully easily, to oversee this "feature"?

  • Android RelativeLayout spacing

    - by fordays
    Hi, I'm just putting the finishing touches to my Android app. Unfortunately, I dug straight into development without reading the documentation and built my layout with AbsoluteLayout and it turned out to look terrible when I loaded the app on my phone. Now I'm redoing the UI in a RelativeLayout and I want to put empty canvas space in between my ViewGroups in the y-direction. I am currently achieving this by putting random TextView sentences that are of the same color as my View's background in order to make psuedo-empty space. Is there a better way to do this, because right now when I define a specific ViewGroup to be placed below another View, it gets stuck right below the top View. As I was writing this, it dawned upon me that using padding might be the answer.... Any other suggestions?

  • Connecting ipad to external monitor

    - by Josh P.
    I am trying to connect my iPad app to an external screen using the following (not checking for the correct resolution for now - just want it up and running):

      - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
          [window addSubview:[navigationController view]];
          if ([[UIScreen screens] count] > 1) {
              window.screen = [[UIScreen screens] objectAtIndex:1];
          }
          [window makeKeyAndVisible];
          return YES;
      }

    Is this supposed to redirect everything to the external monitor? How does it work with touches/gestures? I see that in the Apple apps, the controls etc. are left on the iPad screen...
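
    Assigning the external screen to the main window moves the whole UI (and the expectation of touches) off the device, which is usually not the goal; the pattern seen in Apple's apps is to keep the interactive UI on the iPad's own window and drive a second, output-only UIWindow on the external display. A rough sketch, with externalWindow and externalViewController as hypothetical names:

      if ([[UIScreen screens] count] > 1) {
          UIScreen *external = [[UIScreen screens] objectAtIndex:1];
          externalWindow = [[UIWindow alloc] initWithFrame:[external bounds]];
          externalWindow.screen = external;
          [externalWindow addSubview:[externalViewController view]];  // output-only content
          externalWindow.hidden = NO;
      }
      // The main window stays on the device screen and keeps receiving touches.
      [window addSubview:[navigationController view]];
      [window makeKeyAndVisible];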
