Search Results

Search found 9296 results on 372 pages for 'special touch'.


  • iPhone hitTest broken after rotation

    - by Adam
    Hi all, I have a UIView that contains a number of CALayer subclasses. I am using the following code to detect which layer a touch event corresponds to:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint point = [touch locationInView:self];
            NSLog(@"%@,%@", NSStringFromCGPoint(point), [self.layer hitTest:point].name);
        }

    This works fine until the device is rotated. On rotation, all current layers are removed from the superlayer and new CALayers are created to fit the new orientation. The new layers are correctly inserted and are viewable in the correct orientation. After the rotation, however, the hitTest method consistently returns nil when I am clearly tapping the newly created layers, and it registers hits at layer locations that are incorrect. The coordinates of the hit test are correct, but no layers are found. Am I missing a function call or something after handling the rotation? Cheers, Adam
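    One thing worth checking (a minimal sketch, not a confirmed fix for this case): CALayer's -hitTest: expects the point in the coordinate space of the receiver's superlayer, so converting the touch point explicitly removes any dependence on the view and its superview sharing a coordinate space after rotation.

        // Sketch: hit-test sublayers with an explicitly converted point.
        // Assumes the tappable layers are sublayers of self.layer.
        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint point = [touch locationInView:self];
            // -hitTest: wants the point in the superlayer's coordinate space.
            CGPoint pointInSuperlayer = [self.layer convertPoint:point
                                                         toLayer:self.layer.superlayer];
            CALayer *hit = [self.layer hitTest:pointInSuperlayer];
            NSLog(@"%@ -> %@", NSStringFromCGPoint(point), hit.name);
        }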

    Read the article

  • Why doesn't Apache2::SubProcess spawn my subprocess?

    - by codeholic
    The following script runs without errors, but /tmp/test.touch is not being created (even when checked later from the command line). It seems to me as if $r->spawn_proc_prog doesn't spawn a process. What could be causing the problem?

        #!/usr/bin/perl
        use strict;
        use warnings;

        use Apache2::RequestUtil;
        use Apache2::SubProcess ();

        my $r = Apache2::RequestUtil->request;

        print "Content-Type: text/plain\n\n";

        print eval { $r->spawn_proc_prog('/usr/bin/touch', ['/tmp/test.touch']) }
            ? `ls -l /tmp/test.touch`
            : $@;

    Read the article

  • iPhone SDK - Implement Tap to Scroll feature on ImageScrollView

    - by NobodyNobody
    SDK: Xcode 3.2. Device: iPhone 3GS (OS 3.1.3) / iPad. Case: in the Apple sample library [ScrollViewSuite], the [2_Autoscroll] project. In [ThumbImageView] we can drag to scroll the view, so we can reach any menu item by touching the screen and moving. How do I implement this feature in [TagDetectingImageView]? I have tried copying the [touch event] functions from [ThumbImageView], modifying them, and adding them to [TagDetectingImageView] (just renaming [ThumbImageView] to [TagDetectingImageView] inside those functions). But when I touch and move on the [TagDetectingImageView], the view moves right off the screen (I don't know where it ends up). Source - please find the 2_Autoscroll source code at: http://developer.apple.com/iPhone/library/samplecode/ScrollViewSuite/Introduction/Intro.html Thanks
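    A rough sketch of drag-to-scroll in general terms (assumed setup: the image view sits inside a UIScrollView reachable through its superview; this is not the sample's actual code): pan by applying the finger's movement, measured in a coordinate space that does not move with the content, to the scroll view's contentOffset.

        // Sketch: move the scroll view's content by the finger's delta.
        // Measuring in scrollView.superview avoids a feedback loop, since that
        // coordinate space does not shift as contentOffset changes.
        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            UIScrollView *scrollView = (UIScrollView *)self.superview; // assumption
            CGPoint current  = [touch locationInView:scrollView.superview];
            CGPoint previous = [touch previousLocationInView:scrollView.superview];

            CGPoint offset = scrollView.contentOffset;
            offset.x += previous.x - current.x;
            offset.y += previous.y - current.y;
            scrollView.contentOffset = offset;
        }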

    Read the article

  • iPhone drag/drop

    - by Farid
    Trying to get some basic drag/drop functionality happening for an iPhone application. My current code for this is as follows:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint location = [touch locationInView:self];
            self.center = location;
        }

    This code results in the touched UIView flickering while it follows the touch around. After playing around with it a bit, I also noticed that the UIView seems to flicker between the 0,0 position on the screen and the currently touched location. Any ideas what I'm doing wrong?
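    A minimal sketch of the usual fix (assuming the view is being dragged around inside its superview): self.center is expressed in the superview's coordinate system, while locationInView:self is measured in a coordinate space that moves with the view itself, so each update is relative to a frame that has just moved.

        // Sketch: read the touch in the superview's coordinate space, the same
        // space self.center is defined in.
        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            self.center = [touch locationInView:self.superview];
        }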

    Read the article

  • Virtual/soft buttons for (Home, menu,Back, Search) always on top?

    - by Ken
    How can I make an app, or maybe a service, that looks like the Nexus One touch buttons for the navigation keys (Home, Menu, Back, Search)? The buttons should always be visible, always stay on top, and send the command to the app that's running. Does anyone have ideas or sample code for how to do that? * I have seen an app called Smart Taskmanager which detects when you touch the right side of the screen and then slide your finger to the left. So I think it's possible, and with that technique I think it should be possible to simulate the (Home, Menu, Back, Search) buttons. * I have also seen and tested an app which shows a "cracked display" image always on top, so that technique may be useful for always showing the buttons/bitmap on top. Those two parts - showing the buttons on top, catching the touch event, and sending the event to the active program - are what I can't figure out how to do. That's my thought! I hope there is some experienced developer who knows the solution! Regards, Ken

    Read the article

  • Calling a method in a view controller from a view

    - by Lakshmie
    I have to invoke a method in a view controller whose reference is available in the view. When I try to call the method like any other method, for some reason the iPhone just ignores the call. Can somebody explain why this happens and how I can go about invoking this method? In the view I have this method:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSArray *mySubViews = [self subviews];
            for (UITouch *touch in touches) {
                int i = 0;
                for (; i < [mySubViews count]; i++) {
                    if (CGRectContainsPoint([[mySubViews objectAtIndex:i] frame],
                                            [touch locationInView:self])) {
                        break;
                    }
                }
                if (i < [mySubViews count]) {
                    // viewController is the reference to the view controller.
                    [viewController pointToSummary:[touch locationInView:self].y];
                    NSLog(@"Helloooooo");
                    break;
                }
            }
        }

    Whenever the touches event is triggered, Hellooooo gets printed in the console, but the method call before it is simply ignored.
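    One common explanation worth ruling out (a sketch, not a confirmed diagnosis for this code): in Objective-C a message sent to nil is silently ignored, which matches the symptom of the call doing nothing while the next line still runs.

        // Sketch: confirm the reference is actually set before the call.
        if (viewController == nil) {
            NSLog(@"viewController was never assigned - the message to it is a no-op");
        }
        [viewController pointToSummary:[touch locationInView:self].y];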

    Read the article

  • Malloc to a CGPoint Pointer throwing EXC_BAD_ACCESS when accessing

    - by kdbdallas
    I am trying to use a snippet of code from an Apple programming guide, and I am getting an EXC_BAD_ACCESS when trying to pass a pointer to a function, right after doing a malloc. (For reference: iPhone Application Programming Guide: Event Handling - Listing 3-6.) The code in question is really simple:

        CFMutableDictionaryRef touchBeginPoints;
        UITouch *touch;
        ....
        CGPoint *point = (CGPoint *)CFDictionaryGetValue(touchBeginPoints, touch);
        if (point == NULL) {
            point = (CGPoint *)malloc(sizeof(CGPoint));
            CFDictionarySetValue(touchBeginPoints, touch, point);
        }

    When the program goes into the if statement it assigns the result of malloc to the point pointer. Then, when it tries to pass point into the CFDictionarySetValue function, the application crashes with: Program received signal: "EXC_BAD_ACCESS". Someone suggested skipping the malloc and passing the pointer as &point, but that still gave me an EXC_BAD_ACCESS. What am I (and, it looks like, Apple) doing wrong??? Thanks in advance.
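    A sketch of one thing to verify (an assumption about this crash, not a confirmed cause): these calls only work if touchBeginPoints has actually been created with CFDictionaryCreateMutable before this code runs; calling CFDictionaryGetValue/CFDictionarySetValue on an uninitialized CFMutableDictionaryRef crashes in exactly this way.

        // Sketch: create the dictionary once (e.g. in an init method) before use.
        // NULL callbacks are one option for storing UITouch keys and malloc'd
        // values as raw pointers.
        touchBeginPoints = CFDictionaryCreateMutable(kCFAllocatorDefault, 0, NULL, NULL);

        // ...later, in the touch handler (view parameter assumed):
        CGPoint *point = (CGPoint *)CFDictionaryGetValue(touchBeginPoints, touch);
        if (point == NULL) {
            point = (CGPoint *)malloc(sizeof(CGPoint));
            CFDictionarySetValue(touchBeginPoints, touch, point);
        }
        *point = [touch locationInView:self];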

    Read the article

  • Cocos2D TouchesEnded not allowing me to access sprites?

    - by maiko
    Hey guys! Thanks so much for reading!

        - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint location = [[CCDirector sharedDirector]
                convertToGL:[touch locationInView:touch.view]];
            CGRect myRect = CGRectMake(100, 120, 75, 113);
            int tjx = sprite.position.x;
            if (CGRectContainsPoint(myRect, location)) {
                tjx++;
            }
        }

    For some reason, ccTouchesEnded isn't allowing me to access my "sprite". I also tried to use CGRectMake like so:

        CGRectMake(sprite.position.x, sprite.position.y,
                   sprite.contentSize.width, sprite.contentSize.height)

    But I couldn't access my sprite's position or height. I keep getting "sprite" undeclared, even though it is declared in the init method and added as a child. Please help!! I'm sure I'm missing something really simple here.
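    A minimal sketch of the usual cause (class and file names here are assumptions): a variable declared inside init is local to init, so other methods such as ccTouchesEnded: cannot see it. Keeping the sprite as an instance variable makes it visible to every method of the layer.

        // Sketch: declare the sprite as an instance variable of the layer.
        @interface GameLayer : CCLayer {
            CCSprite *sprite;
        }
        @end

        @implementation GameLayer
        - (id)init {
            if ((self = [super init])) {
                sprite = [CCSprite spriteWithFile:@"sprite.png"];  // assumed asset name
                [self addChild:sprite];
            }
            return self;
        }
        @end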

    Read the article

  • Getting x/y coordinate of a UITouch...

    - by Tarek
    Hi, I have been trying to get the x/y coordinates of a touch on any iDevice. When reading the touch locations, everything looks OK if the touch is in the middle of the screen. But if I drag my finger to the bottom of the screen, the largest y coordinate I can get is 1015, when it should reach 1023. Same thing when dragging my finger to the top of the screen: I get -6 when it should be 0. I have explicitly set the window and views to an origin of 0,0 and to the width and height of the device's screen. Still nothing. I am really lost as to what might be going on. Is something shifted? Am I not reading the x/y coordinates properly? Does something need to be transformed or converted? Any help would be much appreciated. T
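    A small diagnostic sketch (view and handler names assumed, since no code was shown): log the same touch in the view's coordinate space and in the window's, alongside the view's frame, to see which conversion introduces the offset.

        // Sketch: compare the touch in different coordinate spaces.
        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            NSLog(@"in view:    %@", NSStringFromCGPoint([touch locationInView:self.view]));
            NSLog(@"in window:  %@", NSStringFromCGPoint([touch locationInView:nil]));
            NSLog(@"view frame: %@", NSStringFromCGRect(self.view.frame));
        }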

    Read the article

  • touchesBegan doesnt get detected

    - by Muniraj
    I have a view controller like the following, but touchesBegan doesn't get detected. Can anyone please tell me what is wrong?

        - (id)init {
            if (self = [super init])
                self.view = [[[UIView alloc] initWithFrame:
                    [[UIScreen mainScreen] applicationFrame]] autorelease];
            return self;
        }

        - (void)viewWillAppear:(BOOL)animated {
            overlay = [[[UIImageView alloc]
                initWithImage:[UIImage imageNamed:@"overlay.png"]] autorelease];
            [self.view addSubview:overlay];
        }

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            // Detect a touch anywhere
            UITouch *touch = [touches anyObject];

            // Where is the point touched
            CGPoint point = [touch locationInView:self.view];
            NSLog(@"pointx: %f pointy: %f", point.x, point.y);

            // Was a tab touched, and if so, which one...
            if (CGRectContainsPoint(CGRectMake(1, 440, 106, 40), point))
                NSLog(@"tab 1 touched");
            else if (CGRectContainsPoint(CGRectMake(107, 440, 106, 40), point))
                NSLog(@"tab 2 touched");
            else if (CGRectContainsPoint(CGRectMake(214, 440, 106, 40), point))
                NSLog(@"tab 3 touched");
        }
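    A refactor sketch worth trying (an assumption, not a guaranteed fix): UIViewController expects its view to be created in -loadView rather than in -init, so moving the view creation there keeps the view lifecycle and responder chain in the usual order.

        // Sketch: create the view in -loadView, the documented hook for
        // programmatic view creation.
        - (void)loadView {
            self.view = [[[UIView alloc] initWithFrame:
                [[UIScreen mainScreen] applicationFrame]] autorelease];
        }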

    Read the article

  • Two VPN (internet) connections rounting (win2003)

    - by tmp3128
    Here is my setup:
    - win2003 server (ISA installed) with 3 NICs:
      1) internal network
      2) ISP 1 (default) network (DHCP enabled)
      3) ISP 2 (backup) network (DHCP enabled)
    - several "normal" PCs within the internal net
    - one "special" PC within the internal net

    Both ISP 1 and ISP 2 provide access to the internet and to their own resources through their VPN connections. The goal is for all "normal" PCs to use the internet through ISP_1's VPN connection, while the "special" PC should use only ISP_2's VPN connection. Furthermore, all "normal" and "special" PCs should have access to several servers accessible only through ISP_2's VPN connection. I have some thoughts on how to achieve this, but I want to be certain, because everything should be configured as quickly as possible while avoiding significant downtime.

    Read the article

  • Hiding monitor from windows, working with it from my app only [closed]

    - by Mikhail
    I need to use a monitor as a "private" device for my special application. I want to use it as a flashlight of a sort and draw special patterns on it in full screen. I don't want this monitor to be recognized by the OS (Windows 7) as a monitor, i.e. the user should not be able to move the mouse to that monitor, change its resolution, run a screensaver on it, or whatever. But I want to be able to interact with it from my application. The monitor is plugged into a video card (most probably nVidia) using an HDMI cable. What is the simplest way to do this? All solutions are appreciated, including purchasing additional adapters, simple video cards, or any other special devices.

    Read the article

  • iPhone: how do I set up a clear window-size "blocker view"?

    - by Ben
    I feel like this should be obvious to me, but for some reason I can't figure it out. I have a navigation interface with a nav bar, toolbar, and primary view. Sometimes the user takes an action that causes a progress indicator to appear in the middle of the view. While the progress indicator (which is a custom UIView) is spinning in the middle, I want no touch input to reach any of the underlying interface (main view, nav bar, toolbar, etc). But this doesn't seem trivial. I've tried (and failed) to create a simple view whose only job is to swallow touch input and add it as a window subview -- no dice, it never gets the touch events (and yes, it does have userInteractionEnabled set). I've tried to bolt it on as a transparent modal view controller, but those never seem to be transparent. Thoughts? What am I missing? Thanks!
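    A minimal sketch of a touch-swallowing overlay, under the assumption that it is added as the topmost subview of the key window and sized to the window's bounds (a clear background color does not affect hit-testing; only the view's alpha, hidden flag, and userInteractionEnabled do):

        // Sketch: a full-window transparent view that blocks touches while shown.
        UIWindow *window = [UIApplication sharedApplication].keyWindow;
        UIView *blocker = [[UIView alloc] initWithFrame:window.bounds];
        blocker.backgroundColor = [UIColor clearColor];  // visually invisible
        blocker.userInteractionEnabled = YES;            // still swallows touches
        [window addSubview:blocker];

        // ...when the progress indicator finishes:
        [blocker removeFromSuperview];
        [blocker release];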

    Read the article

  • Determining Long Tap (Long Press, Tap Hold) on Android with jQuery

    - by Volomike
    I've been able to successfully play with the touchstart, touchmove, and touchend events on Android using jQuery and an HTML page. Now I'm trying to work out the trick to detecting a long-tap event, where one taps and holds for 3 seconds. I can't seem to figure this out yet. I want to do this purely in jQuery, without Sencha Touch, jQTouch, jQuery Mobile, etc. I like the concept of jQTouch, although it doesn't provide me a whole lot and some of my code breaks with it. With Sencha Touch, I'm not a fan of moving away from jQuery into Ext.js and some new way of doing JavaScript abstraction, especially when jQuery is so capable. So, I want to figure this out with jQuery alone. I've been able to do many jQTouch and Sencha Touch things on my own using jQuery. And jQuery Mobile is still too beta and not directed enough at Android yet.

    Read the article

  • Interface Builder only allows one button to be "touchable"

    - by STLMikey
    I'm making a clone of the classic game Simon, the memory matching game. My (iPad) app loads fine, I tap start, the game screen loads, and only one of my four buttons will respond to touch commands. To troubleshoot, I tried creating a second, unrelated nib and just populating it with four buttons not linked to anything. However, only one of those four buttons would respond to touch! There are no IBActions being called, nothing. Both the view itself and all four buttons are touch-enabled in the inspector... I'm stumped. I'm mostly asking whether anyone has encountered anything similar, as I'd rather not burden potential helpers with app-specific questions if it's avoidable. Thank you!

    Read the article

  • How to have a UISwipeGestureRecognizer AND UIPanGestureRecognizer work on the same view

    - by Shizam
    How would you set up the gesture recognizers so that a UISwipeGestureRecognizer and a UIPanGestureRecognizer can work at the same time? That is, if you touch and move quickly (a quick swipe) it is detected as a swipe, but if you touch and then move (a short delay between touch and move) it is detected as a pan? I've tried various permutations of requireGestureRecognizerToFail: and they didn't help exactly; they made it so that if the swipe gesture was set to left, my pan gesture would work up, down, and right, but any movement to the left was detected by the swipe gesture.
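    A sketch of one alternative to chaining requireGestureRecognizerToFail: (names and setup assumed): let both recognizers see the same touches via the UIGestureRecognizerDelegate callback, then decide in the handlers which one to act on.

        // Sketch: attach both recognizers and allow simultaneous recognition.
        // Assumes self adopts UIGestureRecognizerDelegate.
        - (void)setupGestureRecognizers {
            UISwipeGestureRecognizer *swipe = [[[UISwipeGestureRecognizer alloc]
                initWithTarget:self action:@selector(didSwipe:)] autorelease];
            swipe.direction = UISwipeGestureRecognizerDirectionLeft;

            UIPanGestureRecognizer *pan = [[[UIPanGestureRecognizer alloc]
                initWithTarget:self action:@selector(didPan:)] autorelease];

            swipe.delegate = self;
            pan.delegate = self;
            [self.view addGestureRecognizer:swipe];
            [self.view addGestureRecognizer:pan];
        }

        - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
            shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other {
            return YES;   // both the swipe and the pan receive the same touches
        }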

    Read the article

  • Excel transpose via paste

    - by David Oneill
    I want to transpose data in Excel. Normally, I cut the cells I need and use Paste Special - Transpose. However, sometimes when I do Paste Special, a box comes up asking whether I want to use Unicode text vs. normal text. How do I transpose this text? Is there a way to get past the Unicode dialog box and reach the normal Paste Special dialog box (the one with the 'Transpose' option)? Or is there another simple way to transpose cells? (Transpose = flip rows and columns, i.e. the row 1, 2, 3 becomes a column containing 1, 2, 3.)

    Read the article

  • Is there any Opensource Browser for touchscreen device ?

    - by Wallah
    I need an internet browser for my device, which has a 4.3-inch screen with 480x272 resolution. I am using embedded Qt 4.6.2 on embedded Linux; the microcontroller is an ARM9 at 450 MHz. Requirements for the browser are:
    - Touch screen support, panning (no scroll bars)
    - Single-touch zooming (no multi-touch available)
    - Fit-to-screen-width support (no horizontal scrolling)
    - Acid3 standard compliant
    - Page loading should display all visible text first, then load and show images gradually
    Is there any browser that comes close to these requirements?

    Read the article

  • Context Menu for Browser to download file to specific folder

    - by elcojon
    There is this website which has customized audio files for me. I would like to save them in a special folder, and I don't want to select that "special folder" each time in the file-chooser dialog of my browser. I would prefer to have a custom entry in the context menu when I right-click the download link. This context-menu entry should do the trick and download the file to the predefined "special folder". How would I go about that? I am using Safari and Chrome, so a solution for either browser is fine. To get into the context menu of the browser, what kind of programming do I need to do? Is it an extension, a plugin, etc.? Thanks

    Read the article

  • Learning Objective-C 2.0 and ASP.NET 4.0 simultaneously?

    - by Sahat
    (HOBBY) I own a Macbook Pro and iPod Touch so developing iPhone/iPod/iPad apps seems like a logical thing to do in order to get some experience in the programming field. Besides I want to write a new application similar to the Capsuleer (Character skills monitor app for EVE Online MMO) but with more features. It's something I'd love to have on my own iPod Touch and I am sure other people will welcome a new EVE Online app for their iPhone or iPod Touch. (CAREER) I want to learn ASP.NET (and possibly Silverlight later on) for my potential future job. I plan to work in the .NET field, so it's a good idea for me to start learning C# and ASP.NET ASAP. Is it a good idea to learn completely unrelated technologies at the same time? Or would it be better to learn one thing at a time? Objective-C first, and ASP.NET second. Or vice versa. Thanks, Sahat

    Read the article

  • Lync client configured as room

    - by captainmish
    We have a few USB cameras in meeting rooms that people can plug their laptops into, which works OK, but we're looking for better... Probably a long shot, but does anyone know of a way to have something like a "common area" client, where a PC connected to a webcam and speakers/mic can become a bookable resource, dragged into conversations, that automatically shows video? A workflow I imagine:
    - User books a room with the "special" Lync client as an attendee
    - Meeting time comes; they go to the room and fire up Lync on their PC
    - The "special" Lync client joins automatically (or is dragged in) and starts video; local attendees use audio and video from the special client
    Any tips welcome!

    Read the article

  • Delete object[i] from table or group in corona sdk

    - by Rober Dote
    I have a problem (obviously :P). I'm creating a mini game, and when I touch Object-A it creates an Object-B. If I touch N times, it creates N Object-Bs. (The Object-Bs are bubbles in my game.) What I want is that when I touch a bubble (Object-B), it disappears or performs some action. I tried adding each Object-B to an array:

        local t = {}
        . . .
        bur = display.newImage("burbuja.png")
        table.insert(t, bur)

    and where I have my event listeners I wrote:

        for i = 1, #t do
            bur[i]:addEventListener("tap", reventar(i))
        end

    and my function 'reventar':

        local function reventar(event, id)
            table.remove(t, id)
        end

    I'm lost; I just want the bubbles to disappear.

    Read the article

  • Using Rails and Rspec, how do you test that the database is not touched by a method

    - by Will Tomlins
    So I'm writing a test for a method which, for performance reasons, should achieve what it needs to achieve without using SQL queries. I'm thinking all I need to know is what to stub:

        describe SomeModel do
          describe 'a_getter_method' do
            it 'should not touch the database' do
              thing = SomeModel.create
              something_inside_rails.should_not_receive(:a_method_querying_the_database)
              thing.a_getter_method
            end
          end
        end

    EDIT: to provide a more specific example:

        class Publication < ActiveRecord::Base
        end

        class Book < Publication
        end

        class Magazine < Publication
        end

        class Student < ActiveRecord::Base
          has_many :publications

          def publications_of_type(type)
            # this is the method I am trying to test.
            # The test should show that when I do the following, the database is queried.
            self.publications.find_all_by_type(type)
          end
        end

        describe Student do
          describe "publications_of_type" do
            it 'should not touch the database' do
              Student.create()
              student = Student.first(:include => :publications)
              # the publications relationship is already loaded, so no need to touch the DB
              lambda {
                student.publications_of_type(:magazine)
              }.should_not touch_the_database
            end
          end
        end

    So the test should fail in this example, because the Rails 'find_all_by' method relies on SQL.

    Read the article

  • IPhone: different system timers??

    - by matt
    I have been using mach_absolute_time() for all my timing functions so far - calculating the time between frames, etc. I now want to get the exact time touch input events happen, using event.timestamp in the touch callbacks. The problem is that these two seem to use completely different timers. Sure, you can get them both in seconds, but their origins are different and seemingly random... Is there any way to sync the two different timers? Or is there any way to get access to the same timer that the touch input uses to generate that timestamp property? Otherwise it's next to useless.
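    A sketch of one way to put them on the same scale (relying on the documented behaviour that UIEvent's timestamp is in seconds since system startup, the same boot-relative clock mach_absolute_time() ticks against): convert the mach ticks to seconds with mach_timebase_info and compare directly.

        // Sketch: express mach_absolute_time() in seconds since boot, the same
        // time base as event.timestamp.
        #include <mach/mach_time.h>

        static double MachTimeToSeconds(uint64_t machTime) {
            static mach_timebase_info_data_t timebase;
            if (timebase.denom == 0) {
                mach_timebase_info(&timebase);   // fetch numer/denom once
            }
            return (double)machTime * timebase.numer / timebase.denom / 1e9;
        }

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            double now = MachTimeToSeconds(mach_absolute_time());
            NSLog(@"event.timestamp: %f, mach now: %f, delta: %f",
                  event.timestamp, now, now - event.timestamp);
        }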

    Read the article

  • Using DNS entries to determine location

    - by Raphink
    I'm trying to think of a clean way to determine the location of machines (mainly, which datacenter they belong to) based on their network settings. I would like it to be dynamic, and I'm thinking of using special DNS records that would be specific to the DNS server in each datacenter. For example, you could have:

        root@machine1# dig TXT mysite
        ...
        mysite 3600 IN TXT "DC1"
        ...

        root@machine2# dig TXT mysite
        ...
        mysite 3600 IN TXT "DC2"
        ...

    etc. I know that DNS has a special LOC record for location, but it takes coordinates, so it doesn't help in my case. Is there a standard way of addressing this issue, another special type of record for it, or some standard entries in TXT records?

    Read the article
