Search Results

Search found 8929 results on 358 pages for 'multi touch'.

Page 34/358

  • Manually force touch points to reset in Windows 8?

    - by loyalpenguin
    Hi, I developed an HTML5/JavaScript app for the Windows 8 Store that is supported by advertising. I happened to notice that if you touch the screen, drag your finger over the advertisement, and release your finger on top of the advertisement, that specific touch is never released; instead, when you touch again it registers as a separate touch. This has caused my app to behave unexpectedly when the user interacts with it using touch. I wanted to know if it is possible to force the touches to reset so that when the user touches the screen again it is always using "Touch(0)".

    Read the article

  • Touch and Drag from one view to another

    - by jollyCocoa
    Hi all! I've searched for some clues on this problem without much success. Hope someone can kick me in the right direction. I am prototyping a couple of apps where I need to design my own GUI. The GUI is made up of two separate UIViews, one of which contains a small thumb of an image. I want to be able to drag this thumb from the first view to the other. Simple as that! But I haven't figured out how this is done. Here is the exact flow I am looking for: touch the thumb; animate a small enlargement of the thumb; drag the thumb to the other UIView; drop the thumb; animate a shrink of the thumb. Not particularly strange, but the thumb remains related to the first view all the time. I've tried to move the thumb via the first view's superview and then back to the second view, but with no luck.

    Read the article

  • How to highlight ListView item on touch?

    - by AndroidNoob
    I have a simple ListView and I want each of its items to be highlighted on the user's touch. I thought this would happen by default, but it doesn't. Can you advise? ListView XML: <ListView android:id="@+id/list_view" android:layout_width="fill_parent" android:layout_height="fill_parent" android:padding="10dp" android:divider="#206600" android:dividerHeight="2dp" android:smoothScrollbar="true" android:background="#ffffff" > </ListView> And the code of my adapter: private class MyAdapter extends ArrayAdapter<Task> { private LayoutInflater mInflater; public MyAdapter(Context context, int resource, List<Task> list) { super(context, resource, list); mInflater = LayoutInflater.from(context); } @Override public View getView(int position, View convertView, ViewGroup parent) { View v = convertView; if (v == null) { v = mInflater.inflate(R.layout.list_item, null); } Task task = taskList.get(position); /* Setup views from your layout using data in Object here */ return v; } }
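
    A likely cause is the opaque background on the list (or on the row layout) hiding the default selector drawable. Below is a minimal sketch of one possible fix, written in Java; the layout name R.layout.main is illustrative, and it assumes the ListView declared above with id list_view:

    import android.app.Activity;
    import android.os.Bundle;
    import android.widget.ListView;

    public class TaskListActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main); // hypothetical layout containing the ListView above

            ListView listView = (ListView) findViewById(R.id.list_view);
            // Draw the pressed/selected state over the opaque white row background
            // so the touch highlight stays visible.
            listView.setDrawSelectorOnTop(true);
            // Use the stock highlight drawable shipped with the platform.
            listView.setSelector(android.R.drawable.list_selector_background);
        }
    }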

    Read the article

  • Detecting touch area on Android

    - by HappyAppDeveloper
    Is it possible to detect every pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points touched by the user? How can I tell the difference between when users are drawing with their thumb and when they are drawing with the tip of a finger? I would like to reflect the brush difference depending on how users touch the screen, and would also like to track x-y coordinates of all the pixels being touched over time. Thanks so much in advance for any help.
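
    Android does not expose every covered pixel, but MotionEvent reports an approximate contact size and pressure per pointer that can be sampled over time. A minimal sketch in Java; the thumb-detection threshold is an assumption and would need per-device tuning:

    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.View;

    public class BrushView extends View {
        public BrushView(Context context) {
            super(context);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            for (int i = 0; i < event.getPointerCount(); i++) {
                float x = event.getX(i);
                float y = event.getY(i);
                float size = event.getSize(i);          // normalized contact size (0..1)
                float pressure = event.getPressure(i);  // device-dependent pressure value
                // Assumption: a larger contact size suggests a thumb rather than a fingertip.
                boolean likelyThumb = size > 0.1f;
                // TODO: record (x, y, size, pressure) over time to drive the brush stroke.
            }
            return true; // keep receiving MOVE/UP events for this gesture
        }
    }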

    Read the article

  • How much market share does OpenGL ES 2.0 have on iPhone OS hardware (iPhone/iPod Touch)?

    - by Eonil
    I'm planning to make a game for the App Store, so I'm studying GLES. But the GLES 1.1 and 2.0 APIs differ in how they handle some features (and in their limitations). I don't have enough time to consider both of them, so I have to choose one. 2.0 is clearly better from a developer's point of view, but I'm worried about its market share. I hope most users have moved on to newer SGX-based hardware, but in fact, I don't know. Does anybody know where to find data on the ratio of those hardware types among iPhone OS supported devices (iPhone/iPod touch, per GPU)? Please let me know.

    Read the article

  • Get the co-ordinates of a touch event on Android

    - by Joe
    Hi, I'm new to Android. I've followed the Hello World tutorial through and have a basic idea of what's going on. I'm particularly interested in the touch screen of my T-Mobile Pulse, so just to get me started I want to be able to display the co-ordinates of a touch event on the screen; say the user touched the co-ordinate (5,2), a TextView on the screen would display that. At present I have a simple program that just loads an XML layout which contains the TextView I intend to write the co-ordinates in. Thank you in advance. I did Google for help and searched Stack Overflow, but everything I found either went way over my head or wasn't suitable for this. Cheers.
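
    A minimal sketch of the usual approach in Java: override onTouchEvent in the Activity and write the event's position into the TextView. The layout and view ids (R.layout.main, R.id.coords) are placeholders for whatever the existing XML file defines:

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.MotionEvent;
    import android.widget.TextView;

    public class MainActivity extends Activity {
        private TextView coords;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);              // existing layout with the TextView
            coords = (TextView) findViewById(R.id.coords); // hypothetical TextView id
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            // getX()/getY() report the touch position for this event.
            coords.setText("x=" + event.getX() + " y=" + event.getY());
            return true;
        }
    }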

    Read the article

  • Making interactive touch objects on Android

    - by Greenhouse_Gases
    I've never built a game before, and I've not programmed for Android before, but I am looking to do so over the summer by building a game. What type of object do I use for a shape that I want the user to be able to drag around the screen with touch gestures, for instance? How do I tie together the MotionEvent, View and Graphics2D to make objects drawn on screen that can be interacted with? I imagine this will use ActionListeners / Handlers but I'm a bit confused at this stage... A simple breakdown of steps would be much appreciated. Thanks
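
    On Android the pieces are a custom View (drawing happens in onDraw with a Canvas rather than Graphics2D), MotionEvent delivered to onTouchEvent, and invalidate() to trigger a repaint; no separate listener object is required for a custom View. A minimal sketch of a draggable shape, offered as an illustration only:

    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;
    import android.view.MotionEvent;
    import android.view.View;

    public class DraggableCircleView extends View {
        private float cx = 100, cy = 100;          // current position of the shape
        private static final float RADIUS = 40;
        private final Paint paint = new Paint();

        public DraggableCircleView(Context context) {
            super(context);
            paint.setColor(Color.BLUE);
            paint.setAntiAlias(true);
        }

        @Override
        protected void onDraw(Canvas canvas) {
            canvas.drawCircle(cx, cy, RADIUS, paint);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            switch (event.getAction()) {
                case MotionEvent.ACTION_DOWN:
                case MotionEvent.ACTION_MOVE:
                    cx = event.getX();
                    cy = event.getY();
                    invalidate();   // schedule a redraw at the new position
                    return true;
            }
            return super.onTouchEvent(event);
        }
    }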

    Read the article

  • Migrating Windows XP BOOT.INI Settings to Windows 7 Boot-loader

    - by Synetech inc.
    Two months ago my motherboard died, so I bought a used computer that came with Windows 7. I have since installed my old hard-drive, which had Windows XP on it, in this system. What I am trying to do now is to figure out a way to migrate the settings from XP's BOOT.INI into 7's boot-loader. Below is the BOOT.INI I used in XP (I have reduced the strings and updated the disks to point to the new location of the old HD. Oh and I am not clear on the drive letters. In XP, I could boot the recovery console or MS-DOS from a file in C:\ that contains the boot-sector. I am not sure what drive letter it would be called now—I had to manually change all the drive letters of the old partitions in Windows 7 because it auto-assigned them all wrong/differently). [boot loader] timeout=10 default=multi(0)disk(0)rdisk(1)partition(1)\WINDOWS [operating systems] multi(0)disk(0)rdisk(1)partition(1)\WINDOWS="XP" /fastdetect multi(0)disk(0)rdisk(1)partition(1)\WINDOWS="XP (Safe)" /safeboot:network /sos /bootlog /noguiboot C:\CMDCONS\BOOTSECT.DAT="Recovery Console" /cmdcons C:\BOOTSECT.DOS="MS-DOS 7.10" /win95 I have looked around, and have only been able to find some bcdedit commands to add XP to the boot-loader, but none that include information on setting safe-mode for it (or changing any of the XP load options for that matter). Not surprisingly I suppose, I have not found anything on adding the XP recovery console or DOS to the Windows 7 boot-loader. (Yes, I tried EasyBCD, but that did not help; it had no options for XP, and the best I managed was to get a choice of booting 7 or normal-mode XP—choosing XP didn't even give the old XP boot menu.) Can anyone please tell me how to export the entries in XP's boot.ini to 7's boot-loader so that on boot, I can choose to load the following: Windows 7 Windows 7 (Safe-mode) (Windows 7 (The Win7 counterpart of the Recovery Console)) Windows XP Windows XP (Safe-mode) Windows XP (Recovery Console) MS-DOS 7.10

    Read the article

  • Google and Blink turn their backs on the W3C and Microsoft's Pointer Events in favor of Apple's Touch Events?

    Google and Blink turn their backs on the W3C and Microsoft's Pointer Events in favor of Apple's Touch Events? Google and its web rendering engine Blink have finally ruled against the W3C standard: in a brief announcement on Blink's development platform, Google has just announced that it is abandoning the Pointer Events API, until now presented as the future W3C standard intended to replace Touch Events. As a reminder, Blink is the fork of the well-known WebKit rendering engine, currently...

    Read the article

  • Ubuntu 10.04 LTS - Dual monitor works only sometimes (ATI multi-desktop)

    - by Beres Botond
    I've been using my laptop with an external LCD monitor attached to it at work (Philips 201E), and at home with a different external monitor (Samsung 2032BW). I have an ATI graphics card (HD3450) with the ATI Catalyst drivers enabled, and I'm using the Single display desktop (Multi-Desktop) setting. At work I have the external monitor on the left and the laptop on the right, while at home it is the other way around. So when I switch between the two setups, I just need to go to the ATI Catalyst Control Center, change the order of the displays, change the resolution (Home - 1680x1050, Work - 1440x900), reboot, and it is all fine. But for a while now it hasn't worked properly anymore: At home it still works fine. At work it doesn't work. Sometimes it works for some reason, after a few resolution/setting changes in ACCC and reboots... it's very strange and annoying. With the home monitor I can see the whole bootup process on both monitors (laptop + LCD) and it always just works fine. With the work monitor, the external LCD just shows "No video input" until I get to the login screen, then it shows up there as well. But after login it will either: Flicker a few times, but then work OK. Or (more often) Flicker once and then go back to "No video input" again. I usually end up rebooting a few times until it works. Does anyone have any idea how to fix it? This is my xorg.conf currently: Section "ServerLayout" Identifier "amdcccle Layout" Screen 0 "amdcccle-Screen[6]-0" 0 0 Screen "amdcccle-Screen[6]-1" 1280 0 EndSection Section "Files" EndSection Section "Module" Load "glx" EndSection Section "ServerFlags" Option "Xinerama" "off" EndSection Section "Monitor" Identifier "0-LVDS" Option "VendorName" "ATI Proprietary Driver" Option "ModelName" "Generic Autodetecting Monitor" Option "DPMS" "true" Option "PreferredMode" "1280x768" Option "TargetRefresh" "60" Option "Position" "0 0" Option "Rotate" "normal" Option "Disable" "false" EndSection Section "Monitor" Identifier "0-CRT1" Option "VendorName" "ATI Proprietary Driver" Option "ModelName" "Generic Autodetecting Monitor" Option "DPMS" "true" Option "TargetRefresh" "60" Option "Position" "0 0" Option "Rotate" "normal" Option "Disable" "false" Option "PreferredMode" "1440x900" EndSection Section "Device" Identifier "Default Device" Driver "fglrx" EndSection Section "Device" Identifier "amdcccle-Device[6]-0" Driver "fglrx" Option "Monitor-LVDS" "0-LVDS" BusID "PCI:6:0:0" EndSection Section "Device" Identifier "amdcccle-Device[6]-1" Driver "fglrx" Option "Monitor-CRT1" "0-CRT1" BusID "PCI:6:0:0" Screen 1 EndSection Section "Screen" Identifier "Default Screen" DefaultDepth 24 SubSection "Display" Virtual 2560 1024 EndSubSection EndSection Section "Screen" Identifier "amdcccle-Screen[6]-0" Device "amdcccle-Device[6]-0" DefaultDepth 24 SubSection "Display" Viewport 0 0 Depth 24 EndSubSection EndSection Section "Screen" Identifier "amdcccle-Screen[6]-1" Device "amdcccle-Device[6]-1" DefaultDepth 24 SubSection "Display" Viewport 0 0 Depth 24 EndSubSection EndSection

    Read the article

  • SOCKS proxy on Mac's shared internet

    - by AliBZ
    Hi all, I use my Mac's Internet sharing to create a wireless network for my iPod touch. I have a Linux server and I use a SOCKS proxy. I want to use this proxy on my iPod but I don't know how. I put my shared network connection behind the proxy with a localhost IP, but my iPod isn't behind the proxy. Any ideas?

    Read the article

  • Using Foobar to manage an iPod

    - by codeulike
    I see there are at least one or two add-ons for Foobar that let you use it to manage music on an iPod. Which would you recommend? (I'm interested in linking to the iPod Nano 5th Gen, and maybe also the iPod Touch 2nd Gen, but that's not so important.)

    Read the article

  • iPod notification alarm

    - by monkies
    Is there an app that can turn a notification into an alarm? With my iPod Touch I don't always hear notifications, and if I don't hear one the first time it will not remind me again. Is there any way to have the notifications bug me until I turn them off? Something like the alarm on the standard iPod.

    Read the article

  • iTunes Over the Air Sync

    - by aceinthehole
    Is there any software or hack in existence that will allow iTunes to sync wirelessly with my iPhone or iPod touch? I'd like the iPhone to be constantly synced over the 802.11 connection without having to plug it into the USB port on my computer, or even better, I would like it to happen over 3G when I am not at home. I'd heard that it might be possible (albeit slow) but have not been able to find any software or specific steps anywhere that let you do it.

    Read the article

  • iPhone/iPod App Desktop Emulator

    - by Bill Sevil
    I want to sell my iPod Touch and buy an Android-based phone. However, there are some apps that I have paid for that are only available from the iTunes App Store and have no good alternative on Android systems (e.g. language learning apps with thousands of words, "references" applications). Is there a program to emulate the apps that I have already purchased (the ones in my \iTunes\iTunes Media\Mobile Applications directory) and play them on my desktop?

    Read the article

  • Cancel UITouch Events When View Covered By Modal UIViewController

    - by kkrizka
    Hi there, I am writing an application where the user has to move some stuff on the screen using his fingers and drop them. To do this, I am using the touchesBegan,touchesEnded... function of each view that has to be moved. The problem is that sometimes the views are covered by a view displayed using the [UIViewController presentModalViewController] function. As soon as that happens, the UIView that I was moving stops receiving the touch events, since it was covered up. But there is no event telling me that it stopped receiving the events, so I can reset the state of the moved view. The following is an example that demonstrates this. The functions are part of a UIView that is being shown in the main window. It listens to touch events and when I drag the finger for some distance, it presents a modal view that covers everything. In the Run Log, it prints what touch events are received. - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchesBegan"); touchStart=[[touches anyObject] locationInView:self]; } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { CGPoint touchAt=[[touches anyObject] locationInView:self]; float xx=(touchAt.x-touchStart.x)*(touchAt.x-touchStart.x); float yy=(touchAt.y-touchStart.y)*(touchAt.y-touchStart.y); float rr=xx+yy; NSLog(@"touchesMoved %f",rr); if(rr > 100) { NSLog(@"Show modal"); [viewController presentModalViewController:[UIViewController new] animated:NO]; } } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchesEnded"); } - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchesCancelled"); } But when I test the application and trigger the modal dialog to be displayed, the following is the output in the Run Log. [Session started at 2010-03-27 16:17:14 -0700.] 2010-03-27 16:17:18.831 modelTouchCancel[2594:207] touchesBegan 2010-03-27 16:17:19.485 modelTouchCancel[2594:207] touchesMoved 2.000000 2010-03-27 16:17:19.504 modelTouchCancel[2594:207] touchesMoved 4.000000 2010-03-27 16:17:19.523 modelTouchCancel[2594:207] touchesMoved 16.000000 2010-03-27 16:17:19.538 modelTouchCancel[2594:207] touchesMoved 26.000000 2010-03-27 16:17:19.596 modelTouchCancel[2594:207] touchesMoved 68.000000 2010-03-27 16:17:19.624 modelTouchCancel[2594:207] touchesMoved 85.000000 2010-03-27 16:17:19.640 modelTouchCancel[2594:207] touchesMoved 125.000000 2010-03-27 16:17:19.641 modelTouchCancel[2594:207] Show modal Any suggestions on how to reset the state of a UIView when its touch events are interrupted by a modal view?

    Read the article

  • Where did all the templates in Xcode go?

    - by Ayrad
    After the recent Xcode update I seem to have lost all the templates for Cocoa Touch and iPhone. Under Cocoa Touch Classes in the new file dialog, I only have 3 choices: Objective-C class, Objective-C test case class and UIViewController subclass. Where did the others go? UITableView, UINavigation, etc.? I am running Xcode 3.1.3. Thanks!

    Read the article

  • Create a screen like the iPhone home screen with Scrollview and Buttons

    - by Anthony Chan
    Hi, I'm working on a project and need to create a screen similar to the iPhone home screen: a scrollview with multiple pages; a bunch of icons; when not in edit mode, swipe through different pages (even if I started the touch on an icon); when not in edit mode, tap an icon to do something; when in edit mode, drag an icon to swap places, and even swap to different pages; when in edit mode, tap an icon to remove it. Previously I read on several forums that I have to subclass UIScrollView in order to have touch input for the UIViews on top of it. So I subclassed it, overriding the methods to handle touches: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { //If not dragging, send event to next responder if (!self.dragging) [self.nextResponder touchesBegan:touches withEvent:event]; else [super touchesBegan:touches withEvent:event]; } I've overridden the touchesBegan:, touchesMoved: and touchesEnded: methods similarly. Then in the view controller, I added the following code: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; UIView *hitView = (UIView *)touch.view; if ([hitView isKindOfClass:[UIView class]]) { [hitView doSomething]; NSLog(@"touchesBegan"); } } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { // Some code to move the icons NSLog(@"touchesMoved"); } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchesEnded"); } When I run the app, the touchesBegan method fires correctly. However, when I try to drag an icon, the icon just moves a tiny bit and then the page starts to scroll. The console logs only 2 or 3 "touchesMoved" messages. However, I learned from another project that it should log tonnes of "touchesMoved" messages as long as I'm still dragging on the screen. (I suspect I have delaysContentTouches set to YES, so it delays a little bit when I try to drag the icons. After that minor delay, it sends the signal back to the scrollview to scroll through the page. Please correct me if I'm wrong.) Any help with the code to perform the above tasks would be greatly appreciated. I've been stuck here for nearly a week with no hope. Thanks a lot.

    Read the article

  • Detecting Touches in an OpenGL rendered scene

    - by Icky
    Hey. I was wondering whether there is a way to detect a touch in an OpenGL-rendered scene. What I have is a set of images which are being rendered in my main view. Now if the user touches one of these images (or objects), I would like to know which one was touched - similar to the CGRectContainsPoint(frame, [touch locationInView:self.view]) method. Is there an easy way to find out? If there is none, knowing that would also help.

    Read the article

  • Looking for marg_setValue in UIKit

    - by John Smith
    I am trying to compile a library originally written for Cocoa. Things are good until it looks for the function marg_setValue(). It says it can't find it. I have googled and found the header where it is defined. How can I use this header in Cocoa Touch? Or does Cocoa Touch not support the Objective-C runtime?

    Read the article

  • How to use a store defined in the store folder

    - by Kevin Morfin
    I'm new to Sencha Touch. I was wondering how to properly use the file structure in Sencha Touch. For example, under the app folder there are your controller, model, profile, store, and view folders. If I define a store, for example, under the store folder I create a file named search.js: Ext.define('Volunteer.store.search', { extend: 'Ext.data.Store', requires: ['Volunteer.model.person'], config: { model: 'Volunteer.model.person' } }); How do I use this store in a different file?

    Read the article

  • Xcode: How to change an image when touching an image (same position)?

    - by Markus S.
    Simply, I have placed an image in a UIView with Interface Builder. Now I want to run a method/function when I touch that image, for example to load another image. My code: - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { /* UITouch *touch = [touches anyObject]; */ bild_1_1.image = [UIImage imageNamed:@"t_1_4.jpg"]; }

    Read the article

  • Portable scripting language for a multi-server admin?

    - by Aaron
    Please Note: Portable as in portableapps.com, not the traditional definition. Originally posted on stackoverflow.com, asking here at another user's suggestion. I'm a DBA and sysadmin, mostly for Windows machines running SQL Server. I'm looking for a programming/scripting language for Windows that doesn't require Admin access or an installer, needing no install process other than expanding it into a folder. My intent is to have a language for automation on which I can standardize. Up to this point, I've been using a combination of batch files and Unix shell, using sh.exe from UnxUtils, but it's far from a perfect solution. I've evaluated a handful of options, and all of them have at least one serious shortcoming or another. I have a strong preference for something open source or dual license, but I'm more interested in finding the right tool than anything else. I'm not interested in anything that relies on Cygwin or Java, but at this point I'd be fine with something that needs .NET. Requirements: manageable footprint (1-100 files, under 30 MB installed); runs on Windows XP and Server (2003+); no installer (exe, msi); works with external pipes, processes, and files; support for MS SQL Server or ODBC connections. Bonus points: open source; FFI for calling functions in native DLLs; GUI support (native or gtk, wx, fltk, etc.); Linux, AIX, and/or OS X support; dynamic, object-oriented and/or functional, interpreted or bytecode-compiled, with interactive development; able to package or compile scripts into executables. So far I've tried: Ruby (148 MB on disk, 23000 files); Portable Python (54 MB on disk, 2800 files); Strawberry Perl (123 MB on disk, 3600 files); REBOL (great, except closed source and no MSSQL or ODBC in the free version); Squeak Smalltalk (great, except poor support for scripting). ---- cut: points of clarification ---- Why all the limitations? I realize some of my criteria seem arbitrarily confining. It's primarily a product of my environment. I work as a SQL Server DBA and backup Unix admin at a division of a large company. In addition to near a hundred boxes running some version or another of SQL Server on Windows, I also support the SQL Server Express Edition installs on over a thousand machines in the field. Because of our security policies, I don't have login rights on every machine. Often enough, an issue comes up and I'm given local Admin for some period of time. Often enough, it's some box I've never touched and don't have my own environment set up on yet. I may have temporary admin rights on the box, but I'm not the admin for the machine - I'm just the DBA. I've no interest in stepping on the toes of the Windows admins, nor do I want to take over any of their duties. If I bring up "installing" something, suddenly it becomes a matter of interest for Production Control and the Windows admins; if I'm copying up a script, no one minds. The distinction may not mean much to the readers, but if someone gets the wrong idea I've suddenly got a long wait and significant overhead before I can get the tool installed and get the problem solved. That's why I want something that can be copied and run in the manner of a portable app. What about the small footprint? My company has three divisions, each in a different geographical location, and one of them is a new acquisition. We have different production control/security policies in each division. I support our MSSQL databases in all three divisions. The field machines are spread around the US, sometimes connecting to the VPN over very slow links. Installing Ruby using psexec has taken a long time over these connections. In these instances, the bigger time waster seems to be archives with thousands and thousands of files rather than their sheer size. You could say I'm spoiled by Unix, where the admins usually have at least some modern scripting language installed; I'd use PowerShell, but I don't know it well and more importantly it isn't everywhere I need to work. It's a regular occurrence that I need to write, deploy and execute some script on short notice on some machine I've never logged in on. Since having Ruby or something similar installed on every machine I'll ever need to touch is effectively impossible because of the approvals, time, and Windows admin labor needed, it makes more sense to find a solution that allows me to work on my own terms.

    Read the article

  • Blackberry storm 9530 tracing touch events while scrolling

    - by SWATI
    Hey, on my screen there is an edit field and 2 custom button fields, "OK" and "CANCEL". Below the button fields there are some more focusable label fields. When I write a name in the edit field and press Enter, the focus moves to the "OK" button, but how do I set focus on the "CANCEL" button? Moreover, while scrolling, the focus does not automatically move ahead. What should I do? Maybe I am confused about touch events and their handling. Kindly help! Code: txt_Name = new EditField(TextField.NO_NEWLINE) { public void paint(net.rim.device.api.ui.Graphics g) { g.setColor(Color.MAROON); super.paint(g); } }; txt_Name.setFont(font); v1 = new VerticalFieldManager(); v1.add(txt_Name); ButtonField btn1 = new ButtonField("OK",ButtonField.CONSUME_CLICK); ButtonField btn2 = new ButtonField("CANCEL",ButtonField.CONSUME_CLICK); v2 = new VerticalFieldManager(); v2.add(btn1); v2.add(btn2); LabelField l1 = new LabelField("Hello Moon ",Field.FOCUSABLE); LabelField l2 = new LabelField("Hello Citizen",Field.FOCUSABLE); LabelField l3 = new LabelField("Hello People",Field.FOCUSABLE); LabelField l4 = new LabelField("Hello world",Field.FOCUSABLE); v3 = new VerticalFieldManager(); v3.add(l1); v3.add(l2); v3.add(l3); v3.add(l4); add(v1); add(v2); add(v3); } protected boolean navigationClick(int status, int time) { if(btn1.isFocus()) { //execute some code return true; } if(btn2.isFocus()) { //execute some code return true; } return super.navigationClick(status, time); }
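
    A minimal sketch of one way to move the focus programmatically with the BlackBerry UI API, assuming the snippet above runs inside a Screen subclass and that txt_Name, btn1 and btn2 are instance fields; Field.setFocus() is the call that does the work:

    // Sketch only: push focus to the CANCEL button (btn2) when the user presses
    // Enter while the edit field has focus, instead of letting it land on OK.
    protected boolean keyChar(char key, int status, int time) {
        if (key == net.rim.device.api.system.Characters.ENTER && txt_Name.isFocus()) {
            btn2.setFocus();   // Field.setFocus() moves focus to this field
            return true;       // consume the key so default navigation doesn't run
        }
        return super.keyChar(key, status, time);
    }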

    Read the article
