Search Results

Search found 176 results on 8 pages for 'touchscreen'.

Page 3/8 | < Previous Page | 1 2 3 4 5 6 7 8  | Next Page >

  • Change the on-screen keyboard key size?

    - by stukelly
    Is it possible to change the key size on the on-screen keyboard included with Windows XP? I find it really difficult to use on a touch screen when there is no physical keyboard available. I know you can change the font size, but this does not change the key size.

    Read the article

  • How do I enable in-place tablet pc input panel for non tablet PCs?

    - by yngvedh
    Hi all, Is it possible to enable the in-place Tablet PC Input Panel on a non-tablet PC? I have checked the "For tablet pen input, show the Input Panel icon next to the text entry area when possible" checkbox in the Input Panel options. Does this not work because pen input is something different from mouse input? I do have a touch screen, but it just emulates a mouse (moving the cursor, pressing the left mouse button and so on). I can get the Input Panel to show manually by starting tabtip.exe, and then even the ink works, but I cannot get it to show (itself or its in-place icon) when I activate text input controls. Does anyone know what's up?

    Read the article

  • Anti-glare filter for a touch-enabled netbook?

    - by chris
    Is it possible to put an anti-glare filter on a touch-enabled netbook without disabling the touch functionality? If yes, would it also be possible to remove the filter without damaging the screen? In my case it's an Acer 1420p (the PDC edition). It's a good machine, but unfortunately you can also use it as a mirror.

    Read the article

  • Touch Screen Ubuntu 10.04LTS

    - by WalterJ89
    I'm trying to get a touch screen working with Ubuntu 10.04 LTS (64-bit). It is a serial touchscreen, connected at /dev/ttyS0; I know the connection works because I get garbage in the terminal when I enable it. Under Windows XP the screen used a 3M driver (I believe). My knowledge of Linux is passive, so I generally pick up things as I need them. I came across a lot of tutorials (many of them somewhat outdated), but I'm still at a loss to get this working. I'm not sure where Linux drivers are supposed to go (/usr/? /dev/?); most tutorials skip over that part. I have tried editing /etc/X11/xorg.conf, unsuccessfully; I'm not sure what the syntax there is supposed to be. Thank you.

    Read the article

  • How can I get Windows 8 to automatically disable touch when I am using my Wacom pen and turn it back on when I am not?

    - by Robert
    I have an HP convertible tablet computer which I just upgraded to Windows 8. The problem (which existed under Windows 7 as well) is that this tablet has both a capacitive touch screen (with multi-touch) AND a Wacom-type tablet built into the screen that works using electro-magnetic resonance with the provided stylus.

    My use case: Most of the time I am happy using my fingers and the touch interface for navigation and whatnot. However, when I want to get down to serious note-taking/drawing, I want to use the Wacom functionality. The problem is that any comfortable writing position has me resting my arm/hand on the screen, which activates the touch technology (despite supposed palm-detection algorithms) and completely screws up my input paradigm.

    My ideal solution: Ideally, since Wacom technology senses when the pen is close to the screen, I would love to have touch be automatically disabled whenever the Wacom pen is detected and turned back on when it is out of range. This would allow me to switch seamlessly between the two input methods, and since I NEVER want to use both at once, it would work perfectly for me.

    An acceptable alternative: As a next-best option, it would be great to be able to turn off the touch functionality (leaving the Wacom in place) whenever I enter specific apps (e.g. OneNote, Photoshop, Gimp, Pencil, etc.) and then have it turn back on when I leave that app.

    A worst case that at least lets me use my PC: If I could create a shortcut (tile or otherwise) that flips touch on and off without going all the way through the nested computer settings, that would be better than nothing.

    Thanks in advance for the help with one or more of the above.

    Read the article

  • Dual Touchscreens in Different Rooms

    - by Ash
    I'm planning on having one dual-core system running two touchscreens, each in its own room in the house. I'd like to be able to use the internet on one, while someone else uses the other to record music - each of us interacting with the screen as we would separate computers. I was also thinking that running each screen, or certain programs, on its own core might make this work more smoothly. Will this setup work in Windows 7 Home on a mini-tower or do I need to invest in a server to get this sort of workstation/terminal setup to work?

    Read the article

  • What cable would be used for a touch screen

    - by George Bailey
    I was told that any monitor could be turned into a touch screen if you have the right software. This has got to be old news, or even a myth. Please shed some light on this if you can. Am I wrong? My primary question is which cable would normally be used to hook up a touch screen. Would you need two cables (one for the monitor, one to receive touch events), or does a single cable carry data in both directions (perhaps HDMI)?

    Read the article

  • How can I tell which laptop touch-screens work well with a stylus (for drawing/taking notes)?

    - by BlueRaja
    I'm looking for a laptop with a touch-screen and stylus for drawing/note-taking. I've read the difference between the different kinds of styluses, but that's only half the story - what about the touch-screen? How do I know if the touch-screen supports "palm-rejection"? Or if the included stylus is a capacitive stylus or a "Wacom digitizer"? Or if the screen will even support Wacom? How can I tell how accurate the touch-screen is (from my testing, some definitely seem to have higher "resolution" than others)? Is there anything else I should be looking at? I don't see any of this information on, for instance, the Newegg specs page for a laptop.

    Read the article

  • Wireless touch screen monitor

    - by delux247
    Does anyone make a wireless touch screen monitor that could be used kind of like a tablet PC? Basically I want something that could sit on my lap and allow me to view and control a nearby PC. UPDATE: Does anyone know of a touch screen tablet PC without a keyboard, so it's just a screen?

    Read the article

  • Control Windows 8 with a tablet

    - by Frantumn
    It seems much of Windows 8 is based on the idea that it will be running on tablets and touch screen PCs. I like this, but I don't have a W8 tablet or a touch PC yet. I'm running W8 on a laptop, and am wondering if there's any way of using my iPad 2 as a touch interface with Metro. Another option that'd be nice is an input device similar to Apple's "Magic Trackpad" that would allow me to use hand gestures instead of a mouse cursor in Metro apps. I've seen some cool videos of MS Smart Glass, and it would seem that the capabilities are there, but it may just be too early on to do this? I'm not sure.

    Read the article

  • Touch screens for kiosk applications

    - by Micah
    I'm developing a kiosk-style touchscreen application in Qt. Currently I'm using an Elo Touch surface acoustic wave touchmonitor which works well except for one thing: drag performance is way too poor to provide a good user experience. As this is the case for the cursor in X as well as in my application, it seems to be either the fault of X (probably not) or the touchmonitor. Since mobile platforms are able to achieve very high performance in this regard, it seems like it should be possible for vastly more powerful desktop systems. Does anybody have experience with getting good drag performance out of desktop touchmonitors? What hardware have you used? Is X to blame?

    Read the article

  • Is there a way to simulate gestures for non-touchscreen users?

    - by m-y
    How are users able to actually use features such as pinching if they don't have touchscreen monitors? I ask because I plan to use Windows 8 on my desktop, and I want to get full use out of any applications I download. I also want to ensure that if I ever release an application I develop to the Windows Store, there is a way for my users to get around this situation (no touchscreen for multitouch events).

    Read the article

  • `make install` fails apparently due to typo, but not in makefile: How to find and fix?

    - by Archelon
    I'm trying to install the fujitsu-usb-touchscreen drivers from here, on Kubuntu 12.04 on my new Fujitsu LifeBook P1630. (See fujitsu-usb-touchscreen on kubuntu 13.04 (64-bit) on P1630: `make` errors.) I downloaded the .zip file, unzipped it, and ran make in the directory thus created; this all worked as expected. However, when I run sudo checkinstall (which invokes make install), things go less well. On the first attempt the installation aborted with the following error:

        make: execvp: /etc/init.d/fujitsu_touchscreen: Permission denied
        make: *** [install] Error 127

    I eventually resolved this with:

        $ sudo chmod +x /etc/init.d/fujitsu_touchscreen

    Although a second sudo checkinstall then does not give the execvp error, it still fails at a later stage, and the log (on stdout) shows this dpkg error:

        dpkg: error processing /home/archelon/fujitsu-touchscreen-driver/cybergene-fujitsu-usb-touchscreen-112fdb75b406/cybergene-fujitsu-usb-touchscreen-112fdb75b406_amd64.deb (--install):
        unable to create `/sys/module/fujitsu/usb/touchscreen/parameters/touch_maxy.dpkg-new' (while processing `/sys/module/fujitsu/usb/touchscreen/parameters/touch_maxy'): No such file or directory

    And, indeed, there is no /sys/module/fujitsu/usb/touchscreen/parameters/touch_maxy; there is, however, /sys/module/fujitsu_usb_touchscreen/parameters/touch_maxy, and this is presumably what was intended. But the incorrect filename does not appear in the makefile or in any other file in the directory, at least not that I can find. Nor does it appear, as I discovered after running sudo checkinstall --install=no as suggested below, in the .deb package created by checkinstall. Where might such a typographical error originate, and how would I go about fixing it?

    Edited to add: I'm viewing the contents of the .deb file with Ark, Kubuntu's default tool. It contains only three files: control.tar.gz, data.tar.gz, and debian-binary. data.tar.gz contains a directory tree that appears to match the usual root filesystem, with /etc, /lib, /sys, and /usr directories. (Looking at other .deb files on my system, this structure appears to be typical.) One screenshot shows this directory tree; another shows that control.tar.gz contains three files, one of which is empty. Here is the actual .deb file: https://www.dropbox.com/s/odwxxez0fhyvg7a/cybergene-fujitsu-usb-touchscreen_112fdb75b406-1_amd64.deb

    Edited 2013-09-28 to add: After reinstalling Kubuntu 12.04 again, this time recreating the /home partition (which, again, had been generated during an install of 13.04), I can no longer reproduce this error. I am still curious to know how the underscores got changed to slashes, but it looks as though nobody has any idea. It is perhaps also of interest that while I have still not successfully run checkinstall against this package, I have done make install; it requires making /etc/init.d/fujitsu_touchscreen executable and installing hal, the GUI freezes shortly after installation completes, there is no particular new functionality that I have noticed afterwards, and the system can no longer resume from suspend. However, this will be pursued elsewhere.

    Read the article

  • UISegmentedControl tint color on touch

    - by gotye
    Hey everyone, I have a UISegmentedControl in my app (see code below):

        // --------------- SETTING NAVIGATION BAR RIGHT BUTTONS
        NSArray *segControlItems = [NSArray arrayWithObjects:
                                       [UIImage imageNamed:@"up.png"],
                                       [UIImage imageNamed:@"down.png"], nil];
        segControl = [[UISegmentedControl alloc] initWithItems:segControlItems];
        segControl.segmentedControlStyle = UISegmentedControlStyleBar;
        segControl.momentary = YES;
        segControl.frame = CGRectMake(25.0, 7, 65.0, 30.0);
        segControl.tintColor = [UIColor blackColor];
        [segControl addTarget:self action:@selector(segAction:) forControlEvents:UIControlEventValueChanged];
        if (current == 0)
            [segControl setEnabled:NO forSegmentAtIndex:0];
        if (current == ([news count]-1))
            [segControl setEnabled:NO forSegmentAtIndex:1];
        // ---------------

    But I can't get it to show anything when you click on it... It functionally works perfectly, but I would like it to tint to gray when you click on it (just while you click). Would that be possible? Thank you, Gotye.

    Read the article

  • Touch friendly GUI in Windows Mobile

    - by vonolsson
    I'm porting an audio processing application written in C++ from Windows to Windows Mobile (version 5+). Basically what I need to port is the GUI. The application is quite complicated and the GUI will need to offer a lot of functionality. I would like to create a touch-friendly user interface that also looks good, which basically means that standard WinMo controls are out the window. I've looked at libraries such as Fluid and they look like something I would like to use. However, as I said, I'm developing in C++. Even though it would be possible to write only the GUI part in some .NET language, I'd rather not; my experience with .NET on Windows Mobile is that it doesn't work very well... Can anyone suggest either a C/C++ touch-friendly GUI library for Windows Mobile, or some kind of "best practices" document/how-to on using the standard Windows Mobile controls to make them touch friendly while still working and looking well in later versions of Windows Mobile (in particular 6.5)?

    Read the article

  • A problem of trying to implement scrolling inertia with jQuery

    - by gargantaun
    I'm trying to add some iPhone-style scrolling inertia to a web page that will only be viewed on the iPad. I have the scrolling working in one direction (scrollLeft), but it doesn't work in the other direction. It's a pretty simple function:

        function onTouchEnd(event){
            event.preventDefault();
            inertia = (oldMoveX - touchMoveX); // Inertia Stuff
            if( Math.abs(inertia) > 10 ){
                $("#feedback").html(inertia);
                $("#container").animate({
                    'scrollLeft': $("#container").scrollLeft() + (inertia * 10)
                }, inertia * 20); // note: a negative inertia gives a negative duration here
            }else{
                $("#feedback").html("No Inertia");
            }
        }

    I've bound it to the 'touchend' event on the body. The inertia is the difference between the old moveX position and the latest moveX position when a touch ends. I then try to animate the scrollLeft property of a div that contains a bunch of thumbnails. As I've said, this works when scrolling to the left, but not when scrolling to the right. You can view the full source code (all in one page) or test it on your iPhone or iPad (or in the simulator) here: http://www.appliedworks.co.uk/files/times/swipegal.html Any ideas?

    Read the article

  • Technology to communicate with someone with expressive aphasia?

    - by rascher
    A family member had a stroke a few years back and now has expressive aphasia. She understands what is said to her and is cognizant of what is going on, but cannot express herself. She is able to respond to yes/no questions (do you want to go shopping? are you looking for your earrings?). She is not, however, able to read (English is not her native language and she hasn't read Hindi for decades). I am the technologist in the family, and I intend to come up with something to help us communicate. The idea is to have some sort of picture book where she can point to what she wants.

    My first question: does assistive technology for people with expressive aphasia already exist? This could be a hardware or software device. If not, such software doesn't seem difficult to write. My initial thought is an interface with pictures, maybe separated by category (food, shopping), where she can point to an individual picture to indicate what she needs. We could easily add more items with such software, and we could have an interface where she (or we) could "flip pages". Which suggests that the best solution would use a touch screen rather than a mouse; it would be really difficult to train her to aim a mouse or find keys on a keyboard.

    We're thinking of maybe getting a tablet and writing some basic software, but tablet computers are expensive and fragile; I'm not sure one would stand up to spills or being knocked about in a nursing home. So my next question: what kind of tablet-like devices are out there that I can program on? I don't know anything about hardware, but if there is something suitable we could special-order it. What would be safe and durable for such a project? We could do something on an iPod or cell phone, but I feel that interface would be too small.

    Finally, does anyone here have experience with this kind of assistive technology? Are there things I might not anticipate when designing such a system?

    Edit: I've added a (pretty hefty!) bounty. I'd like to open this question up to any suggestions, comments, and experiences that people might have. This is a real and important project, so while we are working on a solution, any insights would be particularly helpful. Right now the plan is to mount a screen in her room. We'll either teach her to use a trackball or use a touch-screen panel, after seeing what she is able to use with a simple prototype. Then software akin to an old "hypercard" deck:

        ----------------------------------------------------------------
        |  --------------    --------------                            |
        |  |  Clothes   |    |   Food     |   ...                      |
        |  --------------    --------------                            |
        |                                                              |
        |  Pic of item 1    Pic of item 2    Pic of item 3             |
        |                                                              |
        |  Pic of item 4    Pic of item 5    Pic of item 6             |
        |                                                              |
        |  <-Back                                          Next->      |
        ----------------------------------------------------------------
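
    To make the mockup concrete, here is a minimal sketch of the picture board in Java Swing, purely for illustration; the category names, item placeholders, and paging buttons come from the mockup, and a real version would load images and wire up the button actions:

        import javax.swing.*;
        import java.awt.*;

        // Minimal "hypercard deck" picture board: a row of category buttons,
        // a 2x3 grid of large picture buttons, and Back/Next paging controls.
        public class PictureBoard {
            public static void main(String[] args) {
                JFrame frame = new JFrame("Picture Board");
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

                JPanel categories = new JPanel(new FlowLayout(FlowLayout.LEFT));
                for (String c : new String[] {"Clothes", "Food"}) {
                    categories.add(new JButton(c)); // would switch the visible page
                }

                JPanel items = new JPanel(new GridLayout(2, 3, 10, 10));
                for (int i = 1; i <= 6; i++) {
                    JButton b = new JButton("Pic of item " + i);
                    b.setFont(b.getFont().deriveFont(24f)); // big targets for touch
                    items.add(b);
                }

                JPanel paging = new JPanel(new BorderLayout());
                paging.add(new JButton("<- Back"), BorderLayout.WEST);
                paging.add(new JButton("Next ->"), BorderLayout.EAST);

                frame.setLayout(new BorderLayout());
                frame.add(categories, BorderLayout.NORTH);
                frame.add(items, BorderLayout.CENTER);
                frame.add(paging, BorderLayout.SOUTH);
                frame.setSize(800, 600);
                frame.setVisible(true);
            }
        }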

    Read the article

  • Integrating virtual keyboard on a HP TouchSmart with an Adobe AIR app

    - by Alan
    Hi, does anyone know if it's possible to integrate the TouchSmart's virtual keyboard with an Adobe AIR application? In most programs (Internet Explorer, Firefox, etc.), when a user touches a text field, a little keyboard icon automatically pops up which, when pressed, brings up the virtual keyboard. However, this doesn't happen when clicking on text input fields in Adobe AIR applications. Has anyone had any experience working with AIR/Flash and touchscreens? Is there any API that can tell Windows (or the HP virtual keyboard specifically) that the user has clicked in a text field and that the virtual keyboard should be shown? The text fields are the standard kind (fl.controls.TextInput). Any suggestions would be greatly appreciated. Thanks in advance!

    Read the article

  • Can iPad/iPhone Touch Points be Wrong Due to Calibration?

    - by Kristopher Johnson
    I have an iPad application that uses the whole screen (that is, UIStatusBarHidden is set true in the Info.plist file). The main window's frame is set to (0, 0, 768, 1024), as is the main view in that frame. The main view has multitouch enabled. The view has code to handle touches:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                CGPoint location = [touch locationInView:nil];
                NSLog(@"touchesMoved at location %@", NSStringFromCGPoint(location));
            }
        }

    When I run the app in the simulator, it works pretty much as expected. As I move the mouse from one edge of the screen to the other, reported X values go from 0 to 767. Reported Y values go from 20 to 1023, but it is a known issue that the simulator doesn't report touches in the top 20 pixels of the screen, even when there is no status bar. Here's what's weird: when I run the app on an actual iPad, the X values go from 0 to 767 as expected, but reported Y values go from -6 to 1017. The fact that it seems to work properly on the simulator leads me to suspect that real devices' touchscreens are not perfectly calibrated, and mine is simply reporting values six pixels too low. Can anyone verify that this is the case? Otherwise, is there anything else that could account for the Y values being six pixels off from what I expect? (In a few days, I should have a second iPad, so I can test this with another device and compare the results.)

    Read the article

  • Detect if touch device

    - by Kilnr
    Hello, I'm writing a MIDlet using the Kuix UI toolkit, and I want to make changes to the toolkit depending on whether the current device has a touch screen. (These changes include making buttons bigger, for easier tapping.) Is there a way to detect whether the device has a touch screen using J2ME (MIDP 2)? [edit] As a (crappy) workaround I check the screen height instead; a screen with a height greater than 240 is likely a touch screen... Please let me know if there are more effective ways.
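
    For what it's worth, MIDP 2.0 does expose pointer support through javax.microedition.lcdui.Canvas, so a probe along these lines should work; this is a minimal sketch, with the Kuix integration left out:

        import javax.microedition.lcdui.Canvas;
        import javax.microedition.lcdui.Graphics;

        // Minimal probe: Canvas.hasPointerEvents() reports whether the device
        // delivers pointerPressed/pointerDragged/pointerReleased events,
        // which is the case on touch screen handsets.
        class TouchProbe extends Canvas {
            protected void paint(Graphics g) {
                // no-op; this canvas exists only to query device capabilities
            }

            public boolean isTouchDevice() {
                return hasPointerEvents();
            }
        }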

    Read the article

  • Touch Screen Running Windows CE

    - by Jed
    I'm starting my first project that runs on a 7-inch touch screen running Windows CE 6.0 (and .NET CF 3.5). The touch screen doesn't respond well when I use my finger; the only way for me to navigate around is with a stylus (or similar). Since I've never worked with Windows CE or a resistive touch screen, I'm not sure whether I should expect to be able to use my finger, whether the stylus method is essentially the only way to navigate effectively, or whether I simply have a touch screen that isn't that good. If you have experience with WinCE running on a touch screen, do you find that a stylus is the only way to go?

    Read the article

  • Recognize active objects with a capacitive touch screen display

    - by lucgian84
    I'm trying to develop an app that can recognize an active object (for example, a memory card) that touches the smartphone display. Before I start to develop, I have to know: are there any objects that a touch screen display can recognize? Which devices can be recognized by a smartphone display? I'm interested in the answer for both iPhone and Android phones. I found this app, and you can see that with a card you can interact with a mobile device; now I'm asking if anyone knows how to build this kind of app for an iPhone or an Android phone. Does anyone know how to do that? Is there a library (iOS or Android) for recognizing objects placed on the display?
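
    For background, capacitive "marker" schemes like the card app mentioned above generally work by giving the object several conductive pads that register as simultaneous touch points; the app then matches the geometry of those points against known layouts. A rough Android sketch of the reading side, with the matching step only indicated in a comment:

        import android.view.MotionEvent;
        import android.view.View;

        // Reads all simultaneous touch points; a conductive marker shows up
        // as several pointers at once, and their pairwise distances form a
        // signature that can be compared against known marker layouts.
        class MarkerReader implements View.OnTouchListener {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                int n = event.getPointerCount();
                float[][] points = new float[n][2];
                for (int i = 0; i < n; i++) {
                    points[i][0] = event.getX(i);
                    points[i][1] = event.getY(i);
                }
                // e.g. compute pairwise distances between points and look the
                // resulting signature up in a table of known markers
                return true;
            }
        }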

    Read the article

  • Get the co-ordinates of a touch event on Android

    - by Joe
    Hi, I'm new to Android; I've followed the Hello World tutorial and have a basic idea of what's going on. I'm particularly interested in the touch screen of my T-Mobile Pulse, so just to get me started I want to display the co-ordinates of a touch event on the screen. Say the user touched the co-ordinate (5, 2): a TextView on the screen would display that. At present I have a simple program that just loads an XML file which contains the TextView I intend to write the co-ordinates in. Thank you in advance. I did Google for help and searched Stack Overflow, but everything I found either went way over my head or wasn't suitable for this. Cheers.
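
    For reference, a minimal Android sketch of this; the layout file name and TextView id are assumptions, not taken from the post:

        import android.app.Activity;
        import android.os.Bundle;
        import android.view.MotionEvent;
        import android.widget.TextView;

        public class TouchCoordsActivity extends Activity {
            private TextView coords;

            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                setContentView(R.layout.main);                  // assumed layout file
                coords = (TextView) findViewById(R.id.coords);  // assumed TextView id
            }

            @Override
            public boolean onTouchEvent(MotionEvent event) {
                // getX()/getY() are pixel co-ordinates relative to the receiving view
                coords.setText("x=" + event.getX() + ", y=" + event.getY());
                return true; // consume the event
            }
        }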

    Read the article

  • Android WebView finishes loading but shows a blank page until I touch the screen. Why does this happen?

    - by Sunday
    When my WebView loads this page, it shows only a blank (white) page; the content appears only after I touch the screen.

        private WebView webview;
        private ProgressDialog mProgressDialog;
        private Context mContext;

        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_web);
            mContext = this;
            webview = (WebView) findViewById(R.id.myWebView);
            String url = (String) getIntent().getExtras().get("url");
            webview.getSettings().setJavaScriptEnabled(true);
            webview.setWebViewClient(new MyWebViewClient());
            if (url != null) {
                webview.loadUrl(url);
            }
        }

        class MyWebViewClient extends WebViewClient {
            @Override
            public void onPageStarted(WebView view, String url, Bitmap favicon) {
                super.onPageStarted(view, url, favicon);
                mProgressDialog = ProgressDialog.show(mContext, "tips", "wait... the view is loading", true, false);
            }

            @Override
            public void onPageFinished(WebView view, String url) {
                mProgressDialog.dismiss();
                super.onPageFinished(view, url);
            }
        }
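
    A workaround sometimes suggested for WebView content that appears only after a touch is to force a repaint (or software rendering) once the page finishes loading. As a sketch, not a confirmed fix, the onPageFinished override above could become:

        @Override
        public void onPageFinished(WebView view, String url) {
            mProgressDialog.dismiss();
            // Assumption: the blank page is a rendering glitch, so ask the
            // view to redraw itself instead of waiting for a touch event.
            view.invalidate();
            // On API 11+, forcing software rendering is another option often
            // tried for WebView drawing problems:
            // view.setLayerType(android.view.View.LAYER_TYPE_SOFTWARE, null);
            super.onPageFinished(view, url);
        }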

    Read the article
