Search Results

Search found 12685 results on 508 pages for 'apple touch icon'.

Page 18/508 | < Previous Page | 14 15 16 17 18 19 20 21 22 23 24 25  | Next Page >

  • Cocoa Touch: Creating and Adding Custom View

    - by Jason
    I created a custom view in Cocoa Touch that subclasses UIView. In my main controller I initialize it and then add it as a subview to the main view, but when I add it to the main view it calls my initializer method again and causes an infinite loop. Am I going about creating my custom view wrong? Here is the main view:

        - (void)loadView {
            UIImage *tempImage = [UIImage imageNamed:@"image1.jpg"];
            CustomImageContainer *testImage = [[CustomImageContainer alloc] initWithImage:tempImage
                                                                                 andLabel:@"test image"
                                                                                    onTop:true
                                                                                      atX:10
                                                                                      atY:10];
            [self.view addSubview:testImage];
        }

    and the CustomImageContainer:

        - (CustomImageContainer *)initWithImage:(UIImage *)imageToAdd
                                       andLabel:(NSString *)text
                                          onTop:(BOOL)top
                                            atX:(int)x_cord
                                            atY:(int)y_cord {
            UIImageView *imageview_to_add = [[UIImageView alloc] initWithImage:imageToAdd];
            imageview_to_add.frame = CGRectMake(0, 0, imageToAdd.size.width, imageToAdd.size.height);
            UILabel *label_to_add = [[UILabel alloc] init];
            label_to_add.text = text;
            label_to_add.alpha = 50;
            label_to_add.backgroundColor = [UIColor blackColor];
            label_to_add.textColor = [UIColor whiteColor];
            [self addSubview:imageview_to_add];
            self.frame = CGRectMake(x_cord, y_cord, imageToAdd.size.width, imageToAdd.size.height);
            if (top) {
                label_to_add.frame = CGRectMake(0, 0, imageview_to_add.frame.size.width,
                                                imageview_to_add.frame.size.height);
                //[self addSubview: label_to_add];
            } else {
                label_to_add.frame = CGRectMake(0, .2 * imageview_to_add.frame.size.height,
                                                imageview_to_add.frame.size.width,
                                                imageview_to_add.frame.size.height);
            }
            [self addSubview:label_to_add];
            [super init];
            return self;
        }
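
    For reference, a minimal sketch of the conventional pattern: call the superclass designated initializer and assign the result to self before configuring subviews, and avoid reading self.view inside an overridden loadView, because accessing the view property while it is still nil re-triggers loadView. Names and details beyond those in the question are illustrative assumptions.

        // CustomImageContainer.m (sketch, manual retain/release era)
        - (id)initWithImage:(UIImage *)imageToAdd
                   andLabel:(NSString *)text
                      onTop:(BOOL)top
                        atX:(int)x_cord
                        atY:(int)y_cord {
            // Initialize the superclass first and reassign self before touching any state.
            self = [super initWithFrame:CGRectMake(x_cord, y_cord,
                                                   imageToAdd.size.width,
                                                   imageToAdd.size.height)];
            if (self) {
                UIImageView *imageView = [[UIImageView alloc] initWithImage:imageToAdd];
                [self addSubview:imageView];
                [imageView release];

                UILabel *label = [[UILabel alloc] init];
                label.text = text;
                label.backgroundColor = [UIColor blackColor];
                label.textColor = [UIColor whiteColor];
                label.frame = top ? self.bounds
                                  : CGRectMake(0, 0.2 * self.bounds.size.height,
                                               self.bounds.size.width,
                                               self.bounds.size.height);
                [self addSubview:label];
                [label release];
            }
            return self;
        }

        // MainViewController.m (sketch): build the root view, assign it, then add
        // subviews; or move this work into viewDidLoad and skip loadView entirely.
        - (void)loadView {
            UIView *root = [[UIView alloc] initWithFrame:[UIScreen mainScreen].bounds];
            self.view = root;
            [root release];

            UIImage *tempImage = [UIImage imageNamed:@"image1.jpg"];
            CustomImageContainer *testImage =
                [[CustomImageContainer alloc] initWithImage:tempImage
                                                   andLabel:@"test image"
                                                      onTop:YES
                                                        atX:10
                                                        atY:10];
            [self.view addSubview:testImage];
            [testImage release];
        }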

    Read the article

  • iPhone: detect "touch-and-drag" gesture from UIBarButtonItem?

    - by Greg Maletic
    I have an "add" button that's represented by a UIBarButtonItem. Hitting the "add" button adds an object into a list that represents a moment in time. By default, that time is "now"...but I'd like to be able to use dragging behavior to let the user specify earlier times for the object. Here's the behavior I want to implement:

    1. If the user touches the UIBarButtonItem and lets go quickly, an object is added to the list that represents "now."
    2. If the user touches the UIBarButtonItem and drags, a little UI pops up that shows the time that the distance of their drag represents. The further they drag, the further back in time their touch will represent. When they let go, an object representing that earlier time gets added to the list.

    Though the description of the behavior is complicated, I'm convinced this will be pretty intuitive for users of the app. I haven't implemented code for anything but the most simple touches in the past, and I'm at a loss as to the best way to try this. Does anyone have any suggestions, or could point me towards some sample code that implements something like this? Thanks very much.
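
    One hedged way to sketch this, assuming a small UIView subclass installed as the bar item's custom view; the class name, the 44x44 frame, the one-minute-per-point mapping and the 10-point tap threshold are illustrative assumptions, not from the question:

        // DragToBackdateView.h -- sketch of a view used as the custom view of a
        // UIBarButtonItem, distinguishing a quick tap from a drag.
        #import <UIKit/UIKit.h>

        @interface DragToBackdateView : UIView {
            CGPoint startPoint;
        }
        @end

        // DragToBackdateView.m
        @implementation DragToBackdateView

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            startPoint = [[touches anyObject] locationInView:self];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGFloat distance = ABS([[touches anyObject] locationInView:self].y - startPoint.y);
            // Update a small preview UI here showing the time the drag represents,
            // e.g. one minute earlier per point dragged (an arbitrary mapping).
            NSLog(@"drag distance: %.0f points", (double)distance);
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            CGFloat distance = ABS([[touches anyObject] locationInView:self].y - startPoint.y);
            NSDate *when = [NSDate dateWithTimeIntervalSinceNow:-60.0 * distance];
            // A negligible distance means a plain tap: add an object for "now";
            // otherwise add one for the earlier date `when`.
            NSLog(@"add object for %@", distance < 10.0 ? [NSDate date] : when);
        }

        @end

        // In the view controller, e.g. viewDidLoad:
        DragToBackdateView *dragView =
            [[DragToBackdateView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
        self.navigationItem.rightBarButtonItem =
            [[[UIBarButtonItem alloc] initWithCustomView:dragView] autorelease];
        [dragView release];

    A UILongPressGestureRecognizer with minimumPressDuration set to 0 on the same custom view is an alternative to overriding the touch methods directly.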

    Read the article

  • UIButton title disappears on touch

    - by psychotik
    I want a touchable UILabel, but since that's not possible I decided to create a UIButton of type UIButtonTypeCustom. It looks how I want it to look, but when I touch it, the title disappears. I tried setting the title for state UIControlStateHighlighted but that didn't help. I know I'm doing something really silly here, but can't figure it out. I tried a few things (commented in the code below) which didn't seem to help. Here's how I'm configuring the button:

        UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
        button.frame = CGRectMake(0, 0, width, height);
        button.clearsContextBeforeDrawing = YES;
        button.contentHorizontalAlignment = UIControlContentHorizontalAlignmentLeft;
        button.contentVerticalAlignment = UIControlContentVerticalAlignmentCenter;
        button.enabled = YES;
        button.hidden = NO;
        button.highlighted = NO;
        button.opaque = YES;
        button.selected = NO;
        button.userInteractionEnabled = YES;
        button.showsTouchWhenHighlighted = NO;
        [button setTitle:@"Hello World" forState:UIControlStateNormal];
        button.titleLabel.font = [UIFont boldSystemFontOfSize:16];
        button.titleLabel.textColor = [UIColor blackColor];

        // Things I tried that didn't help
        //[button setBackgroundColor:[UIColor clearColor]];
        //button.titleLabel.text = @"Hello World";
        //[button setTitle:@"Hello World" forState:UIControlStateHighlighted];

    Any ideas?
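
    A common cause in cases like this (not confirmed for this particular question) is that the button's per-state title color wins over titleLabel.textColor; a short sketch of the state-based calls, reusing the question's width and height:

        // Configure the title color per control state rather than via titleLabel.textColor.
        UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
        button.frame = CGRectMake(0, 0, width, height);
        button.titleLabel.font = [UIFont boldSystemFontOfSize:16];
        [button setTitle:@"Hello World" forState:UIControlStateNormal];
        [button setTitleColor:[UIColor blackColor] forState:UIControlStateNormal];
        [button setTitleColor:[UIColor darkGrayColor] forState:UIControlStateHighlighted];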

    Read the article

  • Is there a way to make Apple Mail work well with Exchange server calendar?

    - by Joshua Frank
    My office uses Macs, but most of our clients use Windows and Outlook. Whenever people send invitations from Apple Mail to a Windows/Outlook machine, the invitations are garbled and look nothing like the nicely formatted invitations that Outlook users are used to. We also have no way to view shared calendars so that we can choose mutually open time slots, nor the other useful calendar features that Outlook has and Apple Mail seems to lack. So is there a plugin or third-party program that will give Apple Mail the nice calendar features of Outlook? (By the way, I've looked into actually buying Outlook for Mac, and the pricing is kind of prohibitive: you MUST buy the whole Office suite, which we already have, there's no upgrade path, and there's no volume discounting.)

    Read the article

  • Apple push notifications could not receive size of message error

    - by embedded
    I'm trying to send a push notification message to my app using the sandbox option. I'm getting these messages on my console:

        Sun Apr 25 21:56:22 unknown /usr/libexec/notification_proxy[57] : Could not receive size of message
        Sun Apr 25 21:56:22 unknown /usr/libexec/notification_proxy[57] : Could not receive message

    How do I resolve this? Thanks

    Read the article

  • Apple Push Notification Service - notification messages aren't sent to iphone device

    - by crazywood
    Hi all, I wrote provider code using C# and it was able to send notification messages to iPhone devices successfully. But since yesterday it hasn't worked. It still seems to connect to APNS successfully and to send the notification message; unfortunately, no notification message is received by the iPhone device. I have checked the internet connection and the device token of the iPhone device. What else can I do? Thanks in advance...
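
    The question only covers the provider side; purely as a sanity check, a minimal device-side sketch (using the pre-iOS 8 registerForRemoteNotificationTypes: API) can log the token the C# provider should be targeting:

        // AppDelegate.m -- log the device token so it can be compared with the one
        // the provider is actually sending to.
        - (BOOL)application:(UIApplication *)application
            didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
            [application registerForRemoteNotificationTypes:
                (UIRemoteNotificationTypeAlert |
                 UIRemoteNotificationTypeBadge |
                 UIRemoteNotificationTypeSound)];
            return YES;
        }

        - (void)application:(UIApplication *)application
            didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken {
            NSLog(@"APNS device token: %@", deviceToken);
        }

        - (void)application:(UIApplication *)application
            didFailToRegisterForRemoteNotificationsWithError:(NSError *)error {
            NSLog(@"APNS registration failed: %@", error);
        }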

    Read the article

  • Using Apple autorelease pools without Objective-C

    - by PierreBdR
    I am developing an application that needs to work on Linux, Windows and Mac OS X. For that purpose, I am using C++ with Qt. For many reasons, on Mac OS X I need to use CoreFoundation functions (such as CFBundleCopyBundleURL) that create Core Foundation objects which need to be released with CFRelease. But doing so generates a lot of these warnings:

        *** __NSAutoreleaseNoPool(): Object 0x224f7e0 of class NSURL autoreleased with no pool in place - just leaking

    All the code I've seen concerning these autorelease pools is written in Objective-C. Does anybody know how to create/use autorelease pools in C or C++?
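
    One common approach, sketched below under the assumption that a single wrapper file can be compiled as Objective-C++ (a .mm extension), is to open and drain an NSAutoreleasePool around the CoreFoundation calls and expose a plain C++ function to the rest of the Qt code; the file and function names are illustrative:

        // bundle_helpers.mm -- compiled as Objective-C++; the rest of the C++/Qt
        // code base only sees the plain C++ declaration of copyBundleURLString().
        #import <Foundation/Foundation.h>
        #import <CoreFoundation/CoreFoundation.h>
        #include <string>

        std::string copyBundleURLString()
        {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

            CFBundleRef bundle = CFBundleGetMainBundle();   // Get rule: not released here
            CFURLRef url = CFBundleCopyBundleURL(bundle);   // Copy rule: must CFRelease
            CFStringRef str = CFURLGetString(url);          // Get rule: owned by url

            std::string result;
            char buffer[1024];
            if (CFStringGetCString(str, buffer, sizeof(buffer), kCFStringEncodingUTF8)) {
                result = buffer;
            }

            CFRelease(url);
            [pool drain];   // releases whatever the frameworks autoreleased (e.g. that NSURL)
            return result;
        }

    The same pattern can also be reached from pure C through the Objective-C runtime (objc_getClass, sel_registerName, objc_msgSend), but compiling the one wrapper file as Objective-C++ is usually simpler.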

    Read the article

  • Apple AppStore approval - post iPad?

    - by BahaiResearch.com
    Has the App Store changed its approval times? Before the iPad announcement we were getting approved in 2-3 days. After the announcement we've been waiting over a week with no sign of any approval on the horizon. Are there any changes needed in the XML plist for the app to improve approval times?

    Read the article

  • High-res icon in Windows Vista alt-tab thumbnail preview?

    - by netvope
    I have customized my alt-tab screen with the following:

        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\AltTab]
        "OverlayIconPx"=dword:00000040
        "MaxThumbSizePx"=dword:00000100
        "MinThumbSizePcent"=dword:00000064

    It works great: the thumbnail becomes 256 pixels wide and the icon at the corner of the thumbnail becomes 64x64 pixels. However, Windows doesn't load the high-res icons from the programs; instead, it uses the 16x16 pixel icon and scales it up by nearest-neighbor. I'm sure the programs have high-res icons because I saw them in "Extra Large Icons" view in Explorer. So the question is: how can I force Windows to load the high-res icons for the alt-tab thumbnail preview? (Perhaps a registry key, or a .dll hack/injection?)

    Read the article

  • Would Apple be able to tell if I'm running Mac OS inside a VM?

    - by Thomas Havlik
    Just as the question/title says. I understand that running Mac OS inside of a VM is against the EULA for the consumer version (but not the server, which is much more expensive!) If I were to purchase a legal copy of Mac OS, and install it to a VM, then register as an Apple Developer, would they shut me out? Is there a way they can tell the difference between emulated hardware and Apple computers? I'm slightly unfamiliar with how all of Apple's software works. Windows goes through this "genuine" test whenever installing service packs, but I don't know if Mac goes through the same trouble. Many thanks, -Tom

    Read the article

  • How do Apple code level support requests work?

    - by JustinXXVII
    I'm having an issue with a build that I can't figure out, and I'm considering filing a support request. It says I'll get contacted in about 3 days usually. Can anyone explain the process this takes, because I don't want to burn one of these by screwing up! Should I include a zip file of my entire project, or will the source file I'm having issues with be enough? Do I have to be at my computer when they contact me? Thanks for your help, generous ones!

    Read the article

  • Apple application names

    - by Moshe
    In the provisioning portal, can I "reserve" application names? Once reserved, can I transfer those names to other accounts? (Presumably by renaming the first app and then renaming the second one, on the other account, to the original name?)

    Read the article

  • how to add emblem to windows file icon?

    - by Wifi Cordon
    Hello, I want to know how to add some sort of emblem or badge to a specific file, just like Dropbox does to a file while it is syncing. I found that the Dropbox folks are able to do that on Linux by using the libnautilus package, but I need to do so on Windows and haven't been able to find an answer. Will the solution change from one Windows version to another? Best regards

    Read the article

  • How to use Ubuntu Touch manage-address-books.py?

    - by Rotary Heart
    Well, I have been reading the docs for a few days and I found that I can "import" my contacts from a .csv file with the following:

        Alternatively you can import contacts from a csv file. The csv file should be in same format as /usr/share/demo-assets/contacts-data/data.csv. Replace the sample data.csv file with your own version and run manage-address-books.py create to import your contacts.

    But I can't figure out how to use manage-address-books.py create. Could anyone help me? I know that I can use syncevolution, but I want to sync my .csv file too.

    Read the article

  • How To Temporarily Disable The Touch Screen In X1 Carbon

    - by Daniel Cazzulino
    I know, why would anyone want to do that? Scott properly predicted:

        Don't knock a touchscreen until you've used one. Every laptop should (and will) have a touch screen in a year. Mark my words.

    And surely, less than a year later, the X1 Carbon (an amazing ultrabook for sure) has a touch model. And as of today, the price difference for the touch screen is a ridiculous $30 (actually $24 with a "back to school" coupon right now ;)). So why would you NOT get it? I know for some it works great.

    Now, let's get real about touch *for a developer* for a minute. About 99.9% of my time in front of my laptop I'm either using Visual Studio or Chrome. I have my hands on the keyboard ALL THE TIME. I use the trackpoint ALL THE TIME. If I want to scroll, I only have to slightly move my fingers. I don't click around much on pages: I READ them. So, in a few months of using the X1, I think I touched the screen like 10 times, and it was mostly to clear dust, which drives whatever app is in focus crazy. Plus, at home I have this simple setup: ...Read full article

    Read the article

  • How can I check whether Exposé is being activated or not?

    - by yangumi
    Hi, I'm creating an application that emulates the MacBook's multi-touch trackpad. As you may know, on a MacBook's trackpad, if you swipe 4 fingers up it triggers Show Desktop, and if you swipe 4 fingers down it shows Exposé. However, if Show Desktop is active and you swipe 4 fingers down, it will come back to the normal mode. The same goes for Exposé: if Exposé is active and you swipe 4 fingers up, it will also come back to the normal mode. Here is the problem: I use the keyboard shortcut F3 to trigger Exposé and F11 to trigger Show Desktop. The problem is, when Show Desktop is active, if I press F3 it will go straight to Exposé, and when Exposé is active, if I press F11 it will go straight to Show Desktop. But I want it to behave like the trackpad, whose code I guess may look like this:

        FourFingersDidSwipeUp {
            if (isExposeBeingActivated() || isShowDesktopBeingActivated()) {
                pressKey("Esc");
            } else {
                pressKey("F11");
            }
        }

    But I don't know how to implement the "isExposeBeingActivated()" and "isShowDesktopBeingActivated()" methods. I've tried creating a window and checking whether its size has changed (on the assumption that if Exposé is active, its size should be smaller), but the system always returns the same size. I tried monitoring the background processes during Exposé, but nothing happened. Does anyone have any suggestions on this? (I'm sorry if my English sounds weird.)

    Read the article

  • Phone crash when try to use vibration on Android

    - by Diego Unanue
    I'm developing an app where, when you click a button, the phone has to vibrate, but instead the phone just crashes, saying that I need permission to vibrate. I've already set this permission in build.settings (the Android manifest). Here is the build.settings code:

        settings = {
            orientation = {
                default = "portrait",
                supported = { "portrait", }
            },
            iphone = {
                plist = {
                    CoronaUseIOS7LandscapeOnlyWorkaround = true,
                    CoronaUseIOS7IPadPhotoPickerLandscapeOnlyWorkaround = true,
                    CoronaUseIOS6LandscapeOnlyWorkaround = true,
                    CoronaUseIOS6IPadPhotoPickerLandscapeOnlyWorkaround = true,
                    UIApplicationExitsOnSuspend = false,
                    UIPrerenderedIcon = true,
                    UIStatusBarHidden = false,
                    CFBundleIconFile = "Icon.png",
                    CFBundleIconFiles = {
                        "Icon.png", "Icon@2x.png",
                        "Icon-60.png", "Icon-60@2x.png",
                        "Icon-72.png", "Icon-72@2x.png",
                        "Icon-76.png", "Icon-76@2x.png",
                        "Icon-Small.png", "Icon-Small@2x.png",
                        "Icon-Small-40.png", "Icon-Small-40@2x.png",
                        "Icon-Small-50.png", "Icon-Small-50@2x.png",
                    },
                },
            },
            android = {
                permissions = {
                    { name = ".permission.C2D_MESSAGE", protectionLevel = "signature" },
                },
                usesPermissions = {
                    "android.permission.INTERNET",
                    "android.permission.VIBRATE",
                },
            },
        }

    The file that uses the vibration is:

        local onButtonEvent = function (event)
            system.vibrate()
        end

    I've read all the posts on the Corona page without success. Can I see the generated Android manifest to check whether the permissions are there? I've read that this might be a Corona issue, but I'm not sure.

    Read the article

  • Windows 7: Touch gestures in IE not working without explorer.exe being run once

    - by Michael
    Details: Internet Explorer 9 and Windows 7 Professional, running on an HP TouchSmart (touch screen PC). It is going to be a kiosk PC (running a custom GUI for displaying websites).

    Scenario 1: When running Internet Explorer as a normal program in Windows 7, touch functions work perfectly. I can scroll the website by dragging it with my finger, I can pinch zoom and I can touch-and-hold right click. I now change the default shell in Windows to Internet Explorer (i.e. IE starts instead of explorer.exe). Internet Explorer of course starts up when logging in. However, touch functions are reduced to basic clicking (no dragging, no pinch zooming, no touch-and-hold right click). Then I manually start explorer.exe, and the touch functions work again! And here is the weird part: when I kill explorer.exe, the touch functions keep working - even if I close IE and start a new instance.

    Scenario 2: The exact same, but instead of changing the default shell to Internet Explorer, I change it to my own program, which uses an embedded Internet Explorer ("WebBrowser"). The same thing happens.

    What I've tried: Autorun programs: when explorer.exe launches, it launches all the autorun programs. There are no relevant programs being run by explorer, but just in case, I have manually started all the autorun programs, so that the environment is identical (but without explorer.exe) to a normal login. It still does not work (until I launch explorer.exe). Specifically, TabTip.exe, TabTip32.exe and wisptis.exe are all running. All services are also started.

    To sum it up: running explorer.exe once changes something in the touch capabilities of Internet Explorer. It doesn't matter if explorer.exe is running - as long as it has been run once. Does anyone know what causes this behavior? Or how I can circumvent it neatly? Thanks!

    Read the article

  • Does Apple Magic Mouse fully work on Windows 7 x86/x64?

    - by Sorin Sbarnea
    I would like to know if the Apple Magic Mouse works on Windows 7 (x86/x64) on non-Apple computers. Here is my checklist:

    - x64 compatibility
    - left click
    - right click
    - middle button?
    - vertical scroll
    - horizontal scroll
    - bind additional gestures to keystrokes?
    - any usage problems?

    In case it works, please advise on how to get the drivers.

    Read the article

  • How to make web icon open with specific browser?

    - by David
    I have an icon on my desktop for a website called QUAKE LIVE, and I use Google Chrome as my default browser. The website isn't compatible with Google Chrome, but it is with Mozilla Firefox. Is there any way to edit the properties of the icon so that it opens with Firefox instead of Chrome?

    Read the article

  • Is there any multi-touch graphics tablet with Linux drivers?

    - by Zifre
    After watching the absolutely amazing 10/GUI video, I have been dying to try to implement something like this. I can do the software side quite easily, but I don't have the hardware. The Wacom Bamboo Fun would work, but the Linux drivers don't support the multi-touch features. Microsoft's "UnMouse Pad" looks like the perfect solution, but it is not commercially available yet. Are there any similar devices that would work? Alternatively, is there a way to build a DIY version? (It is fairly easy to build a multi-touch display with a webcam and IR LEDs, but it would not be pressure sensitive. Does anyone have any info on how the UnMouse Pad works and if it is possible to build one?) EDIT: I should clarify that I don't want a multi-touch display. I want the sensor to be separate from the display. If that sounds crazy, watch the 10/GUI video.

    Read the article

  • How do you hit modifier keys when touch typing?

    - by Bob Ueland
    I am a programmer and I want to learn to touch type. As all programmers know, using modifier keys like Control, Command and Alt is essential. When programming, I think every second or third word I type involves a modifier key. But most touch-typing learning software does not address these keys; it is as if they do not exist. Not only does it not let you practice them, it does not even tell you which fingers to use to hit them. There is actually a touch-typing game that I use called StarTyper (http://lidenanna.com) that lets you practice modifier keys and even make up your own custom words containing modifier keys. But not even this game tells you which fingers to use when hitting the modifier keys. Has anybody addressed this problem? Or are there just homespun methods that work for one person but not for another?

    Read the article
