Search Results

Search found 571 results on 23 pages for 'hedley finger'.

Page 18/23 | < Previous Page | 14 15 16 17 18 19 20 21 22 23  | Next Page >

  • Booting Windows from different partition than system

    - by szamil
    I have bought an SSD disk, but my laptop (Dell Precision M6300) refuses to use it as a target disk for Windows (AHCI on/off, BIOS up-to-date). Unfortunately I can't exchange the disk... But fortunately, I've managed to install Windows using a USB disk enclosure. The problem is that when I put that disk in as my internal drive, it can't boot (disk read error, Three Finger Salute...). So I tried Linux (openSUSE); I managed to install it as well, but when I tried to boot GRUB from the internal drive I got errors again. (Should I try GRUB2?) I figured out that I can boot into that internal hard drive's openSUSE system using a small USB drive with GRUB, a kernel and an image on it. So I just run GRUB from the USB drive, it loads the necessary stuff from the USB drive and then continues from the internal drive. I want to do the same with Windows, but GRUB (rootnoverify and chainloader +1) does not boot my Windows on the internal drive. The question: is there any way to copy the critical Windows boot files onto the USB drive, so that booting starts from that USB drive but continues from the internal (different in general) drive? The USB drive would become a system hardware key! ;-) Disk: Plextor M5S 128GB SATA III; the laptop has SATA II, but it's compatible anyway, right?

    Read the article

  • Can't login after upgrading to Windows 8.1

    - by flatline
    This afternoon, I upgraded my work laptop from Windows 8 to Windows 8.1. I previously had a local account, but after the upgrade it prompted me to enter my Windows account credentials, which I had set up beforehand at some point. I entered my password and clicked next, went through another screen or two, grew tired of the process, and clicked whatever "skip this step" button I was presented with. Now I can't log in: not with my (previous) local account password, and not with my Windows account password. It's a Dell with biometric identification, which I had set up previously, so I put my finger on the reader and it complained that I couldn't use that fingerprint because I had changed my password. But I hadn't wittingly changed my password at all. I assume that what happened is that, by entering my credentials, my local account was tied to the Windows account, but because I cancelled the process partway through, something went wrong and I cannot log in. A few questions:
    1) How do I log in with my Windows account credentials? Should LOCALMACHINENAME\username, which was my previous login method, still work for the Windows account? When I booted to safe mode it prompted me with WindowsAccount\myemailaddress, which allowed me to log in there, but the regular login doesn't accept the '@' symbol.
    2) Is there any way to make that account local-only again? I can't find any way of doing it.
    3) I managed to enable the local administrator account and get back into the box; failing all else, is there a quick way to migrate my old profile over to a new user?

    Read the article

  • Android: Autoscrolling HorizontalScrollView

    - by DroidIn.net
    I'm using the following code to simulate tabs, and since there are more tabs than the width can accommodate, the user can scroll left or right to make a tab button visible. It all works; however, I also provide the user with the ability to fling between tabs by swiping a finger left or right on the tab contents. Again, it works. But when I fling to the rightmost tab, its corresponding button is barely visible. I want to auto-scroll the table inside the HorizontalScrollView so the selected tab button becomes visible, but when I execute HorizontalScrollView.smoothScrollTo(300, 0) nothing happens. It doesn't matter how high I set the first x parameter, nothing ever moves (yes, I do have an algorithm to calculate the exact position). Here's the XML for the scrolling tab buttons:

        <HorizontalScrollView android:id="@+id/tabsButtonView"
            android:layout_width="fill_parent" android:layout_height="55dip"
            android:background="@color/tabs_header" android:scrollbars="none">
            <TableLayout android:id="@+id/TableLayout01"
                android:layout_width="fill_parent" android:layout_height="fill_parent">
                <TableRow android:id="@+id/TableRow01"
                    android:layout_width="fill_parent" android:layout_height="0dip"
                    android:layout_weight="1" android:paddingTop="5dip" android:paddingLeft="3dip">
                    <ImageButton android:id="@+id/tabBtt0" android:src="@drawable/linkup_logo_small"
                        android:layout_width="wrap_content" android:layout_height="fill_parent"
                        android:layout_marginLeft="2dip" android:layout_marginRight="2dip"
                        android:padding="5dip" android:background="@drawable/tab_selected" />
                    <ImageButton android:id="@+id/tabBtt1" android:src="@drawable/simplyhired_small"
                        android:layout_width="fill_parent" android:layout_height="fill_parent"
                        android:layout_marginLeft="2dip" android:layout_marginRight="2dip"
                        android:padding="5dip" android:background="@drawable/tab_normal" />
                    <ImageButton android:id="@+id/tabBtt2" android:src="@drawable/indeedcom_small"
                        android:layout_width="fill_parent" android:layout_height="fill_parent"
                        android:layout_marginLeft="2dip" android:layout_marginRight="2dip"
                        android:padding="5dip" android:background="@drawable/tab_normal" />
                    <ImageButton android:id="@+id/tabBtt3" android:src="@drawable/careerbuilder_logo_small"
                        android:layout_width="fill_parent" android:layout_height="fill_parent"
                        android:layout_marginLeft="2dip" android:layout_marginRight="2dip"
                        android:padding="5dip" android:background="@drawable/tab_normal" />
                </TableRow>
            </TableLayout>
        </HorizontalScrollView>

    Read the article

  • Android: AbsListView.OnScrollListener SCROLL_STATE_IDLE is not called after SCROLL_STATE_TOUCH_SCROLL

    - by Francesco
    I have a problem with Android version 2.1. It looks like a bug. I attached an OnScrollListener to my ListView and I'm using the method onScrollStateChanged(AbsListView view, int scrollState) to monitor the scroll state of my ListView. The scroll state can take 3 values (taken from the documentation):
    SCROLL_STATE_FLING: The user had previously been scrolling using touch and had performed a fling. The animation is now coasting to a stop.
    SCROLL_STATE_IDLE: The view is not scrolling. Note navigating the list using the trackball counts as being in the idle state since these transitions are not animated.
    SCROLL_STATE_TOUCH_SCROLL: The user is scrolling using touch, and their finger is still on the screen.
    I assumed that SCROLL_STATE_IDLE would always be passed after one of the other two states. That is always true except on Android version 2.1: SCROLL_STATE_IDLE is not passed after SCROLL_STATE_TOUCH_SCROLL. The problem also happens if you stop the fling with a touch instead of letting the scroll stop by itself. This strange behaviour leaves my ListView in an inconsistent state. Does someone else have the same problem? Any suggestions for a "not-so-dirty" workaround?

    Read the article

  • UIView drawRect: when you draw a line, the rect area is cleared so the previous drawing is gone

    - by snakewa
    It is quite hard to explain, so I uploaded an image to show my problem: http://i42.tinypic.com/2eezamo.jpg Basically, in drawRect: I draw the line from touchesMoved as the finger moves, and I call setNeedsDisplayInRect: to redraw. But I found that once the first line is done, drawing the second line clears that part of the rect, so some of the previous drawing is gone. Here is my implementation:

        - (void)drawRect:(CGRect)rect {
            //[super drawRect:rect];
            CGContextRef context = UIGraphicsGetCurrentContext();
            [self drawSquiggle:squiggle at:rect inContext:context];
        }

        - (void)drawSquiggle:(Squiggle *)squiggle at:(CGRect)rect inContext:(CGContextRef)context {
            CGContextSetBlendMode(context, kCGBlendModeMultiply);
            UIColor *squiggleColor = squiggle.strokeColor; // get squiggle's color
            CGColorRef colorRef = [squiggleColor CGColor]; // get the CGColor
            CGContextSetStrokeColorWithColor(context, colorRef);

            NSMutableArray *points = [squiggle points]; // get points from squiggle

            // retrieve the NSValue object and store the value in firstPoint
            CGPoint firstPoint; // declare a CGPoint
            [[points objectAtIndex:0] getValue:&firstPoint];

            // move to the point
            CGContextMoveToPoint(context, firstPoint.x, firstPoint.y);

            // draw a line from each point to the next in order
            for (int i = 1; i < [points count]; i++) {
                NSValue *value = [points objectAtIndex:i]; // get the next value
                CGPoint point; // declare a new point
                [value getValue:&point]; // store the value in point

                // draw a line to the new point
                CGContextAddLineToPoint(context, point.x, point.y);
            } // end for

            CGContextStrokePath(context);
        }
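
    For reference, a minimal sketch of one common workaround (this is not the poster's code, and canvasImage is an assumed retained UIImage property): accumulate finished drawing into an offscreen image as the finger moves, so drawRect: only has to blit that image and nothing is lost when the dirty rect is cleared.

        // Hypothetical sketch: keep all finished strokes in an offscreen image.
        - (void)appendLineFrom:(CGPoint)fromPoint to:(CGPoint)toPoint {
            UIGraphicsBeginImageContext(self.bounds.size);
            [self.canvasImage drawInRect:self.bounds];   // repaint everything drawn so far
            CGContextRef ctx = UIGraphicsGetCurrentContext();
            CGContextSetLineCap(ctx, kCGLineCapRound);
            CGContextSetLineWidth(ctx, 2.0);
            [[UIColor blackColor] setStroke];
            CGContextMoveToPoint(ctx, fromPoint.x, fromPoint.y);
            CGContextAddLineToPoint(ctx, toPoint.x, toPoint.y);
            CGContextStrokePath(ctx);
            self.canvasImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            [self setNeedsDisplay];
        }

        - (void)drawRect:(CGRect)rect {
            // Nothing is lost when the rect is cleared; the image holds it all.
            [self.canvasImage drawInRect:self.bounds];
        }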

    Read the article

  • Help: Android paint/canvas issue; drawing smooth curves

    - by Wrapper
    How do I get smooth curves instead of dots or circles when I draw with my finger on the touch screen in Android? I am using the following code:

        public class DrawView extends View implements OnTouchListener {
            private static final String TAG = "DrawView";

            List<Point> points = new ArrayList<Point>();
            Paint paint = new Paint();

            public DrawView(Context context) {
                super(context);
                setFocusable(true);
                setFocusableInTouchMode(true);
                this.setOnTouchListener(this);
                paint.setColor(Color.WHITE);
                paint.setAntiAlias(true);
            }

            @Override
            public void onDraw(Canvas canvas) {
                for (Point point : points) {
                    canvas.drawCircle(point.x, point.y, 5, paint);
                    // Log.d(TAG, "Painting: " + point);
                }
            }

            public boolean onTouch(View view, MotionEvent event) {
                // if (event.getAction() != MotionEvent.ACTION_DOWN)
                //     return super.onTouchEvent(event);
                Point point = new Point();
                point.x = event.getX();
                point.y = event.getY();
                points.add(point);
                invalidate();
                Log.d(TAG, "point: " + point);
                return true;
            }
        }

        class Point {
            float x, y;

            @Override
            public String toString() {
                return x + ", " + y;
            }
        }

    Read the article

  • Selection Highlight in NSCollectionView

    - by Hooligancat
    On some occasions my head just hurts from banging it against the Cocoa wall. Today is one of those days. I have a working NSCollectionView with one minor, but critical, exception: getting and highlighting the selected item within the collection. I had all of this working prior to Snow Leopard, but something appears to have changed and I can't quite put my finger on it, so I took my NSCollectionView right back to a basic test and followed Apple's documentation for creating an NSCollectionView here: http://developer.apple.com/mac/library/DOCUMENTATION/Cocoa/Conceptual/CollectionViews/Introduction/Introduction.html The collection view works fine following the quick start guide. However, this guide doesn't discuss selection other than "There are such features as incorporating image views, setting objects as selectable or not selectable and changing colors if they are selected". Using this as an example, I went to the next step of binding the Array Controller to the NSCollectionView with the controller key selectionIndexes, thinking that this would bind any selection I make between the NSCollectionView and the array controller and thus fire off a KVO notification. I also set the NSCollectionView to be selectable in IB. There appears to be no selection delegate for NSCollectionView and, unlike most Cocoa UI views, there appears to be no default selection highlight. So my problem really comes down to one related issue, but two distinct questions: How do I capture the selection of an item? How do I show a highlight on a selected item? NSCollectionView programming guides seem to be few and far between, and most searches via Google pull up pre-Snow Leopard implementations, or use the view in a separate XIB file. For the latter (a separate XIB file for the view), I don't see why this should be a prerequisite; otherwise I would have expected that Apple would not have included the view in the same bundle as the collection view item. I know this is going to be a "can't see the wood for the trees" issue, so I'm prepared for the "doh!" moment. As usual, any and all help much appreciated.
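
    For reference, a minimal sketch of the pattern often used on 10.6 for the highlight half of the problem (class names are illustrative, not from the post): the item view draws its own highlight, and an NSCollectionViewItem subclass pushes its selected state into that view. The selection itself can then be read from the array controller's selectionIndexes binding.

        // Hypothetical sketch; names are illustrative, not the poster's code.
        @interface MyItemView : NSView {
            BOOL selected;
        }
        @property (assign) BOOL selected;
        @end

        @implementation MyItemView
        @synthesize selected;
        - (void)drawRect:(NSRect)dirtyRect {
            if (selected) {
                [[NSColor alternateSelectedControlColor] set];
                NSRectFill([self bounds]);   // simple full-bleed highlight
            }
        }
        @end

        @implementation MyCollectionViewItem   // NSCollectionViewItem subclass set as the prototype
        - (void)setSelected:(BOOL)flag {
            [super setSelected:flag];
            [(MyItemView *)[self view] setSelected:flag];   // push selection into the view
            [[self view] setNeedsDisplay:YES];
        }
        @end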

    Read the article

  • Flex HTMLLoader component not raising mouseDown events for all mouse clicks

    - by shane
    I have built an Air 2/Flex 4 kiosk application with Flash Builder 4. Currently I am implementing a touch-screen browser to enable users to navigate company training videos. In an attempt to improve the usability of the website on the touchscreen, I have placed the HTML component in an adaptation of Doug McCune's DragScrollingCanvas (updated to use the Flex 4 Scroller component) to allow users to scroll the webpage by dragging a finger across the screen. The mouseDown event is used to start scrolling the viewport. In addition, the webpage was modified to disable text selection with the following CSS:

        html { -webkit-user-select: none; cursor: default; }

    The problem I face is that the HTMLLoader component only fires a mouseDown if a link/input/button on the webpage is clicked, not when the background or any text is clicked. In addition, if I remove the custom CSS, the mouseDown event will not fire while text is being selected, but it will if previously highlighted text is clicked. As an alternative I also tried adding a group container with the same dimensions as the HTMLLoader to detect the mouseDown events (so that the group container and HTMLLoader have the same draggable parent container) and was able to capture mouseDown events and scroll the viewport as expected. However, as the mouse event is handled by the group container, I am now unable to navigate the webpage. Does anybody know why the HTMLLoader component is not raising mouseDown events for all mouse clicks?

    Read the article

  • .NET Fingerprint SDK

    - by Kishore A
    I am looking for a comparison of fingerprint reader SDKs available for .NET (we are using .NET 3.5). My requirements are:
    1. Associate more than one fingerprint with a person.
    2. Store the fingerprints itself (so I do not have to change the database for my program).
    3. Work in both event and non-event mode. (Event mode: give a notification if someone swipes a finger on the reader; non-event mode: I activate the reader in synchronous mode.)
    4. Provide an API for either confirming a user or identifying a user. (Confirm API: I send the person's ID/unique number and it confirms that it is the same person; Identify API: the SDK sends the person's ID after it looks up the person using the fingerprint.)
    I would also like to get a comparison of fingerprint readers if anybody knows of one available on the internet. Hope I was clear. Thanks, Kishore.

    Read the article

  • OpenGL-ES: Change (multiply) color when using color arrays?

    - by arberg
    Following the ideas in OpenGL ES iPhone - drawing anti aliased lines, I am trying to draw stroked anti-aliased lines, and I am successful so far. After a line is drawn by the finger, I wish to fade the path; that is, I need to change the opacity (color) of the entire path. I have computed a large array of vertex positions, vertex colors, texture coordinates and indices, and then I give these to OpenGL, but I would like to reduce the opacity of all the drawn triangles without having to change each of the color coordinates. Normally I would use glColor4f(r, g, b, a) before calling drawElements, but it has no effect because of the color array. I am working on Android, but I believe that shouldn't make a big difference, as long as it is OpenGL ES 1.1 (or 1.0). I have the following code:

        gl.glEnable(GL10.GL_BLEND);
        gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE_MINUS_SRC_ALPHA);
        gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
        gl.glShadeModel(GL10.GL_SMOOTH);
        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
        gl.glEnable(GL10.GL_TEXTURE_2D);

        // Should set rgb to greyish, and alpha to half-transparent; the greyish is
        // just there to make the question more general, it's the alpha I'm interested in
        gl.glColor4f(.75f, .75f, .75f, 0.5f);

        gl.glVertexPointer(mVertexSize, GL10.GL_FLOAT, 0, mVertexBuffer);
        gl.glColorPointer(4, GL10.GL_FLOAT, 0, mColorBuffer);
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, mTexCoordBuffer);
        gl.glDrawElements(GL10.GL_TRIANGLES, indexCount, GL10.GL_UNSIGNED_SHORT,
                mIndexBuffer.position(startIndex));

    If I disable the color array (gl.glEnableClientState(GL10.GL_COLOR_ARRAY);), then glColor4f works; if I enable the color array, it does nothing. Is there any way in OpenGL ES to change the coloring without changing all the color coordinates? I think that in OpenGL one might use a fragment shader, but it seems OpenGL ES 1.x does not have a fragment shader (not that I know how to use one).

    Read the article

  • Multiple touch problem using UITouch/UIView in iphone

    - by John Qualis
    Hi, I am trying to implement a 2-finger "pinch" and "expand" (or enlarge) on a UIView using an iPhone 3G and SDK 3.1.2. I haven't programmed with UITouch/UIEvent before, so I appreciate any guidance on the problem I am facing. When I touch the screen with 2 fingers, I see that sometimes touchesBegan gives me only 1 touch event. I need the UITouch event count to be 2 to measure the distance between the 2 fingers, so I have to lift my fingers and repeat this process until both fingers are detected. Is there a way around this? Can I improve it? The actual resize works correctly once both fingers are detected, but this issue seen at the start needs to be resolved. I know SDK 3.2 detects gestures such as pinch, but I was wondering how it can be done in 3.1.2. The code is given below. The OnFingerDown function gets called randomly even when I put 2 fingers down, whereas I want OnTwoFingersDown to be called each time. Moreover, I get only 1 touch event each time I put my fingers down. I mean, if OnFingerDown was called twice I could somehow get it to work. Thanks in advance. Appreciate any help. John

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch, *touch1;
            CGPoint location, location1;

            if ([[event allTouches] count] == 1) {
                touch = [[[event allTouches] allObjects] objectAtIndex:0];
                location = [touch locationInView:self];
                OnFingerDown(location);
            }
            else if ([[event allTouches] count] == 2) {
                touch = [[[event allTouches] allObjects] objectAtIndex:0];
                location = [touch locationInView:self];
                touch1 = [[[event allTouches] allObjects] objectAtIndex:1];
                location1 = [touch1 locationInView:self];
                OnTwoFingersDown(location, location1);
            }
        }
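
    For reference, a minimal sketch of one way this is often handled on pre-3.2 SDKs (an assumption, not the poster's code): the second finger usually lands in a later event than the first, so enable multiple touches and look at [event allTouches] in touchesMoved: as well, instead of requiring both fingers in the same touchesBegan:.

        // Hypothetical sketch: track the touch count across events.
        - (id)initWithFrame:(CGRect)frame {
            if ((self = [super initWithFrame:frame])) {
                self.multipleTouchEnabled = YES;   // off by default on UIView
            }
            return self;
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            NSArray *all = [[event allTouches] allObjects];
            if ([all count] >= 2) {
                UITouch *t0 = [all objectAtIndex:0];
                UITouch *t1 = [all objectAtIndex:1];
                CGPoint p0 = [t0 locationInView:self];
                CGPoint p1 = [t1 locationInView:self];
                CGFloat distance = hypotf(p0.x - p1.x, p0.y - p1.y);
                // compare against the distance captured when the second finger
                // arrived to decide between pinch and expand
            }
        }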

    Read the article

  • AntFarm anti-pattern -- strategies to avoid, antidotes to help heal from

    - by alchemical
    I'm working on a 10-page web site with a database back-end. There are 500+ objects in use in an attempt to implement the MVP pattern in ASP.NET. I'm tracing the code execution from a single page; my finger has been on F11 in Visual Studio for about 40 minutes and there seems to be no end, possibly 1000+ method calls for one web page! If it were just 50 objects, that would be one thing; however, code execution snakes through all these objects just like millions of ants frantically working in their giant dirt-mound house, riddled with object tunnels. Hence, a new anti-pattern is born: AntFarm. AntFarm is also known as "OO-Madness", "OO-Fever", OO-ADD, or simply design-pattern junkie. This is not the first time I've seen this, and neither have my associates at other companies. It seems that this style is being actively propagated, or in any case is a misunderstanding of the numerous OO/DP gospels going around... I'd like to introduce an anti-pattern to the anti-pattern: GST or "Get Stuff Done", AKA "Get Sh** done", AKA GRD (GetRDone). This pattern focuses on just what it says: getting stuff done, in a simple way. I may try to outline it more in a later post, or please share your ideas on this antidote pattern. Anyway, I'm in the midst of a great example of the AntFarm anti-pattern as I write (as a bonus, there is no documentation or comments). Please share your thoughts on how this anti-pattern has become so prevalent, how we can avoid it, and how one can undo or deal with this pattern in a live system one must work with!

    Read the article

  • UIScrollView only works if the child views aren't hit

    - by dny238
    I have a scroll view that doesn't scroll right; I've simplified the code below. It draws the view and some horizontal buttons that I add more stuff to in the real code. If you drag the whitespace between the buttons, the view scrolls. If you happen to put your finger on a button, it won't scroll. After a related suggestion, I tried adding the delaysContentTouches = YES line, but it doesn't seem to make a difference. http://stackoverflow.com/questions/650437/iphone-uiscrollview-with-uibuttons-how-to-recreate-springboard What am I doing wrong? TIA, Rob. Updated the code:

        - (void)viewDidLoad {
            l = [self landscapeView];
            [self.view addSubview:l];
            [l release];
        }

        - (UIScrollView *)landscapeView {
            // LANDSCAPE VIEW
            UIScrollView *landscapeView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, 320, 325)];
            landscapeView.backgroundColor = [UIColor whiteColor];
            landscapeView.delaysContentTouches = YES;

            NSInteger iMargin, runningY, n;
            iMargin = 3;
            runningY = iMargin;

            for (n = 1; n <= 38; n++) {
                // add day labels
                UIButton *templabel = [[UIButton alloc] initWithFrame:CGRectMake(iMargin, runningY, 320 - (2 * iMargin), 20)];
                templabel.backgroundColor = [UIColor grayColor];
                [landscapeView addSubview:templabel];
                [templabel release];
                runningY = runningY + 30;
            }

            landscapeView.contentSize = CGSizeMake(320, runningY);
            return landscapeView;
        }
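
    For reference, a minimal sketch of the springboard-style fix discussed in the linked question (an assumption, not the poster's code): subclass UIScrollView so it is allowed to cancel touches that begin on a UIButton; a drag that starts on a button then scrolls the view. canCancelContentTouches must remain YES (the default) for this override to be consulted.

        // Hypothetical sketch: let the scroll view take over drags that start on buttons.
        @interface DraggableScrollView : UIScrollView
        @end

        @implementation DraggableScrollView
        - (BOOL)touchesShouldCancelInContentView:(UIView *)view {
            if ([view isKindOfClass:[UIButton class]]) {
                return YES;   // by default UIScrollView refuses to cancel UIControl touches
            }
            return [super touchesShouldCancelInContentView:view];
        }
        @end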

    Read the article

  • AndEngine VS Android's Canvas VS OpenGLES - For rendering a 2D indoor vector map

    - by Orchestrator
    This is a big issue I have been trying to figure out for a long time already. I'm working on an application that should include a 2D vector indoor map. The map will be drawn from an .svg file that specifies all the data of the lines, curved lines (paths) and rectangles that should be drawn. My main requirements for the map are:
    1. Support touch events to detect where exactly the finger is touching.
    2. Great image quality, especially for the drawing of curved and diagonal lines (anti-aliasing).
    3. Optional but very nice to have: a built-in ability to zoom, pan and rotate.
    So far I have tried AndEngine and Android's Canvas. With AndEngine I had trouble implementing anti-aliasing for rendering smooth diagonal lines or drawing curved lines, and as far as I understand, this is not an easy thing to implement in AndEngine. Though I have to mention that AndEngine's ability to zoom in and pan with the camera instead of modifying the objects on the screen was really nice to have. I also have a little experience with the built-in Android Canvas, mainly with viewing simple bitmaps, but I'm not sure if it supports all of these things, and especially whether it would provide smooth results. Last but not least, there's the option of plain OpenGL ES 1 or 2, which as far as I understand should, with enough work, be able to support all the features I require. However, it seems like something that would be hard to implement, and I've never programmed in OpenGL or anything like it, but I'm very willing to learn. To sum it all up, I need a platform that provides the ability to do the 3 things I mentioned above, but also, very importantly, allows me to implement this feature as fast as possible. Any kind of answer or suggestion would be very much welcome, as I'm very eager to solve this problem! Thanks!

    Read the article

  • NSUndoManager grouping problem?

    - by anonymous
    I'm working on a bare-bones drawing app. I'm attempting to implement undo/redo capability, so I tell the view's undoManager to save the current image before updating the display. This works perfectly (yes, I understand that redrawing/saving the entire view is not incredibly efficient, but I want to solve this problem before attempting to optimize the code). However, as expected, when I undo or redo, only the minute change is reflected. My goal is to have the whole finger stroke undone/redone. To do that, I told the undoManager to beginUndoGrouping in the touchesBegan method, and to endUndoGrouping in touchesEnded. That works for a bit, but after drawing a few strokes the app crashes, and gdb exits with EXC_BAD_ACCESS. I'm very grateful for any insight you can give me.

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            mouseDragged = YES;
            currentPoint = [[touches anyObject] locationInView:self];

            UIGraphicsBeginImageContext(drawingImageView.bounds.size);
            [drawingImageView.image drawInRect:drawingImageView.bounds];

            CGContextRef ctx = UIGraphicsGetCurrentContext();
            CGContextSetLineCap(ctx, kCGLineCapRound);
            CGContextSetLineWidth(ctx, drawingWidth);
            [drawingColor setStroke];
            CGContextBeginPath(ctx);
            CGContextMoveToPoint(ctx, previousPoint.x, previousPoint.y);
            CGContextAddLineToPoint(ctx, currentPoint.x, currentPoint.y);
            CGContextStrokePath(ctx);

            [self.undoManager registerUndoWithTarget:drawingImageView
                                            selector:@selector(setImage:)
                                              object:drawingImageView.image];

            drawingImageView.image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();

            previousPoint = currentPoint;
        }
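
    For reference, a minimal sketch of an alternative that avoids grouping altogether (this is not the poster's code; imageBeforeStroke is an assumed retained UIImage property): snapshot the image once when the stroke starts and register a single undo action when it ends, so each undo step is a whole stroke and no begin/end grouping calls need to stay balanced.

        // Hypothetical sketch: one undo registration per stroke.
        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            self.imageBeforeStroke = drawingImageView.image;   // snapshot once per stroke
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            // Undoing restores the image as it was before this stroke began.
            [self.undoManager registerUndoWithTarget:drawingImageView
                                            selector:@selector(setImage:)
                                              object:self.imageBeforeStroke];
        }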

    Read the article

  • Problem with UIScrollView

    - by leon
    Hi, sorry for the long-winded post. I am trying to understand UIScrollView and am running into a very simple problem. I am creating a scroll view and making the view 1.5 times larger than its normal size. Using UIScrollView, I expect to see some edge elements of the view out of bounds, but I should be able to pan the view and therefore bring the missing elements back into the visible area. However, I am seeing that I can't just pan/scroll the view any way I want; instead, the view always wants to scroll up as soon as I move my finger away from the screen (touch end event). I am not handling any touches, etc. I just want to understand why the scaled view does not stay put where I scroll it.

        CGRect viewFrame = self.view.frame;
        viewFrame.size.width *= 1.5;
        viewFrame.size.height *= 1.5;
        CGSize mySize = viewFrame.size;
        [((UIScrollView *)self.view) setContentSize:mySize];
        self.view.transform = CGAffineTransformMakeScale(1.5, 1.5);

    What I am really trying to accomplish is something similar to Numbers on the iPad (the same code will work on the iPhone): there is a view with lots of controls on it (an order entry form); the user can zoom into the entire form so all elements look bigger; the user can pan the form, therefore bringing various elements into the visible area of the screen. It seems that UIScrollView should be able to handle zoom and pan actions (for now I am using an affine transform to zoom in to the order entry form on the iPad). Thanks
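
    For reference, a minimal sketch of the usual approach (an assumption, not the poster's code): put the form in a separate content view, let UIScrollView own the zoom via its delegate, and drop the manual affine transform. contentView here is an assumed container holding the whole form, added as the scroll view's only subview.

        // Hypothetical sketch: UIScrollView handles both pan and pinch-zoom itself.
        - (void)viewDidLoad {
            [super viewDidLoad];
            UIScrollView *scrollView = (UIScrollView *)self.view;
            scrollView.contentSize = contentView.bounds.size;
            scrollView.minimumZoomScale = 1.0;
            scrollView.maximumZoomScale = 1.5;
            scrollView.delegate = self;       // controller adopts UIScrollViewDelegate
        }

        - (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
            return contentView;               // UIScrollView scales and pans this view
        }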

    Read the article

  • touchesBegan & touchesMoved Xcode Obj-C Question

    - by AndrewDK
    So I have my app working well when you press and drag along. I also have UIButtons set to Touch Down in Interface Builder. Also, currently you need to start the drag from outside the UIButton; you cannot press on one UIButton and drag to the other.

    TOUCHES MOVED:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event touchesForView:self.view] anyObject];
            CGPoint location = [touch locationInView:touch.view];

            if (CGRectContainsPoint(oneButton.frame, location)) {
                if (!oneButton.isHighlighted) {
                    [self oneFunction];
                    [oneButton setHighlighted:YES];
                }
            } else {
                [oneButton setHighlighted:NO];
            }

            if (CGRectContainsPoint(twoButton.frame, location)) {
                if (!twoButton.isHighlighted) {
                    [self twoFunction];
                    [twoButton setHighlighted:YES];
                }
            } else {
                [twoButton setHighlighted:NO];
            }
        }

    TOUCHES BEGAN:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event touchesForView:self.view] anyObject];
            CGPoint location = [touch locationInView:touch.view];

            if (CGRectContainsPoint(oneButton.frame, location)) {
                [self oneFunction];
                [oneButton setHighlighted:YES];
            }
            if (CGRectContainsPoint(twoButton.frame, location)) {
                [self twoFunction];
                [twoButton setHighlighted:YES];
            }
        }

    I want to be able to click on any of the buttons and fire its function, and also be able to drag from one button onto the other and fire that function. So basically, just being able to click on a button and slide your finger over to activate the other button, without having to press and slide from outside of the button. I think I'm close; I need a bit of help. Hope that's clear enough. Thanks.

    Read the article

  • How to scale a sprite image without losing color key information?

    - by Michael P
    Hello everyone, I'm currently developing a simple application that displays a map and draws some markers on it. I'm developing for Windows Mobile, so I decided to use the DirectDraw and Imaging interfaces to make the application fast and pretty. The map moves when the user moves a finger on the touchscreen, so the whole map moving/scrolling animation has to be fast, but it is not. On every map update I have to draw a portion of the map, the control buttons, and the markers; the buttons and markers are preloaded on a DirectDraw surface as a mipmap. So the only thing I do is blit from the mipmap to a back buffer, and from the back buffer to the primary surface (I can't use page flipping due to the windowed mode of my application). Previously I used a premultiplied-alpha surface with a 32-bit ARGB pixel format for the image mipmap; everything looked good, but drawing the entire "scene" was horribly slow, so I could forget about smooth map scrolling. Now I'm using a mipmap with the native (RGB565) pixel format and a fuchsia (0xFF00FF) color key, and drawing is much better. My mipmap surface is generated at program load: images are loaded from files, scaled (with filtering) and drawn onto the mipmap. The problem is that the image scaling process blends pixel colors, and those pixels which are on the border of a sprite region are blended with the surrounding fuchsia pixels, resulting in semi-fuchsia colors that are not treated as the color key. When I blit with the color key option, sprites have small fuchsia-like borders, and it looks really bad. How do I solve this problem? I can use alpha blitting, but it is too slow, even in the ARGB 1555 format.

    Read the article

  • Shared Server Dreamhost

    - by Jseb
    I am trying to install an app on a shared server. If I understand properly, because I am using a shared server and Dreamhost doesn't support Rails 3.2.8, I must use FCGI, although I am not sure how to install it and make it run properly. I am following this tutorial: http://wiki.dreamhost.com/Rails_3. To my understanding, here is what I did:
    1. In Dreamhost, activate PHP 5.x.x FastCGI and make sure Phusion Passenger is unchecked.
    2. Create an app on my local machine.
    3. Because Rails doesn't create dispatch and access files, create the two following files in my /public folder.

    dispatch.fcgi:

        #!/home/username/.rvm/rubies/ruby-1.9.3-p327/bin/ruby
        ENV['RAILS_ENV'] ||= 'production'
        ENV['HOME'] ||= `echo ~`.strip
        ENV['GEM_HOME'] = File.expand_path('~/.rvm/gems/ruby 1.9.3-p327')
        ENV['GEM_PATH'] = File.expand_path('~/.rvm/gems/ruby 1.9.3-p327') + ":" +
                          File.expand_path('~/.rvm/gems/ruby 1.9.3-p327@global')

        require 'fcgi'
        require File.join(File.dirname(__FILE__), '../config/environment')

        class Rack::PathInfoRewriter
          def initialize(app)
            @app = app
          end

          def call(env)
            env.delete('SCRIPT_NAME')
            parts = env['REQUEST_URI'].split('?')
            env['PATH_INFO'] = parts[0]
            env['QUERY_STRING'] = parts[1].to_s
            @app.call(env)
          end
        end

    Then I created the file .htaccess:

        <IfModule mod_fastcgi.c>
          AddHandler fastcgi-script .fcgi
        </IfModule>
        <IfModule mod_fcgid.c>
          AddHandler fcgid-script .fcgi
        </IfModule>
        Options +FollowSymLinks +ExecCGI
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ dispatch.fcgi/$1 [QSA,L]
        ErrorDocument 500 "Rails application failed to start properly"

    I uploaded the app to a folder and pointed the domain to the public folder in Dreamhost, made sure dispatch.fcgi has 777 (write) permissions, then SSHed in and ran the following command in the public folder: ./dispatch.fcgi
    Crossing my fingers, but it doesn't work; I get the following errors:

        ./dispatch.fcgi: line 1: ENV[RAILS_ENV]: command not found
        ./dispatch.fcgi: line 1: =: command not found
        ./dispatch.fcgi: line 2: ENV[HOME]: command not found
        ./dispatch.fcgi: line 2: =: command not found
        ./dispatch.fcgi: line 3: syntax error near unexpected token `('
        ./dispatch.fcgi: line 3: `ENV['GEM_HOME'] = File.expand_path('~/.rvm/gems/ruby 1.9.3-p327')'

    What am I doing wrong? Oh, and if I go to the server I get this: "Rails application failed to start properly".

    Read the article

  • WCF events in server-side

    - by Eisenrich
    Hi all, I'm working on an application in WCF and want to receive events on the server side. I have a web page that upon request needs to register a fingerprint. The web page requests the connection of the device and then checks for the answer every second for 15 seconds. The server-side code is apparently "simple" but doesn't work. Here it is:

        [ServiceContract]
        interface IEEtest
        {
            [OperationContract]
            void EEDirectConnect();
        }

        class EETest : IEEtest
        {
            public void EEDirectConnect()
            {
                CZ ee = new CZ(); // initiates the device dll
                ee.Connect_Net("192.168.1.200", 4011);
                ee.OnFinger += new _IEEEvents_OnFingerEventHandler(ee_OnFinger);
            }

            public void ee_OnFinger()
            {
                // here i have a breakpoint;
            }
        }

    Every time I put my finger on the reader, it should fire the event. In fact, if I run

        static void Main()
        {
            EETest pp = new EETest();
            pp.EEDirectConnect();
        }

    it works fine, but from my proxy it doesn't fire the event. Do you have any tips or recommendations, or can you see the error? Thanks everyone.

    Read the article

  • Add double tap action (presentModalViewController) to UIScrollView

    - by R.J.
    I have been wrestling this issue for a while now and cannot seem to get the following "touchesEnded" method to execute within a UISCROLLVIEW. I have read on many of the forums that UISCROLLVIEW will take control of all touch events unless it is subclassed, but I cannot seem to get the code right (still new to the SDK). Basically I have a scrollview made uo with several UIIMAGEVIEW's and currenlty have scrolling with paging (much like the photo app). I have been studying the SCROLLING MADDNESS example without success. All I want to do is anywhere in the UISCROLLVIEW have the user double tap to presentModalViewController back to my info page (i.e.) (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { NSSet *allTouches = [event allTouches]; switch ([allTouches count]) { case 1: {// One finger touch UITouch *touch = [[allTouches allObjects] objectAtIndex:0]; if ([touch tapCount] == 2) {InfoButtonViewController *scroll = [[InfoButtonViewController alloc] initWithNibName:nil bundle:nil]; scroll.modalTransitionStyle = UIModalTransitionStyleFlipHorizontal; [self presentModalViewController:scroll animated:YES]; [scroll release]; } } } } Any code assistance would be greatly appreciated. The UISCROLLVIEW is implemented as follows (let me know if I need to provide additional details). Thank you in advance... MyViewController.h @interface MyViewController : UIViewController { } @end

    Read the article

  • jQuery HOW TO?? pass additional parameters to success callback for $.ajax call ?

    - by dotnetgeek
    Hello jQuery Ninjas! I am trying, in vain it seems, to pass additional parameters back to the success callback method that I have created for a successful ajax call. A little background: I have a page with a number of dynamically created textbox/selectbox pairs, each pair having a dynamically assigned unique name such as name="unique-pair-1_txt-url" and name="unique-pair-1_selectBox"; the second pair has the same names but a different prefix. In an effort to reuse code, I have crafted the callback to take the data and a reference to the selectbox. However, when the callback is fired the reference to the selectbox comes back as 'undefined'. I read here that it should be doable. I have even tried taking advantage of the 'context' option, but still nothing. Here is the script block that I am trying to use:

        <script type="text/javascript" language="javascript">
            $j = jQuery.noConflict();

            function getImages(urlValue, selectBox) {
                $j.ajax({
                    type: "GET",
                    url: $j(urlValue).val(),
                    dataType: "jsonp",
                    context: selectBox,
                    success: function(data) {
                        loadImagesInSelect(data, $j(this))
                    },
                    error: function(xhr, ajaxOptions, thrownError) {
                        alert(xhr.status);
                        alert(thrownError);
                    }
                });
            }

            function loadImagesInSelect(data, selectBox) {
                //var select = $j('[name=single_input.<?cs var:op_unique_name ?>.selImageList]');
                var select = selectBox;
                select.empty();
                $j(data).each(function() {
                    var theValue = $j(this)[0]["@value"];
                    var theId = $j(this)[0]["@name"];
                    select.append("<option value='" + theId + "'>" + theValue + "</option>");
                });
                select.children(":first").attr("selected", true);
            }

    From what I have read, I feel I am close, but I just can't put my finger on the missing link. Please help in your typical ninja stealthy ways. TIA

    Read the article

  • On Windows 7: Same path but Explorer & Java see different files than Powershell

    - by Ukko
    Submitted for your approval, a story about a poor little Java process trapped in the twilight zone... Before I throw up my hands and just say that the NTFS partition is borked, is there any rational explanation for what I am seeing? I have a file with a path like this: C:\Program Files\Company\product\config\file.xml. I am reading this file after an upgrade and seeing something wonky. Eclipse and my Java app still see the old version of this file while some other programs see the new version. The test that convinced me it was not my fat finger that caused the problem was this: In Explorer I entered the above path and Explorer displayed the old version of the file. Forcing Explorer to reload via Ctrl-F5 still yields the old version. This is the behavior I get in Java. Now in PowerShell I enter: more "C:\Program Files\Company\product\config\file.xml" (I cut and paste the path from Explorer to make sure I am not screwing anything up) and it shows me the new version of the file. So, for the programming aspect of this: is there a cache or some system component that would be storing this stale reference? Am I responsible for checking or resetting that for some class of files? I can imagine somebody being "creative" in how XML files are processed to provide some bell or whistle. But it could be a case of just being borked. Any insights appreciated... Thanks!

    Read the article

  • Dragging an UIView inside UIScrollView

    - by Sergey Mikhanov
    Hello community! I am trying to solve a basic problem with drag and drop on the iPhone. Here's my setup: I have a UIScrollView which has one large content subview (I'm able to scroll and zoom it). The content subview has several small tiles as subviews that should be dragged around inside it. My UIScrollView subclass has this method:

        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            UIView *tile = [contentView pointInsideTiles:[self convertPoint:point toView:contentView] withEvent:event];
            if (tile) {
                return tile;
            } else {
                return [super hitTest:point withEvent:event];
            }
        }

    The content subview has this method:

        - (UIView *)pointInsideTiles:(CGPoint)point withEvent:(UIEvent *)event {
            for (TileView *tile in tiles) {
                if ([tile pointInside:[self convertPoint:point toView:tile] withEvent:event])
                    return tile;
            }
            return nil;
        }

    And the tile view has this method:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint location = [touch locationInView:self.superview];
            self.center = location;
        }

    This works, but not fully correctly: the tile sometimes "falls down" during the drag. More precisely, it stops receiving touchesMoved: invocations, and the scroll view starts scrolling instead. I noticed that this depends on the drag speed: the faster I drag, the quicker the tile "falls". Any ideas on how to keep the tile glued to the dragging finger? Thanks in advance!
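
    For reference, a minimal sketch of what usually fixes the stolen drag (an assumption about the cause, not the poster's code): once the finger moves fast enough to look like a scroll, the scroll view cancels the tile's touches, so tell the existing UIScrollView subclass not to cancel touches that started on a tile. canCancelContentTouches can stay at its default of YES; this override is consulted per view.

        // Hypothetical sketch for the existing UIScrollView subclass: keep
        // delivering touchesMoved: to a tile once a drag has started on it.
        - (BOOL)touchesShouldCancelInContentView:(UIView *)view {
            if ([view isKindOfClass:[TileView class]]) {
                return NO;   // the scroll view will not take the touch away from the tile
            }
            return [super touchesShouldCancelInContentView:view];
        }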

    Read the article

  • How to make UIButton work like the launcher in SpringBoard when pressed for a long time interval

    - by KayKay
    In my ViewController I am using UIButton which triggers some action when touchedUpInside. Upto here no problem, but i also want to add another action for touchDown event. More precisely i dont have an idea how to add that action to which event, that will work the same as when App-Icon is pressed longer in springboard which causes springboard enter in editing mode.Actually what i want is : when this button is kept holding for ,say about 2 seconds it will pop-up an alertView without touch being heldUp (with finger is still on button). I have tried with 2 NSdate difference, one of which is allocated when touchedDown and other when touchUpInside. Its working and popping alert,but only after touchedUpInside. I want it to show alert without touch being removed. Here is the code. (IBAction)touchedDown:(id)sender { momentTouchedDown = [[NSDate alloc] init];//class variable NSLog(@"touched down"); } - (IBAction)touchUpInside:(id)sender { NSLog(@"touch lifted Up\n"); NSDate *momentLifted = [[NSDate alloc] init]; double timeInterval = [momentLifted timeIntervalSinceDate:momentTouchedDown]; NSLog(@"time lifted = %@, time down = %@, time diff = %@", momentLifted, momentTouchedDown, [NSString stringWithFormat:@"%g",timeInterval]); [momentLifted release]; if(timeInterval > 2.0) { NSLog(@"AlertBox has been fired"); UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Yes" message:@""//msg delegate:self cancelButtonTitle:@"Cancel" otherButtonTitles:@"OK" , nil]; [alert show]; [alert release]; } } Please provide me an insight..thanks for help in advance.
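
    For reference, a minimal sketch of a timer-based approach (an assumption, not the poster's code; holdTimer is an assumed NSTimer property): start a timer on Touch Down and cancel it on Touch Up Inside (and, in a real build, on Touch Up Outside / Touch Drag Exit as well), so the alert fires while the finger is still on the button.

        // Hypothetical sketch: fire after 2 s of holding, cancel on release.
        - (IBAction)touchedDown:(id)sender {
            self.holdTimer = [NSTimer scheduledTimerWithTimeInterval:2.0
                                                              target:self
                                                            selector:@selector(holdTimerFired:)
                                                            userInfo:nil
                                                             repeats:NO];
        }

        - (IBAction)touchUpInside:(id)sender {
            [self.holdTimer invalidate];   // lifted before 2 s: no alert
            self.holdTimer = nil;
        }

        - (void)holdTimerFired:(NSTimer *)timer {
            self.holdTimer = nil;
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Yes"
                                                            message:@""
                                                           delegate:self
                                                  cancelButtonTitle:@"Cancel"
                                                  otherButtonTitles:@"OK", nil];
            [alert show];
            [alert release];
        }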

    Read the article
