Search Results

Search found 16336 results on 654 pages for 'device admin'.

Page 599 of 654

  • iPod library song path access

    - by Narendra Kumar
    I have studied this a lot but have not found a good answer. My problem: I am calculating the beats per minute (BPM) of songs. I use the BASS API for that, and it works for a file in my app's resource folder, but I need the BPM of all the songs in the iPod library. I get each song's URL from the MPMediaItemPropertyAssetURL property of MPMediaItem, but when I pass it to BASS_StreamCreateFile() the API reports that the stream can't load. My suspicion is that I am not getting a valid path to the song. How can I get a valid path? Has anyone accessed iPod library songs with an external API? Please help. Thanks. The code is this:

    ```objc
    NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
    NSString *respath = [NSString stringWithFormat:@"%@", [assetURL absoluteString]];

    BASS_SetConfig(BASS_CONFIG_IOS_MIXAUDIO, 0); // Disable mixing. To be called before BASS_Init.
    if (HIWORD(BASS_GetVersion()) != BASSVERSION) {
        NSLog(@"An incorrect version of BASS was loaded");
    }

    // Initialize the default device.
    if (!BASS_Init(-1, 44100, 0, NULL, NULL)) {
        //textView.text = [NSString stringWithFormat:@"%@ CAN'T Load Stream", textView.text];
    }

    DWORD chan1;
    if (!(chan1 = BASS_StreamCreateFile(FALSE, [respath UTF8String], 0, 0, BASS_SAMPLE_LOOP))) {
        NSLog(@"Can't load stream!");
        textView.text = [NSString stringWithFormat:@"%@ not loading...", textView.text];
    }

    mainStream = BASS_StreamCreateFile(FALSE, [respath cStringUsingEncoding:NSUTF8StringEncoding],
                                       0, 0, BASS_SAMPLE_FLOAT | BASS_STREAM_PRESCAN | BASS_STREAM_DECODE);
    float playBackDuration = BASS_ChannelBytes2Seconds(mainStream,
                                 BASS_ChannelGetLength(mainStream, BASS_POS_BYTE));
    NSLog(@"Play back duration is %f", playBackDuration);

    HSTREAM bpmStream = BASS_StreamCreateFile(FALSE, [respath UTF8String], 0, 0,
                            BASS_STREAM_PRESCAN | BASS_SAMPLE_FLOAT | BASS_STREAM_DECODE);
    //BASS_ChannelPlay(bpmStream, FALSE);
    BpmValue = BASS_FX_BPM_DecodeGet(bpmStream, 0.0, playBackDuration, MAKELONG(45, 256),
                   BASS_FX_BPM_MULT2 | BASS_FX_FREESOURCE, (BPMPROCESSPROC*)proc);
    textView.text = [NSString stringWithFormat:@"%@ %f", textView.text, BpmValue];
    ```
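
    One workaround worth noting (my addition, not from the original post): the URL from MPMediaItemPropertyAssetURL is an ipod-library:// URL, not a filesystem path, so an API that expects an ordinary file cannot open it. A minimal sketch, assuming iOS 4.0 or later with AVFoundation linked, that exports the item to a real file which BASS can then open:

    ```objc
    #import <AVFoundation/AVFoundation.h>

    // Sketch: export an iPod library item to a local file (passthrough copy).
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    AVAssetExportSession *export =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetPassthrough];
    export.outputFileType = AVFileTypeQuickTimeMovie; // passthrough keeps the source codec
    NSString *outPath = [NSTemporaryDirectory()
                         stringByAppendingPathComponent:@"exported-song.mov"];
    export.outputURL = [NSURL fileURLWithPath:outPath];
    [export exportAsynchronouslyWithCompletionHandler:^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            // outPath is now an ordinary file; hand it to BASS_StreamCreateFile().
        }
    }];
    ```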

    Read the article

  • UIView controls unresponsive, or how to foul up a view hierarchy

    - by user293139
    Hello all, I'm working on an app that has two sections: a config section and a results section. My config section needs to be two separate views, horizontal and vertical (and yes, I can hear the intake of breath from here), with one rotatable view for the results. Because of layout constraints and a lot of pain around rotation, I'm not using a navigation controller. I've been experimenting with rotation and have settled on keeping my views as subviews of my view controller's view, i.e. controller.view.subviews = configH, configV, and results. I then use [controller.view bringSubviewToFront:] to bring configH, configV, or the results view to the front. Rotation works (cue the angelic choirs)... almost. What's happening is that my configV buttons are responsive, but when the device (or simulator) is rotated, my configH controls are not. (configV is the second subview added, but the first one brought to the front, because the app comes up in portrait mode.) The controls on the results view also work. Plan B was to assign controller.view directly to configH, configV, or results. All of my controls then work, but rotation is fouled up. Question 1: Is there a better way to do this (a horizontal and a vertical config view plus a rotatable results view)? Question 2: Does the above suggest a design issue, or is it more likely that my addled brain is just missing something in my own code? (Nothing from the peanut gallery, please.) Many thanks!
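
    A common cause of exactly this symptom (my assumption, not something stated in the question): a subview that was laid out for the other orientation can end up with its controls positioned outside the parent view's current bounds, and touches outside a parent's bounds are never delivered even though the controls may still be drawn. A minimal sketch that keeps the front view's frame in sync while swapping:

    ```objc
    // Sketch only: configH and configV are assumed to be ivars of the controller.
    - (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)orientation
                                             duration:(NSTimeInterval)duration {
        UIView *front = UIInterfaceOrientationIsLandscape(orientation) ? configH : configV;
        front.frame = self.view.bounds; // a stale frame leaves buttons outside the
                                        // hit-testable area, so they ignore touches
        [self.view bringSubviewToFront:front];
    }
    ```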

    Read the article

  • iPhone: Leak with UIWebView loading Office documents. Any ideas how to avoid it?

    - by Thomas Tempelmann
    While there are already quite a few posts about leaks around UIWebView, mine is a bit more specific, I believe, and thus deserves its own post here. I see a reproducible, large leak every time I load an Office document such as a Word or Excel file. For instance, every time I display a 180 KB .doc file, I get a 100 KB leak. That happens on both the simulator and an actual device running OS 3.1.3. The leak is not visible with the Leaks instrument, only by looking at the malloc instances via the ObjectAlloc instrument (screenshot from the Instruments trace not reproduced here). I've also made a demo project, UIWebView-Leak.zip, so you can verify this yourself. To see the leak, use the ObjectAlloc instrument, switch to the view where you see individual allocation objects, and sort by size so that the large ones appear as a group. Then view an Office document a few times and find the Malloc objects that stay "Live" even after the actual UIWebView has been freed. Is this a known bug? Or is there any way I can avoid these leaks? I.e., have you successfully shown Office documents on an iPhone without getting such leaks? Note: I've now reported this to Apple as a bug (ID 7950594). I am still waiting for someone (including Apple) to confirm this as a true leak, or to show why it isn't (i.e. that I am doing something wrong or making wrong assumptions).

    Read the article

  • Inheritance issue with ivar on the iPhone

    - by Buffalo
    I am using the BLIP/MYNetwork library to establish a basic TCP socket connection between the iPhone and my computer. So far the code builds and runs correctly in the simulator, but deploying to the device yields the following error:

    ```
    error: property 'delegate' attempting to use ivar '_delegate' declared in super class of 'TCPConnection'
    ```

    ```objc
    @interface TCPConnection : TCPEndpoint
    {
        @private
        TCPListener *_server;
        IPAddress *_address;
        BOOL _isIncoming, _checkedPeerCert;
        TCPConnectionStatus _status;
        TCPReader *_reader;
        TCPWriter *_writer;
        NSError *_error;
        NSTimeInterval _openTimeout;
    }

    /** The delegate object that will be called when the connection opens, closes or receives messages. */
    @property (assign) id<TCPConnectionDelegate> delegate;
    @end

    /** The delegate messages sent by TCPConnection. All methods are optional. */
    @protocol TCPConnectionDelegate <NSObject>
    @optional
    /** Called after the connection successfully opens. */
    - (void) connectionDidOpen: (TCPConnection*)connection;
    /** Called after the connection fails to open due to an error. */
    - (void) connection: (TCPConnection*)connection failedToOpen: (NSError*)error;
    /** Called when the identity of the peer is known, if using an SSL connection and the
        SSL settings say to check the peer's certificate. This happens, if at all, after
        the -connectionDidOpen: call. */
    - (BOOL) connection: (TCPConnection*)connection authorizeSSLPeer: (SecCertificateRef)peerCert;
    /** Called after the connection closes. You can check the connection's error property
        to see if it was normal or abnormal. */
    - (void) connectionDidClose: (TCPConnection*)connection;
    @end

    @interface TCPEndpoint : NSObject
    {
        NSMutableDictionary *_sslProperties;
        id _delegate;
    }
    - (void) tellDelegate: (SEL)selector withObject: (id)param;
    @end
    ```

    Does anyone know how I would fix this? Would I simply declare _delegate as a public property of the base class TCPEndpoint? Thanks for the help, y'all!
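
    For context (my explanation, not part of the question): the device targets the modern Objective-C runtime, where a property synthesized in a subclass cannot be backed by an ivar declared in a superclass, which is what the error says; the 32-bit simulator used the older fragile ABI and let it slide. One sketch of a fix, assuming _delegate remains visible to subclasses of TCPEndpoint, is to hand-write the accessors instead of synthesizing them:

    ```objc
    // In TCPConnection's @implementation (a sketch, not the library's actual fix):
    @dynamic delegate; // suppress synthesis; we provide the accessors ourselves

    - (id<TCPConnectionDelegate>) delegate {
        return _delegate; // inherited from TCPEndpoint
    }

    - (void) setDelegate: (id<TCPConnectionDelegate>)delegate {
        _delegate = delegate; // matches the property's 'assign' semantics
    }
    ```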

    Read the article

  • Getting Hprof dump for other processes from application code

    - by Natarajan
    Hi, in my application I have an option to capture an hprof dump. I used android.os.Debug.dumpHprofData(String fileName). Initially I thought the hprof data generated by this method was for the entire device, which is not so: it is only for my own process. Now I am trying to generate hprof data for the other processes as well; I need to get the hprof dump for all running processes from application code. From the adb shell I tried kill -10 <pid>, which generates the hprof file for the corresponding process in the /data/misc folder. The problem is that this command works perfectly from the adb shell prompt, but I am not able to embed it in my code. My code is like:

    ```java
    Runtime.getRuntime().exec("chmod 777 /data/misc");
    Runtime.getRuntime().exec("kill -10 " + pid);
    ```

    No exceptions are thrown, but somehow it is not working. The same code does capture an hprof dump for my own process when I pass my own process ID. I also tried android.os.Process.sendSignal(pid, android.os.Process.SIGNAL_USR1) and hit the same problem: it captures an hprof dump for my process, but not for other processes. Do we need any special permission to signal another process from our own? Or is it a build issue? Can you please suggest a possible way to get hprof dumps for other processes from application code? Thanks
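
    A likely explanation (my assumption; the question does not confirm it): each Android app runs under its own Linux UID, and the kernel only delivers signals between processes of the same UID (or from root), so an unprivileged app cannot send SIGUSR1 to another app's process; the exec() call fails silently because its exit status is never checked. A small diagnostic sketch:

    ```java
    // Sketch: run the signal command and surface its exit status instead of
    // ignoring it; a non-zero exit here usually means EPERM (not our UID).
    // 'targetPid' is a hypothetical variable holding the other process's pid.
    try {
        Process p = Runtime.getRuntime().exec(
                new String[] { "kill", "-10", Integer.toString(targetPid) });
        int exit = p.waitFor();
        Log.d("HprofDump", "kill -10 " + targetPid + " exited with " + exit);
    } catch (Exception e) {
        Log.e("HprofDump", "sending signal failed", e);
    }
    ```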

    Read the article

  • Webcam video stream processing

    - by vikramtheone
    Hi guys, I'm working on an image processing project whose final goal is to detect features in a real-time video stream and then track those features. I will be working with an embedded processor platform, Freescale's i.MX515, which is a 32-bit media processor running Ubuntu 9.04. Right now I'm working on the algorithms to locate the features, so I'm using still images. When I'm satisfied with the results I will have to start using a video stream, and I don't want to use a video file as the source, because then I would have to worry about video decoders. Instead I would like to plug a USB webcam into the embedded platform (it has USB ports), take the frames directly as they are captured, and send them to my application. I will take care to buy a webcam that is supported on Linux (i.e. has a device driver). But my question is: will I be able to capture the incoming video stream from the webcam and send it to my application? Will I be able to configure the webcam and DMA to write the incoming frames to a particular memory location whose pointer I can simply pass to my application? I hope I have conveyed my doubts; can anyone guide me through the steps to achieve this? Do you foresee any impossibility here? Regards, Vikram
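
    On Linux this is what the Video4Linux2 (V4L2) capture API provides (my suggestion; the question does not name an API): the driver owns a small ring of frame buffers and mmap()s them into the application's address space, so frames arrive without an extra copy, which is close to the DMA-to-known-address arrangement asked about. A minimal sketch in C, assuming the camera appears as /dev/video0 via the standard UVC driver and omitting error handling:

    ```c
    #include <fcntl.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        int fd = open("/dev/video0", O_RDWR);

        struct v4l2_format fmt;
        memset(&fmt, 0, sizeof fmt);
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width       = 640;
        fmt.fmt.pix.height      = 480;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        ioctl(fd, VIDIOC_S_FMT, &fmt);          /* negotiate the frame format */

        struct v4l2_requestbuffers req;
        memset(&req, 0, sizeof req);
        req.count  = 4;                         /* ring of 4 driver-owned buffers */
        req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_REQBUFS, &req);

        /* each buffer is then VIDIOC_QUERYBUF'ed and mmap()ed; after
           VIDIOC_STREAMON, VIDIOC_DQBUF yields a pointer to a fresh frame */
        return 0;
    }
    ```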

    Read the article

  • Determining the color of a pixel in a bitmap using C# in a WPF app

    - by DanM
    The only way I found so far is System.Drawing.Bitmap.GetPixel(), but Microsoft has warnings for System.Drawing that are making me wonder if this is the "old way" to do it. Are there any alternatives? Here's what Microsoft says about the System.Drawing namespace. I also noticed that the System.Drawing assembly is not automatically added to the references when I create a new WPF project.

    "System.Drawing Namespace: The System.Drawing namespace provides access to GDI+ basic graphics functionality. More advanced functionality is provided in the System.Drawing.Drawing2D, System.Drawing.Imaging, and System.Drawing.Text namespaces. The Graphics class provides methods for drawing to the display device. Classes such as Rectangle and Point encapsulate GDI+ primitives. The Pen class is used to draw lines and curves, while classes derived from the abstract class Brush are used to fill the interiors of shapes. Caution: Classes within the System.Drawing namespace are not supported for use within a Windows or ASP.NET service. Attempting to use these classes from within one of these application types may produce unexpected problems, such as diminished service performance and run-time exceptions." (http://msdn.microsoft.com/en-us/library/system.drawing.aspx)
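
    For a WPF-native alternative (a sketch of mine, not the poster's code): any BitmapSource can be sampled with CopyPixels, which avoids System.Drawing entirely:

    ```csharp
    using System.Windows;
    using System.Windows.Media;
    using System.Windows.Media.Imaging;

    static class PixelReader
    {
        public static Color GetPixel(BitmapSource source, int x, int y)
        {
            // Normalize to a known 32-bit BGRA layout first.
            var bgra = new FormatConvertedBitmap(source, PixelFormats.Bgra32, null, 0);
            byte[] pixel = new byte[4];                        // B, G, R, A
            bgra.CopyPixels(new Int32Rect(x, y, 1, 1), pixel, 4, 0);
            return Color.FromArgb(pixel[3], pixel[2], pixel[1], pixel[0]);
        }
    }
    ```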

    Read the article

  • Bilinear interpolation - DirectX vs. GDI+

    - by holtavolt
    I have a C# app for which I've written GDI+ code that uses Bitmap/TextureBrush rendering to present 2D images, which can have various image processing functions applied. This code is a new path in an application that mimics existing DX9 code, and they share a common library to perform all vector and matrix (e.g. ViewToWorld/WorldToView) operations. My test bed consists of DX9 output images that I compare against the output of the new GDI+ code. A simple test case that renders to a viewport matching the Bitmap dimensions (i.e. no zoom or pan) does match pixel-perfect (no binary diff), but as soon as the image is zoomed up (magnified), I get very minor differences in 5-10% of the pixels. The magnitude of the difference is 1 (occasionally 2)/256. I suspect this is due to interpolation differences.

    Question: For a DX9 ortho projection (and identity world space), with a camera perpendicular to and centered on a textured quad, is it reasonable to expect DirectX.Direct3D.TextureFilter.Linear to generate identical output to a GDI+ TextureBrush-filled rectangle/polygon when using the System.Drawing.Drawing2D.InterpolationMode.Bilinear setting?

    For this (magnification) case, the DX9 code is using this (MinFilter, MipFilter set similarly):

    ```csharp
    Device.SetSamplerState(0, SamplerStageStates.MagFilter, (int)TextureFilter.Linear);
    ```

    and the GDI+ path is using:

    ```csharp
    g.InterpolationMode = InterpolationMode.Bilinear;
    ```

    I thought that "bilinear interpolation" was a fairly specific filter definition, but then I noticed that there is another option in GDI+, "HighQualityBilinear" (which I've tried, with no difference; that makes sense given the description "added prefiltering for shrinking").

    Followup question: Is it reasonable to expect pixel-perfect output matching between DirectX and GDI+ (assuming all external coordinates passed in are equal)? If not, why not? Finally, there are a number of other APIs I could be using (Direct2D, WPF, GDI, etc.), and this question generally applies to comparing the output of "equivalent" bilinear-interpolated images across any two of these. Thanks!

    Read the article

  • How do I get the WVGA Android browser to stop scaling my images?

    - by Dan Fabulich
    I'm designing an HTML page for display in Android browsers. Consider this simple example page:

    ```html
    <html>
      <head><title>Simple!</title></head>
      <body>
        <p><img src="http://sstatic.net/so/img/logo.png"></p>
      </body>
    </html>
    ```

    It looks just fine on the standard HVGA phones (320x480), but on HDPI WVGA sizes (480x800 or 480x854) the built-in browser automatically scales the image up; it looks ugly. I've read that I should be able to use this tag to force the browser to stop scaling my page:

    ```html
    <meta name="viewport" content="width=device-width; initial-scale=1.0; maximum-scale=1.0; minimum-scale=1.0; user-scalable=0;" />
    ```

    ...but all that does is disable user scaling (the zoom buttons disappear); it doesn't actually prevent the browser from scaling my image. Adjusting the scale factors (setting them all to 2.0 or 0.5) has no effect at all. How can I force the WVGA browser to stop scaling my images?
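
    One viewport property worth trying (an Android-specific extension, so hedged accordingly: support varies by version) is target-densitydpi, which asks the browser to map CSS pixels 1:1 onto device pixels instead of density-scaling the page:

    ```html
    <!-- Sketch: disable density scaling on Android's built-in browser. -->
    <meta name="viewport"
          content="width=device-width, target-densitydpi=device-dpi, initial-scale=1.0" />
    ```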

    Read the article

  • How to convert Unicode strings (\u00e2, etc) into NSString for display?

    - by karlbecker_com
    I am trying to support arbitrary Unicode from a variety of international users. They have already put a bunch of data into sqlite databases on their iPhones, and now I want to capture that data into a central database, then send it back to their devices. Right now I am using a PHP page that sends data back from an internet MySQL database. The data is saved in the MySQL database properly, but when it's sent back it comes out as escaped Unicode text, such as Frank\u00e2\u0080\u0099s iPad instead of just Frank's iPad, where the apostrophe should really be a curly apostrophe. The answer posted to another question indicates that there are no built-in Cocoa methods to convert the "\u00e2\u0080\u0099" portion of the Unicode string from the webserver into an NSString. Is this correct? That seems really surprising (and scarily disappointing), since Cocoa definitely allows input of many different Unicode characters, and I need to support any arbitrary language, including ones I have never heard of, with all their possible characters. I save them to and from the local sqlite database just fine now, but once I send the data to a web server, then pull different data down, I want to ensure the data pulled from the web server is correctly formatted.
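
    What those escapes suggest (my reading, not something stated in the question): \u00e2, \u0080 and \u0099 are the three UTF-8 bytes of U+2019 (the curly apostrophe), each escaped as if it were its own code point, i.e. the server escaped bytes rather than characters. If fixing the PHP side is not an option, a sketch of undoing it on the Cocoa side after ordinary \u unescaping:

    ```objc
    // Sketch: rebuild the mangled string's bytes and re-read them as UTF-8.
    NSString *mangled = [NSString stringWithFormat:@"Frank%C%C%Cs iPad",
                         (unichar)0x00e2, (unichar)0x0080, (unichar)0x0099];
    NSData *bytes = [mangled dataUsingEncoding:NSISOLatin1StringEncoding];
    NSString *fixed = [[[NSString alloc] initWithData:bytes
                                             encoding:NSUTF8StringEncoding] autorelease];
    // fixed now contains a real curly apostrophe (U+2019): Frank’s iPad
    ```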

    Read the article

  • Streaming Youtube Videos

    - by Vinay
    Hi all, I am writing an application to play YouTube videos using streaming. First method: I get the RTSP URL for the video using the GData APIs. Here is the code to play the RTSP URL:

    ```java
    VideoView mVideoView = new VideoView(this);
    setContentView(mVideoView);
    mVideoView.setVideoURI(Uri.parse(
            "rtsp://rtsp2.youtube.com/CiILENy73wIaGQkDwpjrUxOWQBMYESARFEgGUgZ2aWRlb3MM/0/0/0/video.3gp"));
    mVideoView.start();
    ```

    But it throws an error on both a G1 device and the emulator (the emulator has a firewall problem, per the mailing list). Here is the error message:

    ```
    ERROR/PlayerDriver(35): Command PLAYER_INIT completed with an error or info PVMFFailure
    ```

    Second method: a hack that extracts the path of the 3gp file from http://www.youtube.com/get_video?v=<video_id>&t=<token>. After getting the file path I can call setVideoURI() and it plays fine, but it is a hacky way to achieve the requirement. I checked the YouTube app with logcat, and it plays URLs the same hacky way. I have tried changing from VideoView to MediaPlayer, but the error is unchanged. Is there a "clean" way to do this? Please let me know your thoughts.

    Read the article

  • Having trouble with confusing behaviour of error between debug and release modes in Xcode

    - by Cocorico
    Hi guys, I am confused over something (what is new!). I have an iPhone program that uses some sqlite in a certain method, and some error there is giving me the message: Program received signal: "EXC_BAD_ACCESS". Okay, so I am trying to hunt down why, and I notice something: when I run the program in debug mode, it gives me this error every single time I access this method (I test on the device). However, when I run the program in release mode, I can access this method twice, and it gives me the error the third time. Can someone give me an explanation of what might cause this? I think that, deep down, I am not that clear on the difference between debug and release modes in Xcode. I believe release mode does optimizing, and I guess the actual machine code comes out different, yes? I am a big newbie, unfortunately! I am not clear on a lot of things like this, for example whether I need to remove NSLog calls in the release build. Maybe I should just post the actual code in a separate Stack Overflow post and see if people can spot the error; maybe then this will all become clear to me.

    Read the article

  • Android Video Layout and backbutton to activity

    - by Marcjc
    I have an application where you can click on a button, which takes you to a new activity with four new buttons: listen, bio, ringtone, and watch. My watch button kicks off the following code:

    ```java
    Button cmd_watchme = (Button) this.findViewById(R.id.watch);
    cmd_watchme.setOnClickListener(new View.OnClickListener() {
        public void onClick(View view) {
            setContentView(R.layout.tvvideo);
            VideoView video = (VideoView) findViewById(R.id.VideoView);
            MediaController mediaController = new MediaController(andy.this);
            mediaController.setAnchorView(video);
            video.setMediaController(mediaController);
            video.setVideoURI(videopath);
            video.start();
        }
    });
    ```

    After the video is displayed, I want the phone's own back button to take the user back to the four-button selection activity (listen, bio, ringtone, watch). How do I do this? I figured there might be a way to change the content view after the video is displayed back to the main layout of the four-button page, but I could not work it out. When I press the back button on the device, it takes me two levels up, to the main selection activity, not the four-button activity. I hope this was somewhat clear. Thanks for any help.
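
    One common way out (a sketch under my own assumptions, not the poster's code): the hardware back button navigates between activities, not between content views, so giving the video its own activity makes back behave as expected:

    ```java
    // Instead of setContentView(R.layout.tvvideo) inside the click listener,
    // launch a separate activity. VideoActivity is hypothetical; it would set
    // the tvvideo layout and start the VideoView in its onCreate().
    Intent watch = new Intent(andy.this, VideoActivity.class);
    watch.putExtra("videoUri", videopath.toString());
    startActivity(watch); // back now returns to the four-button screen
    ```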

    Read the article

  • Tablet interface for the physically disabled?

    - by Glenn
    My sister has cerebral palsy, which in her case means she has only gross motor control and her speech is slurred. The implications should be obvious: traditional computer/phone/tablet interfaces won't work for her, and she can't speak clearly enough for speech-recognition software to help her at all. She enjoys reading but has difficulty holding a book and turning pages. There are a few options for helping her use a computer, but nothing for tablets or eReaders. That's where you come in. I would like to make (or buy, if such a thing exists) a better interface to an Android tablet that would work for someone with little to no physical dexterity or speech ability. I'd also be interested in work on the Kindle or iPad, but I'm most familiar with Android, so I'm starting there. I know Android has Bluetooth capability. Is it possible to interface a joystick to control the Android device? By "control" I mean the entire operating system: selecting an app, launching it, controlling the menus, etc. I want to give her control over the whole thing, not just a specific app. On a PC this can be accomplished by creating a generic USB HID interface and an arcade joystick to move the mouse over the screen and click on things. Is it possible to do something like that on Android? Any help you can offer would be greatly appreciated. Thanks!

    Read the article

  • Getting auth token for dropbox account from accountmanager in android

    - by user1490880
    I am trying to get an auth token from the AccountManager for a Dropbox account configured on the device. I am using:

    ```java
    accountManager.getAuthToken(account, "DROPBOX", null, Hello.this,
            new GetAuthTokenCallback(), null); // 'account' is the Dropbox account
    ```

    I see an Allow/Deny page. I click Allow, but the callback is never invoked and I don't get the auth token. I did get an auth token for a Google account this way (with a different authTokenType). What am I missing? I am not sure about the authTokenType parameter for Dropbox. Are there any other parameters specific to Dropbox, like the options bundle, that I am missing? Is this approach even possible for Dropbox? For reference, the method signature is:

    ```java
    public AccountManagerFuture<Bundle> getAuthToken(Account account, String authTokenType,
            Bundle options, Activity activity,
            AccountManagerCallback<Bundle> callback, Handler handler)
    ```

    Link: http://developer.android.com/reference/android/accounts/AccountManager.html

    UPDATE: Since we can create a Dropbox account under Accounts & Sync in Settings, I assume there must be a Dropbox authenticator that implements all the methods of AbstractAccountAuthenticator, including getAuthToken(), so Dropbox should support handing out auth tokens. Also, Dropbox uses OAuth 1, whereas AccountManager uses OAuth 2.0, so is this an issue? Can anyone comment on this?

    Read the article

  • Android SDK: hello world does not run

    - by Alex
    I have installed 64-bit Java, Eclipse Classic (Juno) x64 with the ADT plugin, on Windows 7 x64. I did everything according to the manual, then created my first application and launched it. The emulator launches, but hello world does not. I have no idea what I am doing wrong. Does anyone recognize this error or my problem as a whole? Thanks. Console log:

    ```
    [2012-10-06 13:35:42 - test] ------------------------------
    [2012-10-06 13:35:42 - test] Android Launch!
    [2012-10-06 13:35:42 - test] adb is running normally.
    [2012-10-06 13:35:42 - test] Performing com.example.test.MainActivity activity launch
    [2012-10-06 13:35:42 - test] Automatic Target Mode: launching new emulator with compatible AVD 'AVD_41'
    [2012-10-06 13:35:42 - test] Launching a new emulator with Virtual Device 'AVD_41'
    [2012-10-06 13:35:42 - Emulator] Failed to create Context 0x3005
    [2012-10-06 13:35:42 - Emulator] emulator: WARNING: Could not initialize OpenglES emulation, using software renderer.
    [2012-10-06 13:35:42 - Emulator] WARNING: Data partition already in use. Changes will not persist!
    [2012-10-06 13:35:42 - Emulator] WARNING: SD Card image already in use: C:\Users\Zewisa\.android\avd\AVD_41.avd/sdcard.img
    [2012-10-06 13:35:42 - Emulator] WARNING: Cache partition already in use. Changes will not persist!
    [2012-10-06 13:35:42 - Emulator] could not get wglGetExtensionsStringARB   (this line repeats eight times)
    [2012-10-06 13:35:42 - Emulator] emulator: warning: opening audio input failed
    ```

    Read the article

  • android: problem sending mail, SuperNotCalledException thrown

    - by MobileDev123
    Hi, while sending mail from a button click, my device shows an error; a SuperNotCalledException appears in logcat. I am posting the code and the logcat output here. Code:

    ```java
    Intent emailIntent = new Intent(android.content.Intent.ACTION_SEND);
    emailIntent.setType("text/plain");
    String recosubject = getString(R.string.recoSub);
    emailIntent.putExtra(android.content.Intent.EXTRA_SUBJECT, "" + recosubject);
    emailIntent.putExtra(android.content.Intent.EXTRA_TEXT, "" + reco);
    startActivity(Intent.createChooser(intent, "Send mail..."));
    ```

    And here is the logcat output:

    ```
    Uncaught handler: thread main exiting due to uncaught exception
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823): android.app.SuperNotCalledException: Activity {android/com.android.internal.app.ChooserActivity} did not call through to super.onCreate()
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2461)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2512)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at android.app.ActivityThread.access$2200(ActivityThread.java:119)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1863)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at android.os.Handler.dispatchMessage(Handler.java:99)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at android.os.Looper.loop(Looper.java:123)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at android.app.ActivityThread.main(ActivityThread.java:4367)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at java.lang.reflect.Method.invokeNative(Native Method)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at java.lang.reflect.Method.invoke(Method.java:521)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:860)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:618)
    12-29 15:29:14.488: ERROR/AndroidRuntime(6823):     at dalvik.system.NativeStart.main(Native Method)
    ```

    Can anybody tell me what is going wrong in this code?
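
    One thing that stands out in the snippet itself (my observation, offered as a likely fix rather than a confirmed one): the chooser is built from a variable named intent rather than the emailIntent configured just above it, so whatever intent happens to reference is what actually gets launched:

    ```java
    // Build the chooser from the intent that was actually configured:
    startActivity(Intent.createChooser(emailIntent, "Send mail..."));
    ```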

    Read the article

  • AudioRecord problems with non-HTC devices

    - by Marc
    I'm having trouble using AudioRecord. An example, using some code derived from the splmeter project:

    ```java
    private static final int FREQUENCY = 8000;
    private static final int CHANNEL = AudioFormat.CHANNEL_CONFIGURATION_MONO;
    private static final int ENCODING = AudioFormat.ENCODING_PCM_16BIT;
    private int BUFFSIZE = 50;
    private AudioRecord recordInstance = null;
    // ...
    android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
    recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC,
            FREQUENCY, CHANNEL, ENCODING, 8000);
    recordInstance.startRecording();
    short[] tempBuffer = new short[BUFFSIZE];
    int retval = 0;
    while (this.isRunning) {
        for (int i = 0; i < BUFFSIZE - 1; i++) {
            tempBuffer[i] = 0;
        }
        retval = recordInstance.read(tempBuffer, 0, BUFFSIZE);
        // ... process the data
    }
    ```

    This works perfectly on the HTC Dream and the HTC Magic, without any log warnings or errors, but it causes problems on the emulators and on a Nexus One. On the Nexus One it simply never returns useful data; I cannot provide any other useful information, as a remote friend is doing the testing for me. On the emulators (Android 1.5, 2.1 and 2.2) I get weird errors from the AudioFlinger and buffer overflows in the AudioRecordThread, plus a major slowdown in UI responsiveness (even though the recording takes place in a thread separate from the UI). Is there something apparent that I'm doing incorrectly? Do I have to do anything special for the Nexus One hardware?
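
    A common portability fix (my suggestion, not from the post): the last constructor argument is a buffer size in bytes, and a hard-coded 8000 may fall below what some hardware requires, so ask the device for its own minimum:

    ```java
    // Sketch: size the record buffer from the device's reported minimum.
    int minSize = AudioRecord.getMinBufferSize(FREQUENCY, CHANNEL, ENCODING);
    recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC,
            FREQUENCY, CHANNEL, ENCODING, minSize * 2); // headroom above the minimum
    ```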

    Read the article

  • 3.1.3 and 3.2 different behaviour

    - by teo
    I'm using a custom cell with a UITextField in a table view:

    ```objc
    - (UITableViewCell *)tableView:(UITableView *)tableView
             cellForRowAtIndexPath:(NSIndexPath *)indexPath {
        static NSString *CellIdentifier = @"Cell";
        UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
        if (cell == nil) {
            cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleSubtitle
                                           reuseIdentifier:CellIdentifier] autorelease];
            UITextField *txtField = [[UITextField alloc] initWithFrame:CGRectMake(0, 0, 280, 24)];
            txtField.placeholder = @"<Enter Text>";
            txtField.textAlignment = UITextAlignmentLeft;
            txtField.clearButtonMode = UITextFieldViewModeAlways;
            txtField.autocapitalizationType = UITextAutocapitalizationTypeNone;
            txtField.autocorrectionType = UITextAutocorrectionTypeNo;
            [cell.contentView addSubview:txtField];
            [txtField release];
        }
        return cell; // the posted snippet omitted this, but the method must return the cell
    }
    ```

    This works fine and the UITextField covers the cell. When I run this with the 3.2 SDK or on the iPad, the UITextField isn't aligned properly to the left; it overlaps the cell, and I have to use a UITextField width of 270 instead of 280, i.e. CGRectMake(0, 0, 270, 24). It seems something is wrong with the pixel ratio. How can this be fixed? Is there a way to determine the version of the OS the device has (3.1.2, 3.1.3, 3.2 or maybe even 4.0), or can it be done another way? Thank you, Teo
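
    On the version question, one common idiom (a sketch; fieldWidth is a hypothetical variable of mine, and the 3.2 cutoff is only an example):

    ```objc
    // systemVersion is a plain string, so compare numerically rather than lexically.
    CGFloat fieldWidth = 280.0f;
    NSString *osVersion = [[UIDevice currentDevice] systemVersion];
    if ([osVersion compare:@"3.2" options:NSNumericSearch] != NSOrderedAscending) {
        fieldWidth = 270.0f; // 3.2 and later lay the cell out differently
    }
    UITextField *txtField = [[UITextField alloc]
        initWithFrame:CGRectMake(0, 0, fieldWidth, 24)];
    ```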

    Read the article

  • Setting White balance and Exposure mode for iphone camera + enum default

    - by Spectravideo328
    I am using the back camera of an iPhone 4 and doing the standard and lengthy process of creating an AVCaptureSession and adding an AVCaptureDevice to it. Before attaching the AVCaptureDeviceInput of that camera to the session, I am testing my understanding of white balance and exposure, so I am trying this:

    ```objc
    [self.theCaptureDevice lockForConfiguration:nil];
    [self.theCaptureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeLocked];
    [self.theCaptureDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
    [self.theCaptureDevice unlockForConfiguration];
    ```

    1. Given that the various options for white balance mode are in an enum, I would have thought the default is always zero, since the enum typedef variable was never assigned a value. If I breakpoint and po the values in the debugger, I find that the default white balance mode is actually set to 2. Unfortunately, the AVCaptureDevice header does not say what the defaults are for the various camera settings. 2. This might sound silly, but can I assume that once I stop the app, all settings for white balance and exposure mode go back to their defaults, so that if I start another app right after, the camera device is not somehow stuck with those "hardware settings"?
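
    Since the defaults are not documented, one defensive pattern (a sketch of mine, not Apple's stated recommendation) is to query the device rather than assume, and to check support before setting a mode; note too that lockForConfiguration: takes an NSError** and returns a BOOL worth checking:

    ```objc
    NSError *error = nil;
    if ([self.theCaptureDevice lockForConfiguration:&error]) {
        // Log what the device actually starts out with instead of assuming zero.
        NSLog(@"white balance mode before: %d", (int)self.theCaptureDevice.whiteBalanceMode);
        if ([self.theCaptureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
            self.theCaptureDevice.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
        }
        [self.theCaptureDevice unlockForConfiguration];
    }
    ```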

    Read the article

  • SIGABRT on iPhone when changing xib

    - by Boz
    I've just finished an app for the iPhone which, until today, ran fine on the iPhone simulator and on actual devices. I tried changing the xib which is loaded in the applicationDidFinishLaunching method of my application delegate class; all I did was change the string in initWithNibName. When I launch the app in the simulator, the Default.png image is shown, then the app crashes with an uncaught exception. When running on a device, the Default.png image is shown for about 10 seconds, the UI is never loaded, and I get GDB: Program received signal: "SIGABRT" on the Xcode status bar. Debugging shows that applicationDidFinishLaunching is never actually reached before the app crashes. Setting the starting xib back to the original solves the issue, but now I've made a change to that xib and saved it in Interface Builder, and the app shows the same issues as above, even though I've made no code changes at all. Is this a memory issue, or a known symptom of a common mistake? NOTE: I've made no code changes whatsoever, and the only changes I've made to the xib are cosmetic; the IBOutlets are all intact.

    Read the article

  • CGContextDrawImage returning bad access

    - by Marcelo
    Hello guys, I've been trying to blend two UIImages for about two days now and I've been getting some BAD_ACCESS errors. First of all, the two images have the same orientation, and I'm using Core Graphics to do the blending. One curious detail: every time I modify the code, the first time I compile and run it on the device I get to do everything I want without any trouble; once I restart the application, I get the error and the program shuts down. Can anyone shed some light? I tried accessing the baseImage sizes dynamically, but that gives a bad access too. Here's a snippet of how I'm doing the blending:

    ```objc
    UIGraphicsBeginImageContext(CGSizeMake(320, 480));
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextTranslateCTM(context, 0, 480);
    CGContextScaleCTM(context, 1.0, -1.0);
    CGContextDrawImage(context, rect, [baseImage CGImage]);
    CGContextSetBlendMode(context, kCGBlendModeOverlay);
    CGContextDrawImage(context, rect, [tmpImage CGImage]);
    [transformationView setImage:UIGraphicsGetImageFromCurrentImageContext()];
    UIGraphicsEndImageContext();
    ```

    Read the article

  • Providing lat/long AND title in iOS Maps URL seems to cause zoom level to be ignored

    - by Ian Howson
    I'm writing an iOS app that shows a location in Maps upon a user action. I'd like to drop a pin with a description and zoom in to show map detail. If I invoke Maps with the URL http://maps.google.com/maps?q=-33.895851,151.18483+(Some+Description)&z=19 I get a pin labelled 'Some Description', but the zoom level is ignored (this does work on the Google Maps website). If I use http://maps.google.com/maps?q=-33.895851,151.18483&z=19 the zoom works, but I get no pin. I've tried a few combinations of ?q=, ?ll= and ?sll=, but so far nothing will both change the zoom and show a description. Any clues? Just so we're really clear, here are the calls behind my screenshots (not reproduced here). I want this to work on a real device, i.e. with iOS Maps; the simulator opens Google Maps through Safari instead. URL 1:

    ```objc
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:
        @"http://maps.google.com/maps?q=-33.895851,151.18483+(Some+Description)&z=19"]];
    ```

    URL 2:

    ```objc
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:
        @"http://maps.google.com/maps?q=-33.895851,151.18483&z=19"]];
    ```

    relikd's suggestion:

    ```objc
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:
        @"http://maps.google.com/maps?z=19&q=-33.895851,151.18483+(Some+Description)"]];
    ```

    I want the image I see with URL 2, but with a pin and a description.

    Read the article

  • how to edit a text message ( sms ) as it arrives in windows mobile 6 using managed code

    - by x86shadow
    I want to make an application, installed on two Pocket PCs, that sends encrypted text messages from one device and receives and decrypts them on the other. I have already made an application that catches special text messages (starting with !farenc!), and I know how to encrypt/decrypt the messages as well, but I don't know how to edit a text message as it arrives (for decryption). Please help. Thanks in advance, and sorry for my bad English.

    ```csharp
    using System;
    using System.Linq;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.Data;
    using System.Drawing;
    using System.Text;
    using System.Windows.Forms;
    using Microsoft.WindowsMobile;
    using Microsoft.WindowsMobile.PocketOutlook;
    using Microsoft.WindowsMobile.PocketOutlook.MessageInterception;

    namespace SMSDECRYPT
    {
        public partial class Form1 : Form
        {
            MessageInterceptor _SMSCatcher =
                new MessageInterceptor(InterceptionAction.Notify, true);
            MessageCondition _SMSFilter = new MessageCondition();

            public Form1()
            {
                InitializeComponent();
                _SMSFilter.Property = MessageProperty.Body;
                _SMSFilter.ComparisonType = MessagePropertyComparisonType.StartsWith;
                _SMSFilter.CaseSensitive = true;
                _SMSFilter.ComparisonValue = "!farenc!";
                _SMSCatcher.MessageCondition = _SMSFilter;
                _SMSCatcher.MessageReceived +=
                    new MessageInterceptorEventHandler(_SMSCatcher_MessageReceived);
            }

            private void Form1_Load(object sender, EventArgs e)
            {
                //...
            }

            void _SMSCatcher_MessageReceived(object sender, MessageInterceptorEventArgs e)
            {
                SmsMessage mySMS = (SmsMessage)e.Message;
                string sms = mySMS.Body.ToString();
                sms = sms.Substring(8);
                // Decryption
                //...
                // Update the received message and replace it with the decrypted text
                // !!!HELP!!!
            }
        }
    }
    ```

    Read the article

  • Altering an embedded truetype font so it will be useable by Windows GDI

    - by Ritsaert Hornstra
    I am trying to render PDF content to a GDI device context (a 24-bit bitmap, to be exact). Parsing the PDF stream into PDF objects and rendering the PDF commands from the content dictionary works well, including font rendering. Embedded fonts are decompressed from their FontFile streams and "loaded" using AddFontMemResourceEx. Now, some embedded fonts omit TrueType tables that are needed by GDI, like the NAME table. Because of this, I tried to modify the font: I parse the TrueType subset font into its tables, and tables with missing data are modified, with missing tables regenerated with as correct information as possible. I use the Microsoft Font Validator tool to see how "correct" the generated font is. I still get a few errors; for example, for the maxp table the max values are usually too large (it is a subset), and "The xAvgCharWidth field does not equal the calculated value" for the OS/2 table, but errors like these do not stop other embedded fonts from being usable. The fonts embedded with PDFCreator are the problematic ones. Questions: How can I determine what I need to change in the font file in order for GDI to be able to use it? Are there any other font-validation tools that might give me insight into what is still wrong with the font file? If needed, I can make an original font file and an altered font file available for download somewhere.

    Read the article
