Search Results

Search found 11448 results on 458 pages for 'video camera'.


  • MediaRecorder prepare() causes segfault

    - by dwilde1
    Folks, I have a situation where my MediaRecorder instance causes a segfault. I'm working with an HTC Hero, Android 1.5 APIs. I've tried all variations, including 3GPP and H.263 and reducing the video resolution to 320x240. What am I missing? The state machine causes 4 MediaPlayer beeps and then turns on the video camera. Here's the pertinent source.

    UPDATE: Adding surface-creation info. I have rebooted the device, based on a previous answer to a similar question.

    UPDATE 2: I seem to be following the MediaRecorder state machine perfectly, and if I trap out the MR code, the blank surface displays perfectly and everything else functions perfectly. I can record videos manually and play them back via MediaPlayer in my code, so there should be nothing wrong with the underlying code. I've copied sample code for the surface and surfaceHolder setup. I've looked at the MR instance in the Debug perspective in Eclipse and all (known) variables seem to be instantiated correctly. The setter calls are all now implemented in the exact order specified in the state diagram.

    UPDATE 3: I've tried all permission combinations: CAMERA + RECORD_AUDIO + RECORD_VIDEO, CAMERA only, RECORD_AUDIO + RECORD_VIDEO. This is driving me bats! :)))

    // in activity class definition
    protected MediaPlayer mPlayer;
    protected MediaRecorder mRecorder;
    protected boolean inCapture = false;
    protected int phaseCapture = 0;
    protected int durCapturePhase = INF;
    protected SurfaceView surface;
    protected SurfaceHolder surfaceHolder;

    // in onCreate()
    // panelPreview is an empty LinearLayout
    surface = new SurfaceView(getApplicationContext());
    surfaceHolder = surface.getHolder();
    surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    panelPreview.addView(surface);

    // in timer handler runnable
    if (mRecorder == null) mRecorder = new MediaRecorder();
    mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    mRecorder.setOutputFile(path + "/" + vlip);
    mRecorder.setVideoSize(320, 240);
    mRecorder.setVideoFrameRate(15);
    mRecorder.setPreviewDisplay(surfaceHolder.getSurface());
    panelPreview.setVisibility(LinearLayout.VISIBLE);
    mRecorder.prepare();
    mRecorder.start();

    Here is a complete log trace for the process run and crash:

    I/ActivityManager( 80): Start proc com.ejf.convince.jenplus for activity com.ejf.convince.jenplus/.JenPLUS: pid=17738 uid=10075 gids={1006, 3003}
    I/jdwp (17738): received file descriptor 10 from ADB
    W/System.err(17738): Can't dispatch DDM chunk 46454154: no handler defined
    W/System.err(17738): Can't dispatch DDM chunk 4d505251: no handler defined
    I/WindowManager( 80): Screen status=true, current orientation=-1, SensorEnabled=false
    I/WindowManager( 80): needSensorRunningLp, mCurrentAppOrientation =-1
    I/WindowManager( 80): Enabling listeners
    W/ActivityThread(17738): Application com.ejf.convince.jenplus is waiting for the debugger on port 8100...
    I/System.out(17738): Sending WAIT chunk
    I/dalvikvm(17738): Debugger is active
    I/AlertDialog( 80): [onCreate] auto launch SIP.
    I/WindowManager( 80): onOrientationChanged, rotation changed to 0
    I/System.out(17738): Debugger has connected
    I/System.out(17738): waiting for debugger to settle... (repeated 12 times)
    I/System.out(17738): debugger has settled (1370)
    I/ActivityManager( 80): Displayed activity com.ejf.convince.jenplus/.JenPLUS: 5186 ms
    I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode
    I/AudioHardwareMSM72XX( 2696): AUDIO_START: start kernel pcm_out driver.
    W/AudioFlinger( 2696): write blocked for 96 msecs
    I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5
    I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode
    I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5
    I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode
    I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5
    I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode
    I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5
    W/AuthorDriver( 2696): Intended width(640) exceeds the max allowed width(352). Max width is used instead.
    W/AuthorDriver( 2696): Intended height(480) exceeds the max allowed height(288). Max height is used instead.
    I/AudioHardwareMSM72XX( 2696): AudioHardware pcm playback is going to standby.
    I/DEBUG (16094): *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
    I/DEBUG (16094): Build fingerprint: 'sprint/htc_heroc/heroc/heroc: 1.5/CUPCAKE/85027:user/release-keys'
    I/DEBUG (16094): pid: 17738, tid: 17738 com.ejf.convince.jenplus

    Thanks in advance! -- Don Wilde http://www.ConvinceProject.com
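    For comparison, here is a minimal sketch of the call order the MediaRecorder documentation describes, driven from surfaceCreated() so the preview surface exists before prepare(). The setVideoEncoder() call (absent from the snippet above), the 176x144 size, and the initRecorder()/clip names are assumptions added for illustration, not a confirmed fix.

    // Hedged sketch: wait for the preview surface before touching MediaRecorder.
    surfaceHolder.addCallback(new SurfaceHolder.Callback() {
        public void surfaceCreated(SurfaceHolder holder) {
            initRecorder(holder);   // surface now exists; safe to hand it to the recorder
        }
        public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) { }
        public void surfaceDestroyed(SurfaceHolder holder) { }
    });

    private void initRecorder(SurfaceHolder holder) {
        MediaRecorder rec = new MediaRecorder();
        rec.setAudioSource(MediaRecorder.AudioSource.MIC);
        rec.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        rec.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        rec.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        rec.setVideoEncoder(MediaRecorder.VideoEncoder.H263);  // missing from the snippet above
        rec.setVideoSize(176, 144);        // stays inside the 352x288 cap the AuthorDriver log reports
        rec.setVideoFrameRate(15);
        rec.setOutputFile(path + "/" + clip);
        rec.setPreviewDisplay(holder.getSurface());
        try {
            rec.prepare();                 // only after every setter, and only with a live surface
            rec.start();
        } catch (Exception e) {
            rec.release();
        }
    }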

    Read the article

  • H.264 in Firefox

    - by illuminatedtiger
    Hi all, as I understand it, Firefox does not support H.264-encoded video via the <video> tag. I've been told that Flash will quite happily handle such content, but I have no experience with Flash, nor do I have access to Adobe Creative Suite. I'm developing primarily for Firefox users, and re-encoding our video content to Ogg would not be practical. What are my options?

    Read the article

  • ipad video format [closed]

    - by Mike
    When you use iTunes to sync videos with the iPhone, the videos are always saved at no more than 640 pixels wide, if I am not wrong. What about the iPad? What size are the videos iTunes syncs with the iPad? 1024x768? And what if the video has dimensions below 1024x768? Will it be scaled up, or will it be kept at the lower resolution and scaled at playback time? Thanks.

    Read the article

  • Starting to make progress Was [MediaRecorder prepare() causes segfault]

    - by dwilde1
    Folks, I have a situation where my MediaRecorder instance causes a segfault. I'm working with an HTC Hero, Android 1.5 APIs. I've tried all variations, including 3GPP and H.263 and reducing the video resolution to 320x240. What am I missing? The state machine causes 4 MediaPlayer beeps and then turns on the video camera. Here's the pertinent source.

    UPDATE: Adding surface-creation info. I have rebooted the device, based on a previous answer to a similar question.

    UPDATE 2: I seem to be following the MediaRecorder state machine perfectly, and if I trap out the MR code, the blank surface displays perfectly and everything else functions perfectly. I can record videos manually and play them back via MediaPlayer in my code, so there should be nothing wrong with the underlying code. I've copied sample code for the surface and surfaceHolder setup. I've looked at the MR instance in the Debug perspective in Eclipse and all (known) variables seem to be instantiated correctly. The setter calls are all now implemented in the exact order specified in the state diagram.

    UPDATE 3: I've tried all permission combinations: CAMERA + RECORD_AUDIO + RECORD_VIDEO, CAMERA only, RECORD_AUDIO + RECORD_VIDEO. This is driving me bats! :)))

    UPDATE 4: Starting to work... but with puzzling results. Based on info in bug #5050, I spaced everything out. I have now gotten the recorder to actually save a snippet of video (a whole 2160 bytes!), and I did it by spacing the view visibility, prepare() and start() w.a.a.a.a.a.y out (several hundred milliseconds for each step). I think what happens is that either bringing the surface VISIBLE has delayed processing, or else start() steps on the prepare() operation before it is complete. What is now happening, however, is that my simple timer tick-down counter is getting clobbered. Is it now that the preview and save operations are making my main process thread unavailable? I'm recording only 10 fps at 176x144. Referencing the above code, I've added a timer tick-down after setPreviewDisplay(), prepare() and start(). As I say, it now functions to some degree, but the results still have anomalies.
    Thanks in advance! -- Don Wilde http://www.ConvinceProject.com
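    A minimal sketch of the "space it out" idea from UPDATE 4, using Handler.postDelayed() instead of sleeping, so the tick-down timer on the main thread is not starved. The 500 ms delays and the startCaptureSequence() name are arbitrary assumptions, not measured values.

    private final Handler handler = new Handler();

    private void startCaptureSequence() {
        panelPreview.setVisibility(LinearLayout.VISIBLE);    // 1. show the preview surface
        handler.postDelayed(new Runnable() {
            public void run() {
                try {
                    mRecorder.prepare();                     // 2. prepare once the surface is up
                } catch (Exception e) {
                    mRecorder.release();
                    return;
                }
                handler.postDelayed(new Runnable() {
                    public void run() {
                        mRecorder.start();                   // 3. start only after prepare() settles
                    }
                }, 500);
            }
        }, 500);
    }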

    Read the article

  • OpenCV videos across platform

    - by Gurman Gill
    I am writing a video using OpenCV on a Linux machine. I want to read the same video using OpenCV on a Windows machine. I am not able to do this using the standard codecs provided in OpenCV. Can anybody suggest how I can read/write videos across the two platforms?
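    The post presumably uses the OpenCV C API of that era; the usual workaround is to pick a codec both platforms can decode out of the box (for example MJPG in an .avi container) rather than the platform-default FOURCC. As a hedged illustration only, here is a sketch using the Java bindings shipped with later OpenCV releases; the camera index and frame count are assumptions.

    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.core.Size;
    import org.opencv.videoio.VideoCapture;
    import org.opencv.videoio.VideoWriter;

    public class PortableWriter {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
            // MJPG in an .avi container is readable by stock OpenCV builds on both Linux
            // and Windows, unlike whatever the platform-default FOURCC happens to map to.
            VideoWriter writer = new VideoWriter("out.avi",
                    VideoWriter.fourcc('M', 'J', 'P', 'G'), 25.0, new Size(640, 480));
            VideoCapture cap = new VideoCapture(0);   // assumption: camera at index 0
            Mat frame = new Mat();
            for (int i = 0; i < 250 && cap.read(frame); i++) {
                writer.write(frame);                  // frames must match the declared Size
            }
            writer.release();
            cap.release();
        }
    }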

    Read the article

  • how to use lightbox2 to show video in my page

    - by Mithun Madhav
    Hi, I want to show a popup YouTube video player in my web page. I downloaded Lightbox2 from the following link: http://ftp.drupal.org/files/projects/lightbox2-6.x-1.11.zip and extracted the zip file into my site's root. I added the following lines in my page header:

    <link type="text/css" rel="stylesheet" media="all" href="/lightbox2/css/lightbox.css?1" />
    <script type="text/javascript" src="/lightbox2/js/auto_image_handling.js?1"></script>
    <script type="text/javascript" src="/lightbox2/js/lightbox_video.js?1"></script>
    <script type="text/javascript" src="/lightbox2/js/lightbox.js?1"></script>

    and the following in the body:

    <a href="http://www.youtube.com/watch?v=0gBtF_awV2o" rel="lightvideo[width:500px;height:400px;]">
      <img src="sample" alt="Live TV" />
    </a>

    But the video opens up in a new tab and not as a popup. Where am I going wrong? I can't find tutorials for this anywhere, although I have this kind of code in many other pages with popup videos.

    Read the article

  • Flash video hiding the cursor on Mac OS X

    - by Tibor
    I found a Core Graphics function, CGCursorIsVisible, that one can use to determine whether the mouse cursor is shown on screen. I have a small app that uses a timer to call that function a few times a second to see if the cursor is shown or hidden (as I found no way to get my app notified when the mouse cursor shown/hidden status changes). The above works correctly except… when the Flash browser plugin plays a video: the mouse cursor gets hidden after a few seconds of mouse inactivity but CGCursorIsVisible() still keeps telling me that the cursor is indeed shown as if nothing happened. What I need is to be able to tell, at any given moment, whether the mouse cursor is shown to the end user on any screen. Is there any way to do that even in the case of playing a video using the Flash plugin?

    Read the article

  • Error Playing video from server with MPMoviePlayerController

    - by skiria
    I am trying to play a video from the server with this code, and I get an error.

    // Play the video from the server
    - (IBAction)playVideo:(id)sender
    {
        NSLog(@"URLVIDEO %@", aVideo.urlVideo);
        MPMoviePlayerController *VideoPlayer = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:aVideo.urlVideo]];
        [VideoPlayer play];
    }

    // Error on the console:
    Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Content URL must not be nil.'

    aVideo.urlVideo is an NSString, and it is correct: when I copy the value printed by the NSLog into the browser, it works.

    Read the article

  • How to track my flash video player embed on other sites

    - by Dragomir Dan
    Setup:
    - an online TV channel with YouTube-like clips and categories
    - our own Flash video player, which can be embedded into other remote sites
    - AS2 Flash player

    Goal: to track who's embedding my videos, at least with basic statistics per domain. Since it's AS2, this is harder to do. My idea is to create a PHP page which is requested each time the player loads on any website; the Flash player can do a getURL() to the PHP file, which has Google Analytics code or some other decent tracker. The getURL() call could pass a variable, like the video title (already available in the player), with GET to the PHP file and set a dynamic page title which can be tracked very well.

    Problem: how do I use the getURL() function without having the user's browser open a new tab or window? Is there any hidden way to do it?

    Read the article

  • Downloading YouTube videos?

    - by lalit
    Hi, I want to download YouTube videos programmatically (using Java). How can I convert a YouTube video link to a downloadable URL? The browser plays YouTube videos through the embedded player. I tried downloading from the URL http://www.youtube.com/v/OdAE3cWlmHw but it does not return video bytes. Thanks, Lalit
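    As far as I know, the /v/... link serves the player SWF rather than the media stream, which is why it returns no video bytes. For the plain download mechanics the question is attempting, here is a hedged Java sketch that assumes a direct media URL is available; the example URL and file names are placeholders.

    import java.io.FileOutputStream;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class Downloader {
        public static void main(String[] args) throws Exception {
            // Placeholder: must be a URL that actually serves the media bytes.
            URL url = new URL("http://example.com/some-video.mp4");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setInstanceFollowRedirects(true);
            try (InputStream in = conn.getInputStream();
                 FileOutputStream out = new FileOutputStream("video.mp4")) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);   // stream the response body to disk
                }
            }
        }
    }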

    Read the article

  • Android MediaStore insertVideo

    - by Robert
    So our app has the option to take either a picture or a video. If the user takes a picture, we can use the MediaStore.Images.Media.insertImage function to add the new image (via a file path) to the phone's gallery and generate a content:// style URI. Is there a similar process for a captured video, given that we only have its file path?
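    There is no insertVideo() convenience method in the framework, but inserting a row through the ContentResolver is the usual equivalent. A hedged sketch from inside an Activity; the filePath variable is assumed, and column behaviour on very old API levels is not verified here.

    // Register an already-captured video file with the MediaStore and get a content:// URI back.
    ContentValues values = new ContentValues();
    values.put(MediaStore.Video.Media.TITLE, "My video");
    values.put(MediaStore.Video.Media.MIME_TYPE, "video/mp4");
    values.put(MediaStore.Video.Media.DATA, filePath);   // absolute path to the captured file

    Uri uri = getContentResolver().insert(
            MediaStore.Video.Media.EXTERNAL_CONTENT_URI, values);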

    Read the article

  • FFMPEG based Theora Video Decoder performance??

    - by goldenmean
    Hi, I am in the process of porting and optimizing the Theora video decoder in the ffmpeg-0.5 package for an ARM Cortex-A8 (NEON) processor at 667 MHz. I am looking for a target estimate of the frames per second the decoder library alone should achieve after full optimization (C level and NEON assembly/intrinsics) for 720x480 progressive content in a 2 Mbps stream. I have a RealVideo 9 decoder on the Cortex-A8 which gives around 40 fps for the same stream (720x480, 2 Mbps). How can I extrapolate this data, based on the relative complexities of RV9 and Theora, to get an fps estimate for a Theora decoder on the Cortex-A8? I am aware that performance depends upon the cache configuration of the hardware, etc., but any pointers will help. Thanks, -AD
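    A first-order way to extrapolate, ignoring the cache and memory-bandwidth effects the post already flags, is to scale the measured RV9 figure by the ratio of per-frame decode costs:

    \[ \mathrm{fps}_{\mathrm{Theora}} \approx \mathrm{fps}_{\mathrm{RV9}} \times \frac{C_{\mathrm{RV9}}}{C_{\mathrm{Theora}}} \]

    For example, if Theora decode were roughly 1.5 times the per-frame cost of RV9 at the same resolution and bitrate (an illustrative assumption, not a measured ratio), the estimate would be about 40 / 1.5, i.e. roughly 27 fps.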

    Read the article

  • Cocoa touch SDK 3.2 - How to play video

    - by teepusink
    Hi, how do I play video on SDK 3.2 (iPad)? I have read many questions here, but they mostly cover the iPhone. For example, the MoviePlayer example here: http://developer.apple.com/iphone/library/samplecode/MoviePlayer_iPhone/Introduction/Intro.html works on 3.1.3, but when I run it on 3.2 it doesn't work. So basically I'm able to play a video on 3.1.3 using this code, but the same code won't run on 3.2:

    NSString *moviePath = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"Movie.mp4"];
    MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL fileURLWithPath:moviePath]];
    moviePlayer.movieControlMode = MPMovieControlModeDefault;
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(movieFinishedCallback:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:moviePlayer];
    [moviePlayer play];

    Thanks, Tee

    Read the article

  • Android App Presentation

    - by Techeretic
    I wanted to make a video presentation of my Android application. I know I can make the presentation by holding a camcorder in front of the screen and giving a walkthrough of the application, but is there any other way this can be done, something along the lines of JingProject for Windows, where you can record your on-screen activity as a video? Is there a tool that can help me achieve the same on my Android device?

    Read the article

  • Video/Audio frame as input to OpenCore

    - by Vinay
    I am not able to use MediaPlayer/VideoView to make RTSP work in Android, so I have created a client to interact with the RTSP server, and I have succeeded in doing this: I am able to get the video/audio frames from the RTSP server (MySpace) in Android. Now I want to play the frames. I have searched the OpenCore APIs for a way to play the frames, but didn't find any. My investigation: there is a class, PlayerDriver.c, that creates two sinks, one audio and one video, in handleSetVideoSurface and handleSetAudioSink; two objects of type PVPlayerDataSinkPVMFNode are created. I suspect this class has a way to take the stream as input, but I am not finding the definition of this class. Can you suggest which class I need to look into?

    Read the article

  • More complex view matrix calculation required to composite 3d models with 2d video

    - by lzcd
    I'm utilising some 2D/3D tracking data (provided by pfHoe) to help integrate some 3D models into the playback of 2D video. Things are working... okay... but there's still some visible 'slipping' of the models against the video background, and I suspect this may be because the XNA CreatePerspective helper method isn't taking into account some of the additional data supplied by pfHoe, such as independent horizontal/vertical field-of-view angles and focal length. Would anyone be able to point me towards some examples of constructing view matrices that include such details?

    Read the article

  • Android: Displaying Video on VideoView

    - by AndroidDev93
    I'm trying to display a video from my SD card in a VideoView. Here is my code:

    String name = Environment.getExternalStorageDirectory() + "/test.mp4";
    final VideoView videoView = (VideoView) findViewById(R.id.videoView1);
    videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        public void onPrepared(MediaPlayer arg0) {
            videoView.start();
        }
    });
    videoView.setVideoPath(name);

    The file I am trying to open is called test.mp4 and it's located in the sdcard folder. I get an error saying the application has unfortunately stopped. I would appreciate it if someone could help me. Thanks.

    EDIT: I used the debugger and found out that I get an InvocationTargetException. The detailed message says: Failure delivering result ResultInfo{who=null, request=1001, result=-1, data=Intent { dat=file:///mnt/sdcard/test.mp4 }} to activity: java.lang.NullPointerException

    EDIT: I looked at the logcat again and it seems to give the error at videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {. I'm guessing either videoView or MediaPlayer is null.
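    To help localize the NullPointerException, here is a hedged variant of the same flow that checks the findViewById() result and attaches an error listener; the "VideoTest" log tag is an arbitrary assumption.

    String name = Environment.getExternalStorageDirectory() + "/test.mp4";
    final VideoView videoView = (VideoView) findViewById(R.id.videoView1);
    if (videoView == null) {
        // R.id.videoView1 is not in the layout passed to setContentView(), which would
        // throw a NullPointerException on the setOnPreparedListener() line.
        Log.e("VideoTest", "videoView1 not found in the current layout");
        return;
    }
    videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
        public boolean onError(MediaPlayer mp, int what, int extra) {
            Log.e("VideoTest", "playback error: what=" + what + " extra=" + extra);
            return true;   // handled; suppress the default error dialog
        }
    });
    videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        public void onPrepared(MediaPlayer mp) {
            videoView.start();
        }
    });
    videoView.setVideoPath(name);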

    Read the article
