Search Results

Search found 10283 results on 412 pages for 'video playback'.


  • iPhone HTTP Live Streaming not working on models below 3GS

    - by dreamer
    We are using HTTP Live Streaming for on-demand video from within our iPhone app, and on 3GS models the videos play as they are meant to. However, on pre-3GS models it gives an error saying the movie format is not supported. I have seen other threads on this, but no solutions or insights. Does anyone know if this really is a hardware limitation of the pre-3GS phones, or does it have something to do with our code?

    Read the article

  • SDL Video Init causes Exception on Mac OS X 10.8

    - by ScrollerBlaster
    I have just ported my C++ game to OS X, and the first time it ran I got the following exception when calling SDL_SetVideoMode:

        2012-09-28 15:01:05.437 SCRAsteroids[28595:707] * Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Error (1000) creating CGSWindow on line 259'
        * First throw call stack:
        (
          0   CoreFoundation       0x00007fff8b53b716 __exceptionPreprocess + 198
          1   libobjc.A.dylib      0x00007fff90e30470 objc_exception_throw + 43
          2   CoreFoundation       0x00007fff8b53b4ec +[NSException raise:format:] + 204
          3   AppKit               0x00007fff8a26a579 _NSCreateWindowWithOpaqueShape2 + 655
          4   AppKit               0x00007fff8a268d70 -[NSWindow _commonAwake] + 2002
          5   AppKit               0x00007fff8a2277e2 -[NSWindow _commonInitFrame:styleMask:backing:defer:] + 1763
          6   AppKit               0x00007fff8a22692f -[NSWindow _initContent:styleMask:backing:defer:contentView:] + 1568
          7   AppKit               0x00007fff8a2262ff -[NSWindow initWithContentRect:styleMask:backing:defer:] + 45
          8   libSDL-1.2.0.dylib   0x0000000107c228f6 -[SDL_QuartzWindow initWithContentRect:styleMask:backing:defer:] + 294
          9   libSDL-1.2.0.dylib   0x0000000107c20505 QZ_SetVideoMode + 2837
          10  libSDL-1.2.0.dylib   0x0000000107c17af5 SDL_SetVideoMode + 917
          11  SCRAsteroids         0x0000000107be60fb _ZN11SDLGraphics4initEP6IWorldii + 291
        )
        libc++abi.dylib: terminate called throwing an exception
        Abort trap: 6

    My init code looks like this:

        if (SDL_Init(SDL_INIT_EVERYTHING) < 0)
            return false;

        const SDL_VideoInfo *videoInfo = SDL_GetVideoInfo();
        if (!videoInfo) {
            fprintf(stderr, "Video query failed: %s\n", SDL_GetError());
            return false;
        }

        /* the flags to pass to SDL_SetVideoMode */
        videoFlags  = SDL_OPENGL;          /* Enable OpenGL in SDL */
        videoFlags |= SDL_GL_DOUBLEBUFFER; /* Enable double buffering */
        videoFlags |= SDL_HWPALETTE;       /* Store the palette in hardware */

        /* This checks to see if surfaces can be stored in memory */
        if (videoInfo->hw_available)
            videoFlags |= SDL_HWSURFACE;
        else
            videoFlags |= SDL_SWSURFACE;

        if (w == 0) {
            widthViewport  = videoInfo->current_w;
            heightViewport = videoInfo->current_h;
            cout << "Will use full screen resolution of ";
            videoFlags |= SDL_FULLSCREEN;
        } else {
            cout << "Will use full user supplied resolution of ";
            widthViewport = w;
            heightViewport = h;
            videoFlags |= SDL_RESIZABLE; /* Enable window resizing */
        }
        cout << widthViewport << "x" << heightViewport << "\n";

        /* This checks if hardware blits can be done */
        if (videoInfo->blit_hw)
            videoFlags |= SDL_HWACCEL;

        /* Sets up OpenGL double buffering */
        SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

        /* get a SDL surface */
        surface = SDL_SetVideoMode(widthViewport, heightViewport, SCREEN_BPP, videoFlags);

    It gets into that last SDL call and throws the exception above. I have tried it in both full-screen and resizable-window mode, same thing. I build my app old school, on the command line, as opposed to using Xcode.

    Read the article

  • addChild not displaying content

    - by Rajeev
    In the following code I don't get any errors, but the addChild(video) call doesn't display anything; that is, the video captured by the webcam is not shown.

        <?xml version="1.0" encoding="utf-8"?>
        <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute">
            <mx:Script>
                <![CDATA[
                    import org.com.figurew;
                    import mx.controls.Button;
                    import mx.controls.Alert;
                    import flash.display.InteractiveObject;
                    import flash.display.Sprite;
                    import flash.media.*;
                    import flash.net.*;

                    public function addBody():void
                    {
                        var ret:Number = figurew.getInstance().getparam();
                        if (ret == 1)
                        {
                            Alert.show("Camera detected");
                        }
                        if (ret == 0)
                        {
                            Alert.show("No camera detected");
                        }

                        var cam:Camera = Camera.getCamera();
                        if (cam != null)
                        {
                            cam.setMode(640, 480, 30);
                            var video:Video = new Video(30, 40);
                            video.attachCamera(cam);
                            addChild(video);
                        }
                        else
                        {
                            trace("No Camera Detected");
                        }
                    }
                ]]>
            </mx:Script>
            <mx:Button label="Test camera" click="addBody();" x="99" y="116"/>
        </mx:Application>

    figurew.as:

        package org.com
        {
            import flash.display.InteractiveObject;
            import flash.display.Sprite;
            import flash.media.*;
            import flash.net.*;

            public class figurew extends Sprite
            {
                public function figurew()
                {
                    //getparam();
                    var cam:Camera = Camera.getCamera();
                    if (cam != null)
                    {
                        cam.setMode(640, 480, 30);
                        var video:Video = new Video(300, 450);
                        video.attachCamera(cam);
                        addChild(video);
                    }
                    else
                    {
                        trace("No Camera Detected");
                    }
                }

                public function getparam():Number
                {
                    var cam:Camera = Camera.getCamera();
                    if (cam != null)
                    {
                        cam.setMode(640, 480, 30);
                        var video:Video = new Video(300, 450);
                        video.attachCamera(cam);
                        addChild(video);
                        return 1;
                    }
                    else
                    {
                        trace("No Camera Detected");
                        return 0;
                    }
                }

                private static var _instance:figurew = null;

                public static function getInstance():figurew
                {
                    if (_instance == null)
                    {
                        trace("No instance found");
                        _instance = new figurew();
                    }
                    return _instance;
                }
            }
        }

    Read the article

  • Does the size of the monitor matter?

    - by Arsheep
    I have an old computer and I want to buy a big LCD. The best I can find is ViewSonic's 24" LCD TFT monitor. Will it run without any problems, or do I need to upgrade the video card or something too? The computer is not that old: it has a P4 board and a Celeron processor with 128 MB of graphics memory, and in the display properties it shows a maximum usable resolution of 1280 x 1024. I am a noob hardware-wise, so I need help with this. Thanks

    Read the article

  • HTML tag to select the Flash Player plugin to play FLV

    - by Microkernel
    Hi all, I have .flv files on my server at a location like http://www.example.com/archive/test.flv. In an HTML page, how can I tell the browser to use Flash Player to play this video, the way YouTube videos are played? Can someone tell me how to do this? Regards, Microkernel PS: I am noober than noob in web dev, so please give a code snippet or an answer that beginners like me can understand. Thank you
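    A minimal sketch of the usual pattern for this: embed a Flash-based FLV player SWF and hand it the video URL. Here "player.swf" and the "file" flashvar are placeholders; the real SWF name and its flashvars depend on which player (JW Player, Flowplayer, etc.) is actually used.

        // Inject an <object> embed for a Flash FLV player into a container element.
        function embedFlv(containerId, flvUrl) {
            var playerUrl = "player.swf";                          // hypothetical player SWF
            var flashvars = "file=" + encodeURIComponent(flvUrl);  // flashvar name depends on the player

            document.getElementById(containerId).innerHTML =
                '<object width="640" height="360" type="application/x-shockwave-flash" data="' + playerUrl + '">' +
                '<param name="movie" value="' + playerUrl + '"/>' +
                '<param name="allowFullScreen" value="true"/>' +
                '<param name="flashvars" value="' + flashvars + '"/>' +
                'Flash Player is required to play this video.' +
                '</object>';
        }

        embedFlv("video-box", "http://www.example.com/archive/test.flv");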

    Read the article

  • MPMoviePlayerContentPreloadDidFinishNotification seems more reliable than MPMoviePlayerLoadStateDidChangeNotification

    - by user567889
    I am streaming small movies (1-3 MB) off my website into my app. I have a Slicehost web server; I think it's a "500MB slice". I'm not sure off the top of my head how this translates to bandwidth, but I can figure that out later. My experience with MPMoviePlayerLoadStateDidChangeNotification is not very good; I get much more reliable results with the old MPMoviePlayerContentPreloadDidFinishNotification. If I get a MPMoviePlayerContentPreloadDidFinishNotification, the movie will play without stuttering, but if I use MPMoviePlayerLoadStateDidChangeNotification, the movie frequently stalls. I'm not sure which load state to check for:

        enum {
            MPMovieLoadStateUnknown       = 0,
            MPMovieLoadStatePlayable      = 1 << 0,
            MPMovieLoadStatePlaythroughOK = 1 << 1,
            MPMovieLoadStateStalled       = 1 << 2,
        };

    MPMovieLoadStatePlaythroughOK seems to be what I want, based on its description in the documentation ("Enough data has been buffered for playback to continue uninterrupted. Available in iOS 3.2 and later."), but the load state NEVER gets set to this in my app. Am I missing something? Is there a better way to do this?

    Read the article

  • How can I delete a video stored in the photo library?

    - by srikanth rongali
    I have saved a video to the photo library:

        - (void)exportVideo:(id)sender
        {
            NSString *path = [DOCUMENTS_FOLDER stringByAppendingString:@"/air.mp4"];
            NSLog(@"Path:%@", path);
            NSLog(@"Export Button Clicked");
            UISaveVideoAtPathToSavedPhotosAlbum(path, self,
                @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }

        - (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
        {
            NSLog(@"Finished saving video with error: %@", error);
        }

    Now I need to delete the video I have stored, programmatically. How can I delete it? Are there any functions for this? Thank you.

    Read the article

  • DVB-T tune request parameters

    - by dkb
    I'm working in GraphEdit to capture video from a Pinnacle PCTV card, in preparation for writing a program that will do the same, among other things. What, precisely, is the meaning of the tune request parameters? How do I find the appropriate values to use, and is there a way to do so automatically (i.e., without having to change the parameters manually in the final program when trying to watch DVB in different countries)?

    Read the article

  • HTML5 playlist plays 2 videos; how about 4 or 5?

    - by amber
    I'm able to get two videos to play sequentially (and without a pause!) with this code from Apple (see section 2-4): http://developer.apple.com/safari/library/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/ControllingMediaWithJavaScript/ControllingMediaWithJavaScript.html Yet I'm completely lost as to how to play a 3rd or 5th video. Trouble is, I'm a JavaScript noob :-(, so if you figure this out please share as much of your code as possible. Thanks much!
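    A minimal sketch, not taken from Apple's sample, of one way to extend the same idea to any number of clips: keep the URLs in an array and swap the src of a single <video> element each time its "ended" event fires. The element id "player" and the file names are placeholders.

        // Play an arbitrary list of videos back to back in one <video id="player"> element.
        var playlist = ["clip1.mp4", "clip2.mp4", "clip3.mp4", "clip4.mp4", "clip5.mp4"];
        var index = 0;
        var player = document.getElementById("player");

        player.addEventListener("ended", function () {
            index++;
            if (index < playlist.length) {
                player.src = playlist[index];   // advance to the next clip
                player.play();
            }
        }, false);

        // start the first clip
        player.src = playlist[index];
        player.play();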

    Read the article

  • Does Channel 9 have subtitles? [closed]

    - by jciwolf
    Hello everyone. MSDN Channel 9 has a lot of excellent videos about WCF, Silverlight and so on, but what lets me down is that I am not very good at English, and the speakers sometimes talk quickly, so I can't understand what they say. Where can I find subtitles?

    Read the article

  • Tool to find out distance in terms of number of hops in Unix

    - by mawia
    Hi all, I am writing an application for video streaming. In the application, the server needs to know the distance of the client from itself in terms of hop count. My question is: is there any tool/method other than traceroute available in a Unix environment to find this? I also need to find out the geographical location of the client, so is there any tool/method for that as well? Any help in this regard will be highly appreciated. Thanks in advance. Mawia
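    One technique that avoids an active traceroute is a passive estimate from the TTL of packets the client already sends: most operating systems start the TTL at 64, 128, or 255, so the hop count is roughly the nearest common initial value at or above the observed TTL, minus the observed TTL. A minimal sketch of that arithmetic, assuming the observed TTL has already been obtained from a packet capture (not shown) and that the sender really used one of the common initial values:

        // Passive hop-count estimate from an observed IP TTL.
        function estimateHops(observedTtl) {
            var commonInitialTtls = [64, 128, 255];   // typical starting TTLs
            for (var i = 0; i < commonInitialTtls.length; i++) {
                if (observedTtl <= commonInitialTtls[i]) {
                    return commonInitialTtls[i] - observedTtl;
                }
            }
            return null;                              // TTLs above 255 are not valid
        }

        console.log(estimateHops(52));    // 12 hops if the client started at 64
        console.log(estimateHops(119));   // 9 hops if the client started at 128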

    Read the article

  • html5media library doesn't work on FF 3.6.3

    - by Alex
    Hi. Am I the only one experiencing this issue? I'm using the html5media library, and the test page they provide no longer plays in Firefox 3.6.3, though it plays in the latest Safari, Chrome, Opera, and IE. In FF 3.6.3 it shows the video and the audio with a large X through them. I'm using this library on my site and noticed the issue there as well. I'm not seeing any errors in the error console.
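    For what it's worth, Firefox 3.6 only plays Ogg Theora/Vorbis natively (no MP4/H.264), so independently of whatever the library's Flash fallback is doing, it is worth checking that an Ogg source is offered alongside the MP4. A minimal sketch, with placeholder file names, of building a <video> element that gives Firefox 3.6 something it can decode:

        // Offer both Ogg and MP4 sources so browsers with either decoder can play natively.
        var video = document.createElement("video");
        video.controls = true;
        video.width = 640;

        var oggSource = document.createElement("source");
        oggSource.src = "movie.ogv";      // placeholder
        oggSource.type = "video/ogg";

        var mp4Source = document.createElement("source");
        mp4Source.src = "movie.mp4";      // placeholder
        mp4Source.type = "video/mp4";

        video.appendChild(oggSource);     // listed first so Theora-only browsers pick it
        video.appendChild(mp4Source);
        document.body.appendChild(video);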

    Read the article

  • Android: transparent videos

    - by Danail
    Hi guys, I was wondering if it is possible to play a transparent video (the background should be something else, like a picture, a view, or the camera preview). Any idea if this is possible and how it's done? Should I use OpenGL or something else? Thanks in advance, Danail

    Read the article

  • Threads to make video out of images

    - by masood
    Update: I suspect that ImageIO is not thread safe: it is shared by all threads, and the read() call might use resources that are also shared. That would give the performance of a single thread no matter how many threads are used. Is this correct? If so, what is the solution (in practical code)?

    A single request/response at a time does not utilize the full network/internet bandwidth, resulting in low performance (my benchmark shows half the available bandwidth utilized, or even lower). The goal is to make a video out of an IP camera that returns a new image on each request: http://149.5.43.10:8001/snapshot.jpg

    No matter what I do there is a delay of 3-8 seconds. I have changed the number of threads and the thread sleep intervals, and debugged the code with System.out.println statements to confirm that the threads run. All seems normal. Any help? Please show some practical code; you may modify mine. The following JavaScript code works with a much smoother frame rate and maximum bandwidth usage:

        <!DOCTYPE html>
        <html>
        <head>
        <script type="text/javascript">
        (function(){
            var img = "/*url*/";
            var interval = 50;
            var pointer = 0;

            function showImg(image, idx)
            {
                if (idx <= pointer)
                    return;
                document.body.replaceChild(image, document.getElementsByTagName("img")[0]);
                pointer = idx;
                preload();
            }

            function preload()
            {
                var cache = null, idx = 0;
                for (var i = 0; i < 5; i++)
                {
                    idx = Date.now() + interval * (i + 1);
                    cache = new Image();
                    cache.onload = (function(ele, idx){ return function(){ showImg(ele, idx); }; })(cache, idx);
                    cache.src = img + "?" + idx;
                }
            }

            window.onload = function(){
                document.getElementsByTagName("img")[0].onload = preload;
                document.getElementsByTagName("img")[0].src = "/*initial url*/";
            };
        })();
        </script>
        </head>
        <body>
        <img />
        </body>
        </html>

    The following Java code doesn't: it has the same 3-8 second gap.

        package camba;

        import java.applet.Applet;
        import java.awt.Button;
        import java.awt.Graphics;
        import java.awt.Image;
        import java.awt.Label;
        import java.awt.TextField;
        import java.awt.event.ActionEvent;
        import java.awt.event.ActionListener;
        import java.net.URL;
        import java.util.concurrent.TimeUnit;
        import javax.imageio.ImageIO;

        public class Camba extends Applet implements ActionListener {

            Image img;
            TextField textField;
            Label label;
            Button start, stop;
            boolean terminate = false;
            long viewTime;

            public void init() {
                label = new Label("please enter camera URL ");
                add(label);
                textField = new TextField(30);
                add(textField);
                start = new Button("Start");
                add(start);
                start.addActionListener(this);
                stop = new Button("Stop");
                add(stop);
                stop.addActionListener(this);
            }

            public void actionPerformed(ActionEvent e) {
                Button source = (Button) e.getSource();
                if (source.getLabel().equals("Start")) {
                    for (int i = 0; i < 7; i++) {
                        myThread(50 * i);
                    }
                    System.out.println("start...");
                }
                if (source.getLabel().equals("Stop")) {
                    terminate = true;
                    System.out.println("stop...");
                }
            }

            public void paint(Graphics g) {
                update(g);
            }

            public void update(Graphics g) {
                try {
                    viewTime = System.currentTimeMillis();
                    g.drawImage(img, 100, 100, this);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }

            public void myThread(final int sleepTime) {
                new Thread(new Runnable() {
                    public void run() {
                        while (!terminate) {
                            try {
                                TimeUnit.MILLISECONDS.sleep(sleepTime);
                            } catch (InterruptedException ex) {
                                ex.printStackTrace();
                            }
                            long requestTime = 0;
                            Image tempImage = null;
                            try {
                                requestTime = System.currentTimeMillis();
                                URL pic = new URL(getDocumentBase(), textField.getText());
                                tempImage = ImageIO.read(pic);
                            } catch (Exception e) {
                                e.printStackTrace();
                            }
                            if (requestTime >= /* last view time */ viewTime) {
                                img = tempImage;
                                Camba.this.repaint();
                            }
                        }
                    }
                }).start();
                System.out.println("thread started...");
            }
        }

    Read the article

  • ASUS N550JV audio problem: no sound from notebook speakers

    - by skywalker
    Ubuntu 13.10. The problem: the internal speakers don't work. I have no problem when I'm using headphones. There is no hardware issue, since in Windows 8 everything works perfectly (external subwoofer included). I'm trying to modify /etc/modprobe.d/alsa-base.conf but I can't find the correct model to put into:

        options snd-hda-intel model=

    The file HD-Audio-Models.txt doesn't contain a model for the ALC668. Some info:

        :~$ sudo aplay -l
        **** List of PLAYBACK Hardware Devices ****
        card 0: MID [HDA Intel MID], device 3: HDMI 0 [HDMI 0]
          Subdevices: 1/1
          Subdevice #0: subdevice #0
        card 0: MID [HDA Intel MID], device 7: HDMI 1 [HDMI 1]
          Subdevices: 1/1
          Subdevice #0: subdevice #0
        card 0: MID [HDA Intel MID], device 8: HDMI 2 [HDMI 2]
          Subdevices: 1/1
          Subdevice #0: subdevice #0
        card 1: PCH [HDA Intel PCH], device 0: ALC668 Analog [ALC668 Analog]
          Subdevices: 0/1
          Subdevice #0: subdevice #0

        :~$ sudo lspci -v | grep -A7 -i "audio"
        00:03.0 Audio device: Intel Corporation Xeon E3-1200 v3/4th Gen Core Processor HD Audio Controller (rev 06)
            Subsystem: Intel Corporation Device 2010
            Flags: bus master, fast devsel, latency 0, IRQ 52
            Memory at f7a14000 (64-bit, non-prefetchable) [size=16K]
            Capabilities: [50] Power Management version 2
            Capabilities: [60] MSI: Enable+ Count=1/1 Maskable- 64bit-
            Capabilities: [70] Express Root Complex Integrated Endpoint, MSI 00
            Kernel driver in use: snd_hda_intel
        --
        00:1b.0 Audio device: Intel Corporation 8 Series/C220 Series Chipset High Definition Audio Controller (rev 04)
            Subsystem: ASUSTeK Computer Inc. Device 11cd
            Flags: bus master, fast devsel, latency 0, IRQ 53
            Memory at f7a10000 (64-bit, non-prefetchable) [size=16K]
            Capabilities: [50] Power Management version 2
            Capabilities: [60] MSI: Enable+ Count=1/1 Maskable- 64bit+
            Capabilities: [70] Express Root Complex Integrated Endpoint, MSI 00
            Capabilities: [100] Virtual Channel

    PS info:

        :~$ amixer -c 0
        Simple mixer control 'IEC958',0
          Capabilities: pswitch pswitch-joined
          Playback channels: Mono
          Mono: Playback [on]
        Simple mixer control 'IEC958',1
          Capabilities: pswitch pswitch-joined
          Playback channels: Mono
          Mono: Playback [on]
        Simple mixer control 'IEC958',2
          Capabilities: pswitch pswitch-joined
          Playback channels: Mono
          Mono: Playback [on]

        :~$ pacmd dump-volumes
        Welcome to PulseAudio! Use "help" for usage information.
        Sink 0: reference = 0: 76% 1: 76%, real = 0: 76% 1: 76%, soft = 0: 100% 1: 100%, current_hw = 0: 76% 1: 76%, save = yes
        Input 8: volume = 0: 100% 1: 100%, reference_ratio = 0: 100% 1: 100%, real_ratio = 0: 100% 1: 100%, soft = 0: 100% 1: 100%, volume_factor = 0: 100% 1: 100%, volume_factor_sink = 0: 100% 1: 100%, save = no
        Source 0: reference = 0: 100% 1: 100%, real = 0: 100% 1: 100%, soft = 0: 100% 1: 100%, current_hw = 0: 100% 1: 100%, save = no
        Source 1: reference = 0: 16% 1: 16%, real = 0: 16% 1: 16%, soft = 0: 100% 1: 100%, current_hw = 0: 16% 1: 16%, save = yes

    Read the article

  • How do I capture and play back HTTP web requests against multiple web servers?

    - by KevM
    My overall goal is to capture HTTP POSTs to a web application, without interrupting a production system, so that I can reverse-engineer the telemetry coming from a closed application. I have control over the transmitter of the HTTP POSTs but not over the receiving web application. It seems like I need a request-"forking" proxy: sort of a reverse proxy that pushes each request to two endpoints, a master and a slave, relaying only the response from the master endpoint back to the requester. I am not a server geek, so something like this may exist but I don't know the term of art for what I am looking for. Another possibility could be a simple logging proxy: capture a log of the web requests, rewrite the log to target my "slave" web application, and play the log back with curl or something. Thank you for your assistance.
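    A minimal Node.js sketch of the "forking" reverse proxy described above, assuming placeholder host names and ports: every incoming request is relayed to both a master and a slave backend, and only the master's response is returned to the caller.

        // Forking reverse proxy sketch: duplicate each request to master and slave,
        // relay only the master's response. Hosts and ports are placeholders.
        var http = require("http");

        var MASTER = { host: "master.example.com", port: 80 };
        var SLAVE  = { host: "slave.example.com",  port: 8080 };

        function forward(target, req, body, onResponse) {
            var options = {
                host: target.host,
                port: target.port,
                method: req.method,
                path: req.url,
                headers: req.headers            // passed through unchanged, including Host
            };
            var proxied = http.request(options, onResponse || function (res) { res.resume(); });
            proxied.on("error", function (err) { console.error(target.host, err.message); });
            proxied.end(body);
        }

        http.createServer(function (req, res) {
            var chunks = [];
            req.on("data", function (c) { chunks.push(c); });
            req.on("end", function () {
                var body = Buffer.concat(chunks);

                // fire-and-forget copy to the slave
                forward(SLAVE, req, body);

                // the master's response is relayed back to the original client
                forward(MASTER, req, body, function (masterRes) {
                    res.writeHead(masterRes.statusCode, masterRes.headers);
                    masterRes.pipe(res);
                });
            });
        }).listen(3000);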

    Read the article
