Search Results

Search found 1636 results on 66 pages for 'streaming'.

Page 28 of 66

  • How to play video from a URL

    - by priyanka
    I am a beginner in Android development and am trying to play a video from a link, but it gives the error "Sorry, we can't play this video". I tried many links, and it shows the same error for all of them. My code is the following:

        public class VideoDemo extends Activity {
            private static final String path = "http://demo.digi-corp.com/S2LWebservice/Resources/SampleVideo.mp4";
            private VideoView video;
            private MediaController ctlr;

            @Override
            public void onCreate(Bundle icicle) {
                super.onCreate(icicle);
                getWindow().setFormat(PixelFormat.TRANSLUCENT);
                setContentView(R.layout.videoview);
                video = (VideoView) findViewById(R.id.video);
                video.setVideoPath(path);
                ctlr = new MediaController(this);
                ctlr.setMediaPlayer(video);
                video.setMediaController(ctlr);
                video.requestFocus();
            }
        }

    Thanks in advance.
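
    As a quick illustration (not from the original question), here is a minimal sketch of the same Activity using setVideoURI() together with prepared/error listeners, which surfaces the actual MediaPlayer error code instead of the generic dialog. The URL and layout IDs are the asker's; everything else is an assumption, and the manifest also needs the android.permission.INTERNET permission.

        import android.app.Activity;
        import android.media.MediaPlayer;
        import android.net.Uri;
        import android.os.Bundle;
        import android.widget.MediaController;
        import android.widget.VideoView;

        public class VideoDemo extends Activity {
            // The asker's sample URL; the stream must use a container/codec the device supports.
            private static final String PATH = "http://demo.digi-corp.com/S2LWebservice/Resources/SampleVideo.mp4";

            @Override
            public void onCreate(Bundle icicle) {
                super.onCreate(icicle);
                setContentView(R.layout.videoview);              // same layout as in the question
                final VideoView video = (VideoView) findViewById(R.id.video);
                video.setMediaController(new MediaController(this));
                video.setVideoURI(Uri.parse(PATH));              // setVideoURI instead of setVideoPath
                video.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                    public void onPrepared(MediaPlayer mp) {
                        video.start();                           // start only once the stream is prepared
                    }
                });
                video.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                    public boolean onError(MediaPlayer mp, int what, int extra) {
                        // Log the real error code instead of the generic "can't play this video" dialog.
                        android.util.Log.e("VideoDemo", "error what=" + what + " extra=" + extra);
                        return false;
                    }
                });
                video.requestFocus();
            }
        }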

    Read the article

  • How do I continuously update data on an ASP page?

    - by Lori
    Hi, I have an ASP page based on a very simple database. It references a single table of probably 30 records and maybe 12 fields, and everything works great as I am only uploading a new database every week or so. I have a special circumstance where I would like to upload new data to the database and have it display automatically on the page every 20 to 30 seconds, without the user having to refresh their screen. I would expect up to 1000 concurrent users accessing the data. I have been manually uploading the database via FTP, which will obviously not work on this timeline and would also run the risk of error pages while the database is being replaced. So, can anyone point me in the right direction to set up this scenario? Other details that might be helpful: the database is an Access database (but I could change to another format if needed), running on a Windows platform hosted by an ISP, not my own server. Thanks in advance for any help on this! Lori

    Read the article

  • Android RTSP coding problem

    - by NetApex
    I have Googled my butt off trying to find whether there is a surefire way to make RTSP work. I have a radio station that I listen to that streams via RTSP. Of course, by default Android doesn't want to play it. If I pop the URL into yourmuze.fm and create a station there, it lets me stream it to my phone. After checking how it works, I found that it streams to the phone via RTSP! So obviously there is something amiss. What makes one stream work and the other not? This is the stream I am attempting: rtsp://wms2.christiannetcast.com/yes-fm It is an audio stream, so I would be thrilled to have most people's problem of "it only does audio and not video." When yourmuze.fm streams, DDMS states it brings up MovieView to play the audio, if that helps at all.
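
    For reference, a minimal sketch (not from the original post) of handing an RTSP URL straight to MediaPlayer rather than launching it via an Intent, so the error callback reports why the platform rejects the stream. The station URL is the asker's; the rest is an assumption, and Android's built-in RTSP support is limited to certain transports/codecs, which is often why a stream plays through a relay service but not directly.

        import android.media.AudioManager;
        import android.media.MediaPlayer;

        public class RtspRadio {
            public void play() throws java.io.IOException {
                final MediaPlayer player = new MediaPlayer();
                player.setAudioStreamType(AudioManager.STREAM_MUSIC);
                // The asker's stream.
                player.setDataSource("rtsp://wms2.christiannetcast.com/yes-fm");
                player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                    public void onPrepared(MediaPlayer mp) {
                        mp.start();
                    }
                });
                player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                    public boolean onError(MediaPlayer mp, int what, int extra) {
                        // The what/extra codes say why the stream was rejected.
                        android.util.Log.e("RtspRadio", "error what=" + what + " extra=" + extra);
                        return true;
                    }
                });
                player.prepareAsync();   // asynchronous prepare, suitable for network streams
            }
        }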

    Read the article

  • Use an "x-dom-event-stream" stream in JavaScript?

    - by rnaud
    Hello, the HTML5 draft contains an API called EventSource to stream data (notifications) through JavaScript using only one server call. Looking it up, I found an example on Opera Labs of the JavaScript part:

        document.getElementsByTagName("event-source")[0]
            .addEventListener("server-time", eventHandler, false);

        function eventHandler(event) {
            // Alert time sent by the server
            alert(event.data);
        }

    and the server-side part:

        <?php
        header("Content-Type: application/x-dom-event-stream");
        while(true) {
            echo "Event: server-time\n";
            $time = time();
            echo "data: $time\n";
            echo "\n";
            flush();
            sleep(3);
        }
        ?>

    But as of today, it seems only Opera has implemented the API; neither Chrome nor Safari has a working version (am I wrong here?). So my question is: is there any other way in JavaScript, maybe more complex, to use this one stream to get data?

    Read the article

  • Actionscript: NetStream stutters after buffering.

    - by meandmycode
    Using NetStream to stream content over HTTP, I've noticed that, especially with certain exported H.264s, if the player encounters an empty buffer, it will stop and buffer to the requested length (as expected). However, once the buffer is full, playback doesn't resume normally but instead jumps ahead, instantly playing the buffered duration in a brief moment and thus triggering an empty buffer again... this then continues over and over. Presumably when the NetStream pauses to buffer, the playhead position continues, and the player attempts to snap to that position on resume; however, given it could take 5 seconds to build a 2-second buffer, it ends up with a useless buffer again (this is an assumption). I've attempted to work around this by listening for an empty-buffer NetStatus event, pausing the stream, and at the same time setting up a loop to check the current buffer length vs. the requested buffer length, resuming once the buffer length is greater than or equal to the requested buffer. However, this causes problems when there isn't enough of the video remaining: for example, with a 10-second buffer and only 5 seconds remaining, the loop just sits there waiting for a buffer length of 10 seconds when there's only 5 left. You would think you could simply check which was smaller, the time left or the requested buffer length; however, the times Flash gives are not accurate. If you add the NetStream's current time index plus the buffered time, the total is not the entire duration of the movie (when at the end); it is close but not the same. This brings me back to the original problem, and whether there is another way to fix this: clearly Flash knows when the buffer is ready, so how can I get Flash to pause when it buffers and resume once the buffer is ready? Currently it doesn't; it pauses and then, once the buffer is full, it plays the entire buffered content in about 0.1 of a second. Thanks in advance, Stephen.

    Read the article

  • Capture webcam stream and send it over the network using DirectShow.NET

    - by SR Dusad
    Hi all, I am working on a video conference project in VS2010 with C#. I am able to capture snaps from the webcam into a picture box with the help of the DirectShowNet samples available on SourceForge, but I can't find any proper solution for capturing the audio/video stream directly from the webcam using DirectShowNet. If anybody knows a solution to this problem, please give me your advice. Waiting for your response...

    Read the article

  • hosting environment for delivering FLVs [closed]

    - by Gotys
    What would be the ideal hardware setup for pushing lots of bandwidth on a tube site? We have ever-expanding cloud storage where users upload the movies, then we have these web-delivery machines which cache the FLV files on their local hard drives and deliver them to users. Each cache machine can deliver 1200 Mbit/s if it has 8 SAS hard drives. Such a cache machine costs us $550/month for 8x160GB, so each machine can cache only 160GB at any given time. If we want to cache more than 160GB, we need to add another machine... another $550/month... etc. This is very uneconomical, so I am wondering if we have any experts here who can figure out a better setup. I've been looking into GlusterFS, but I am not sure if this thing can push a lot of bandwidth. Any ideas highly appreciated. Thank you!

    Read the article

  • Is there a way to make PHP progressively output as the script executes?

    - by Iain Fraser
    So I'm writing a disposable script for my own personal single use and I want to be able to see how the process is going. Basically I'm processing a couple of thousand media releases and sending them to our new CMS. So I don't hammer the CMS, I'm making the script sleep for a couple of seconds after every 5 requests. I would like, as the script is executing, to be able to see my echoes telling me the script is going to sleep or that the last transaction with the web service was successful. Is this possible in PHP? Thanks for your help! Iain

    Read the article

  • Video encoding Help

    - by Pedro
    Hi guys, I'm doing some research on video encoding tools for FLV. I tested flvtool2 and yamdi, but I'm losing a lot of video quality. Can anyone recommend any other tool or algorithm to keep the maximum quality of the movie in FLV? Regards, Pedro

    Read the article

  • Man pages for libvlc

    - by mawia
    Hi all. Though I am not sure whether this question belongs here, pardon me if not. Can you guide me to the manual pages for libvlc functions like the following (just give a pointer to where these functions are described in detail):

        void    libvlc_playlist_pause( libvlc_instance *, libvlc_exception )
        mtime_t libvlc_input_get_length( libvlc_input_t *, libvlc_exception )
        mtime_t libvlc_input_get_time( libvlc_input_t *, libvlc_exception )
        void    libvlc_input_set_time( libvlc_input_t *, mtime_t , libvlc_exception )
        float   libvlc_input_get_position( libvlc_input_t *, libvlc_exception )
        void    libvlc_input_set_position( libvlc_input_t *, float , libvlc_exception )
        void    libvlc_set_rate( libvlc_input_t *, float rate, libvlc_exception )
        float   libvlc_get_rate( libvlc_input_t *, libvlc_exception )
                libvlc_input_get_information( libvlc_input_t *, libvlc_exception )

    In particular, can you please describe the functioning of libvlc_playlist_pause? I am using it in my application to run a video stream. My video is running, but since the video file is coming over a network, I need to pause the player for a particular amount of time until enough data is buffered. With regards, Mawia

    Read the article

  • cancel stream request from WCF server to client

    - by ArsenMkrt
    Hi, I posted about a stream request here: http://stackoverflow.com/questions/853448/wcf-chunk-data-with-stream I solved that task, but now, when I close the request on the client side, the server continues to send data. Is it possible to cancel a stream request from a WCF server to the client?

    Read the article

  • How to stream your images/files with VLC? (C# .Net)

    - by Ole Jak
    So I know there are a lot of wrappers for libVLC.dll, but I just do not know which one is ready to do what I need... What I need is simple: in my C# program I create some bitmap (once or twice per second)... I now want to stream these bitmaps live as video (in some format VLC can offer me) to some http://localhost:port/ using VLC... Which libvlc.dll wrapper can help me with that?

    Read the article

  • Displaying Video using a Window Handle

    - by fergs
    I'm working on a C# wrapper for Dallmeier cameras and currently have a working wrapper. I can connect to a camera by passing a window handle (in my application it is a picture box handle); this is used to send video and messages. Once connected, I can then send the StartLiveView command and a live video stream will be shown in the picture box. Can someone explain how this works by just giving the window handle? And how can I grab an image from this stream when Picturebox1.Image is null?

    Read the article

  • DirectShow: Video-Preview and Image (with working code)

    - by xsl
    Questions / Issues: If someone can recommend a good free hosting site, I can provide the whole project file. As mentioned in the text below, the TakePicture() method is not working properly on the HTC HD2 device. It would be nice if someone could look at the code below and tell me whether what I'm doing is right or wrong.

    Introduction: I recently asked a question about displaying a video preview, taking a camera image and rotating a video stream with DirectShow. The tricky thing about the topic is that it's very hard to find good examples, and the documentation and the framework itself are very hard to understand for someone who is new to Windows programming and C++ in general. Nevertheless I managed to create a class that implements most of these features and probably works with most mobile devices. Probably, because the DirectShow implementation depends a lot on the device itself. I could only test it with the HTC HD and HTC HD2, which are known as quite incompatible.

    HTC HD -- Working: video preview, writing photo to file. Not working: set video resolution (CRASH), set photo resolution (LOW quality).
    HTC HD2 -- Working: set video resolution, set photo resolution. Problematic: video preview rotated. Not working: writing photo to file.

    To make it easier for others by providing a working example, I decided to share everything I have got so far below. I removed all of the error handling for the sake of simplicity. As far as documentation goes, I can recommend reading the MSDN documentation; after that the code below is pretty straightforward.

        void Camera::Init()
        {
            CreateComObjects();
            _captureGraphBuilder->SetFiltergraph(_filterGraph);
            InitializeVideoFilter();
            InitializeStillImageFilter();
        }

    Display a video preview (working with any tested handheld):

        void Camera::DisplayVideoPreview(HWND windowHandle)
        {
            IVideoWindow *_vidWin;
            _filterGraph->QueryInterface(IID_IMediaControl, (void **) &_mediaControl);
            _filterGraph->QueryInterface(IID_IVideoWindow, (void **) &_vidWin);
            _videoCaptureFilter->QueryInterface(IID_IAMVideoControl, (void**) &_videoControl);
            _captureGraphBuilder->RenderStream(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video,
                _videoCaptureFilter, NULL, NULL);

            CRect rect;
            long width, height;
            GetClientRect(windowHandle, &rect);
            _vidWin->put_Owner((OAHWND)windowHandle);
            _vidWin->put_WindowStyle(WS_CHILD | WS_CLIPSIBLINGS);
            _vidWin->get_Width(&width);
            _vidWin->get_Height(&height);
            height = rect.Height();
            _vidWin->put_Height(height);
            _vidWin->put_Width(rect.Width());
            _vidWin->SetWindowPosition(0, 0, rect.Width(), height);
            _mediaControl->Run();
        }

    HTC HD2: if SetPhotoResolution() has been called, FindPin will return E_FAIL; if not, it will create a file full of null bytes. HTC HD: works.

        void Camera::TakePicture(WCHAR *fileName)
        {
            CComPtr<IFileSinkFilter> fileSink;
            CComPtr<IPin> stillPin;
            CComPtr<IUnknown> unknownCaptureFilter;
            CComPtr<IAMVideoControl> videoControl;

            _imageSinkFilter.QueryInterface(&fileSink);
            fileSink->SetFileName(fileName, NULL);
            _videoCaptureFilter.QueryInterface(&unknownCaptureFilter);
            _captureGraphBuilder->FindPin(unknownCaptureFilter, PINDIR_OUTPUT,
                &PIN_CATEGORY_STILL, &MEDIATYPE_Video, FALSE, 0, &stillPin);
            _videoCaptureFilter.QueryInterface(&videoControl);
            videoControl->SetMode(stillPin, VideoControlFlag_Trigger);
        }

    Set resolution: works great on HTC HD2. HTC HD won't allow SetVideoResolution() and only offers one low-resolution photo resolution:

        void Camera::SetVideoResolution(int width, int height)
        {
            SetResolution(true, width, height);
        }

        void Camera::SetPhotoResolution(int width, int height)
        {
            SetResolution(false, width, height);
        }

        void Camera::SetResolution(bool video, int width, int height)
        {
            IAMStreamConfig *config;
            config = NULL;
            if (video)
            {
                _captureGraphBuilder->FindInterface(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video,
                    _videoCaptureFilter, IID_IAMStreamConfig, (void**) &config);
            }
            else
            {
                _captureGraphBuilder->FindInterface(&PIN_CATEGORY_STILL, &MEDIATYPE_Video,
                    _videoCaptureFilter, IID_IAMStreamConfig, (void**) &config);
            }

            int resolutions, size;
            VIDEO_STREAM_CONFIG_CAPS caps;
            config->GetNumberOfCapabilities(&resolutions, &size);
            for (int i = 0; i < resolutions; i++)
            {
                AM_MEDIA_TYPE *mediaType;
                if (config->GetStreamCaps(i, &mediaType, reinterpret_cast<BYTE*>(&caps)) == S_OK)
                {
                    int maxWidth = caps.MaxOutputSize.cx;
                    int maxHeigth = caps.MaxOutputSize.cy;
                    if (maxWidth == width && maxHeigth == height)
                    {
                        VIDEOINFOHEADER *info = reinterpret_cast<VIDEOINFOHEADER*>(mediaType->pbFormat);
                        info->bmiHeader.biWidth = maxWidth;
                        info->bmiHeader.biHeight = maxHeigth;
                        info->bmiHeader.biSizeImage = DIBSIZE(info->bmiHeader);
                        config->SetFormat(mediaType);
                        DeleteMediaType(mediaType);
                        break;
                    }
                    DeleteMediaType(mediaType);
                }
            }
        }

    Other methods used to build the filter graph and create the COM objects:

        void Camera::CreateComObjects()
        {
            CoInitialize(NULL);
            CoCreateInstance(CLSID_CaptureGraphBuilder, NULL, CLSCTX_INPROC_SERVER,
                IID_ICaptureGraphBuilder2, (void **) &_captureGraphBuilder);
            CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                IID_IGraphBuilder, (void **) &_filterGraph);
            CoCreateInstance(CLSID_VideoCapture, NULL, CLSCTX_INPROC,
                IID_IBaseFilter, (void**) &_videoCaptureFilter);
            CoCreateInstance(CLSID_IMGSinkFilter, NULL, CLSCTX_INPROC,
                IID_IBaseFilter, (void**) &_imageSinkFilter);
        }

        void Camera::InitializeVideoFilter()
        {
            _videoCaptureFilter->QueryInterface(&_propertyBag);
            wchar_t deviceName[MAX_PATH] = L"\0";
            GetDeviceName(deviceName);
            CComVariant comName = deviceName;
            CPropertyBag propertyBag;
            propertyBag.Write(L"VCapName", &comName);
            _propertyBag->Load(&propertyBag, NULL);
            _filterGraph->AddFilter(_videoCaptureFilter, L"Video Capture Filter Source");
        }

        void Camera::InitializeStillImageFilter()
        {
            _filterGraph->AddFilter(_imageSinkFilter, L"Still image filter");
            _captureGraphBuilder->RenderStream(&PIN_CATEGORY_STILL, &MEDIATYPE_Video,
                _videoCaptureFilter, NULL, _imageSinkFilter);
        }

        void Camera::GetDeviceName(WCHAR *deviceName)
        {
            HRESULT hr = S_OK;
            HANDLE handle = NULL;
            DEVMGR_DEVICE_INFORMATION di;
            GUID guidCamera = { 0xCB998A05, 0x122C, 0x4166,
                0x84, 0x6A, 0x93, 0x3E, 0x4D, 0x7E, 0x3C, 0x86 };
            di.dwSize = sizeof(di);
            handle = FindFirstDevice(DeviceSearchByGuid, &guidCamera, &di);
            StringCchCopy(deviceName, MAX_PATH, di.szLegacyName);
        }

    Full header file:

        #ifndef __CAMERA_H__
        #define __CAMERA_H__

        class Camera
        {
        public:
            void Init();
            void DisplayVideoPreview(HWND windowHandle);
            void TakePicture(WCHAR *fileName);
            void SetVideoResolution(int width, int height);
            void SetPhotoResolution(int width, int height);

        private:
            CComPtr<ICaptureGraphBuilder2> _captureGraphBuilder;
            CComPtr<IGraphBuilder> _filterGraph;
            CComPtr<IBaseFilter> _videoCaptureFilter;
            CComPtr<IPersistPropertyBag> _propertyBag;
            CComPtr<IMediaControl> _mediaControl;
            CComPtr<IAMVideoControl> _videoControl;
            CComPtr<IBaseFilter> _imageSinkFilter;

            void GetDeviceName(WCHAR *deviceName);
            void InitializeVideoFilter();
            void InitializeStillImageFilter();
            void CreateComObjects();
            void SetResolution(bool video, int width, int height);
        };

        #endif

    Read the article

  • How to use RTPSocket to send RTP packets

    - by Afro Genius
    Hi there, I am relatively new to JMF but have gone through the documentation and have a sufficient understanding of how it works. That being said, I am having some trouble implementing the server side for RTPSockets. After looking at their illustrations and example, I am still a bit confused. Am I to develop DataSource and DataSink classes to handle the transfer? What I am trying to do is stream data from my application to the underlying network and receive it back through another application. I understand receiving, but just can't get my head around the steps involved in sending. Any help would be most appreciated.
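
    For comparison, here is a rough sketch (my own, not from the question) of the simpler RTPManager route for the sending side; it skips RTPSocket entirely, uses placeholder addresses/ports and a placeholder media locator, and replaces proper event-driven state handling with crude polling, so treat the details as assumptions.

        import java.net.InetAddress;
        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Processor;
        import javax.media.protocol.ContentDescriptor;
        import javax.media.protocol.DataSource;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.SendStream;
        import javax.media.rtp.SessionAddress;

        public class RtpSender {
            public static void main(String[] args) throws Exception {
                // Capture or file source to transmit; the locator is a placeholder.
                Processor processor = Manager.createProcessor(new MediaLocator("file:sample.wav"));
                processor.configure();
                waitFor(processor, Processor.Configured);
                // Ask the processor to produce raw RTP-framed output.
                processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
                processor.realize();
                waitFor(processor, Processor.Realized);

                DataSource output = processor.getDataOutput();

                // One RTPManager per session; addresses and ports are placeholders.
                RTPManager rtpManager = RTPManager.newInstance();
                rtpManager.initialize(new SessionAddress(InetAddress.getLocalHost(), 42050));
                rtpManager.addTarget(new SessionAddress(InetAddress.getByName("192.168.0.10"), 42052));

                SendStream stream = rtpManager.createSendStream(output, 0); // first track
                stream.start();
                processor.start();
            }

            // Crude polling instead of listening for TransitionEvents, to keep the sketch short.
            private static void waitFor(Processor p, int state) throws InterruptedException {
                while (p.getState() < state) {
                    Thread.sleep(50);
                }
            }
        }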

    Read the article

  • Dividing a Video into Frames and Sending Frames to Streams

    - by Amit Kumar
    I have to implement a "demux" that divides up a video stream and sends each frame to one of multiple output streams in a round-robin fashion. I am trying to implement the demux as follows. The video stream contains one frame after another and is implemented via a Java InputStream. Each frame has a frame header followed by the image data. The demux needs to read the frame header to know the size of the image data. The image data can then be redirected from the input video stream to one of the output streams (Java OutputStream). My problem is how to implement this redirection, that is, connect the InputStream to an OutputStream to send N bytes (here N is the size of the image data), and then disconnect and connect to another OutputStream. I have looked at the interfaces of PipedInputStream etc., but they do not seem to support disconnection.
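
    As a rough illustration (my own sketch, with an invented 4-byte length header since the real header format isn't given), the redirection itself can be a plain bounded copy loop that picks the next OutputStream per frame; no piped streams are needed.

        import java.io.DataInputStream;
        import java.io.EOFException;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;

        public class FrameDemux {
            // Assumed header: a single 4-byte big-endian image-data length; adjust to the real format.
            public static void demux(InputStream video, OutputStream[] outputs) throws IOException {
                DataInputStream in = new DataInputStream(video);
                byte[] buffer = new byte[8192];
                int turn = 0;
                while (true) {
                    int imageSize;
                    try {
                        imageSize = in.readInt();           // read the frame header
                    } catch (EOFException end) {
                        break;                              // no more frames
                    }
                    OutputStream target = outputs[turn];
                    turn = (turn + 1) % outputs.length;     // round-robin selection
                    int remaining = imageSize;
                    while (remaining > 0) {                 // copy exactly N bytes, then "disconnect"
                        int read = in.read(buffer, 0, Math.min(buffer.length, remaining));
                        if (read < 0) throw new EOFException("stream ended mid-frame");
                        target.write(buffer, 0, read);
                        remaining -= read;
                    }
                    target.flush();
                }
            }
        }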

    Read the article

  • synchronizing audio over a network

    - by sharkin
    I'm in the early stages of designing a client/server audio system which can stream audio arbitrarily over a network. One central server pumps out an audio stream, and X number of clients receive the audio data and play it. So far no magic is needed, and I have even got this scenario to work with VLC media player out of the box. However, the tricky part seems to be synchronizing the audio playback so that all clients are in audible sync (actual latency can be allowed as long as it is perceived to be in sync by a human listener). My question is whether there's any known method or algorithm for these types of synchronization problems (video is probably solved the same way). My own initial thoughts center around synchronizing clocks between physical machines, thereby creating a virtual "main timer", and somehow aligning audio data packets against it. Some products already solving the problem: http://www.sonos.com http://netchorus.com/ Any pointers are most welcome. Thanks. PS: This related question seems to have died long ago.
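
    To make the "virtual main timer" idea concrete, here is a small NTP-style sketch (my own, purely illustrative, with made-up timestamp values) of how a client could estimate its offset from the server clock and translate a server-scheduled play time into local time:

        // Estimates the offset between a client clock and the server clock using the
        // classic NTP formula, assuming the server answers a time request with the
        // timestamps it recorded on receive and on send.
        public class ClockSync {

            // t0: client time when the request was sent
            // t1: server time when the request arrived
            // t2: server time when the reply was sent
            // t3: client time when the reply arrived
            // returns the estimated (serverClock - clientClock) in milliseconds
            public static long estimateOffset(long t0, long t1, long t2, long t3) {
                return ((t1 - t0) + (t2 - t3)) / 2;
            }

            // Converts a playback deadline expressed in server time into local client time.
            public static long toLocalTime(long serverPlayTime, long offset) {
                return serverPlayTime - offset;
            }

            public static void main(String[] args) {
                // Hypothetical measurements: the server clock is about 500 ms ahead of the client.
                long offset = estimateOffset(1000, 1520, 1521, 1041);
                long localDeadline = toLocalTime(5000, offset);
                System.out.println("offset=" + offset + "ms, play at local t=" + localDeadline);
            }
        }

    Each client would then schedule a given audio packet to start at the same server-time deadline, converted to its own local clock.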

    Read the article

  • record output sound in python

    - by aaronstacy
    I want to programmatically record the sound coming out of my laptop in Python. I found PyAudio and came up with the following program that accomplishes the task:

        import pyaudio, wave, sys

        chunk = 1024
        FORMAT = pyaudio.paInt16
        CHANNELS = 1
        RATE = 44100
        RECORD_SECONDS = 5
        WAVE_OUTPUT_FILENAME = sys.argv[1]

        p = pyaudio.PyAudio()
        channel_map = (0, 1)
        stream_info = pyaudio.PaMacCoreStreamInfo(
            flags = pyaudio.PaMacCoreStreamInfo.paMacCorePlayNice,
            channel_map = channel_map)
        stream = p.open(format = FORMAT,
                        rate = RATE,
                        input = True,
                        input_host_api_specific_stream_info = stream_info,
                        channels = CHANNELS)

        all = []
        for i in range(0, RATE / chunk * RECORD_SECONDS):
            data = stream.read(chunk)
            all.append(data)
        stream.close()
        p.terminate()

        data = ''.join(all)
        wf = wave.open(WAVE_OUTPUT_FILENAME, 'wb')
        wf.setnchannels(CHANNELS)
        wf.setsampwidth(p.get_sample_size(FORMAT))
        wf.setframerate(RATE)
        wf.writeframes(data)
        wf.close()

    The problem is I have to connect the headphone jack to the microphone jack. I tried replacing these lines:

        input = True,
        input_host_api_specific_stream_info = stream_info,

    with these:

        output = True,
        output_host_api_specific_stream_info = stream_info,

    but then I get this error:

        Traceback (most recent call last):
          File "./test.py", line 25, in
            data = stream.read(chunk)
          File "/Library/Python/2.5/site-packages/pyaudio.py", line 562, in read
            paCanNotReadFromAnOutputOnlyStream)
        IOError: [Errno Not input stream] -9975

    Is there a way to instantiate the PyAudio stream so that it inputs from the computer's output and I don't have to connect the headphone jack to the microphone? Is there a better way to go about this? I'd prefer to stick with a Python app and avoid Cocoa.

    Read the article

  • Videoconference using Flash and SIP

    - by Júlio Santos
    The front-end will be Flash, to run in a browser and have access to the camera. I must use SIP to control the sessions. How could I do this? Would a Red5 server and an MjSip server do the trick? As in, I'd use MjSip to set up the session and warn users about calls, and Red5 to stream the video and audio? Any suggestions? Note: only 1-on-1 conferencing is required.

    Read the article

  • getAudioInputStream can not convert [stereo, 4 bytes/frame] stream to [mono, 2 bytes/frame]

    - by brian_d
    Hello. I am using Java Sound and have an AudioInputStream with the format PCM_SIGNED 8000.0 Hz, 16 bit, stereo, 4 bytes/frame, little-endian. Using AudioSystem.getAudioInputStream(target_format, original_stream) produces an 'IllegalArgumentException: Unsupported Conversion' when the target_format is PCM_SIGNED 8000.0 Hz, 16 bit, mono, 2 bytes/frame, little-endian. Is it possible to convert this stream manually after every read() call? And if yes, how? In general, how can you compare two formats and tell whether a conversion is possible?
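
    One way to do the manual conversion the question asks about is to mix the two 16-bit little-endian channels of each 4-byte frame down to one channel after every read(). A rough sketch (my own, assuming the buffer holds whole stereo frames); the output keeps 16-bit samples, so it matches the mono, 2 bytes/frame target format:

        // Mixes 16-bit little-endian stereo PCM down to mono by averaging left and right.
        // The input length must be a multiple of 4 (one stereo frame); the output is half as long.
        public class StereoToMono {
            public static byte[] mixDown(byte[] stereo, int length) {
                byte[] mono = new byte[length / 2];
                for (int in = 0, out = 0; in + 3 < length; in += 4, out += 2) {
                    // Decode little-endian samples for the left and right channels.
                    int left  = (short) ((stereo[in + 1] << 8) | (stereo[in] & 0xff));
                    int right = (short) ((stereo[in + 3] << 8) | (stereo[in + 2] & 0xff));
                    int mixed = (left + right) / 2;
                    // Encode the averaged sample back as little-endian.
                    mono[out]     = (byte) (mixed & 0xff);
                    mono[out + 1] = (byte) ((mixed >> 8) & 0xff);
                }
                return mono;
            }
        }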

    Read the article

  • High-Performance In-Browser Networking

    - by Jon Purdy
    (Similar in spirit to but different in practice from this question.) Is there any cross-browser-compatible, in-browser technology that allows a high-performance persistent network connection between a server application and a client written in, say, JavaScript? Think XmlHttpRequest on caffeine. I am working on a visualisation system that's restricted to at most a few users at once, and the server is pretty robust, so it can handle as much as it needs to. I would like to allow the client to have access to video streamed from the server at a minimum of about 20 frames per second, regardless of what their graphics hardware capabilities are. Simply put: is this doable without resorting to Flash or Java?

    Read the article

  • Problems with Activity lifecycle and VideoView playback

    - by Alex Volovoy
    Hi all, I've run into another problem with VideoView. When video is playing and I put the device to sleep using the hard button, onPause is called, but it is followed by:

        03-17 11:26:33.779: WARN/ActivityManager(884): Activity pause timeout for HistoryRecord{4359f620 com.package/com.package.VideoViewActivity}

    And then I get onStart/onResume again and the video starts playing. I've tried moving the code around onStart/onStop, but it doesn't seem to make a difference. Sample code:

        public class VideoViewActivity extends Activity {
            private String path = "";
            private VideoView mVideoView;
            private static final String MEDIA_URL = "media_url";

            @Override
            public void onCreate(Bundle icicle) {
                super.onCreate(icicle);
                setContentView(R.layout.videoview);
                mVideoView = (VideoView) findViewById(R.id.surface_view);
                path = getIntent().getStringExtra(MEDIA_URL);
            }

            @Override
            public void onResume() {
                super.onResume();
                mVideoView.setVideoPath(path);
                mVideoView.setMediaController(new MediaController(this));
                mVideoView.requestFocus();
                mVideoView.start();
            }

            @Override
            public void onPause() {
                super.onPause();
                mVideoView.stopPlayback();
                mVideoView.setMediaController(null);
            }
        }
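
    As an aside (my own sketch, not part of the question, and it does not address the pause-timeout warning itself): a common lifecycle pattern is to remember the playback position in onPause() and seek back to it in onResume(), so the forced restart at least resumes where playback stopped. The extra field is an assumption.

        import android.app.Activity;
        import android.os.Bundle;
        import android.widget.MediaController;
        import android.widget.VideoView;

        public class VideoViewActivity extends Activity {
            private static final String MEDIA_URL = "media_url";
            private String path = "";
            private VideoView mVideoView;
            private int mSavedPosition = 0;   // playback position to restore after a pause

            @Override
            public void onCreate(Bundle icicle) {
                super.onCreate(icicle);
                setContentView(R.layout.videoview);
                mVideoView = (VideoView) findViewById(R.id.surface_view);
                path = getIntent().getStringExtra(MEDIA_URL);
            }

            @Override
            public void onResume() {
                super.onResume();
                mVideoView.setVideoPath(path);
                mVideoView.setMediaController(new MediaController(this));
                mVideoView.requestFocus();
                if (mSavedPosition > 0) {
                    mVideoView.seekTo(mSavedPosition);   // resume where playback stopped
                }
                mVideoView.start();
            }

            @Override
            public void onPause() {
                super.onPause();
                mSavedPosition = mVideoView.getCurrentPosition();
                mVideoView.stopPlayback();
                mVideoView.setMediaController(null);
            }
        }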

    Read the article

  • ShoutCast over SSL

    - by Honus Wagner
    So I've gone ahead and set up my ShoutCast DNAS server and set up my DSP in Winamp on my host computer. The server listens on port 8000, so per some instructions I installed an output plugin for Winamp (Shoutcast DSP) and used port 8000 and the password to connect. The server accepts the connection. Now, what do I do? My host computer is SSL secured and the DNAS server is installed within the secure web directory (if that matters). My desired end result is that I want to listen to my ShoutCast setup at home (host computer) from any computer. I try browsing to my IP address and port 8000 (without using HTTPS) and it comes back with nothing. If I browse with https://my.server.com:8000, I get (Error code: ssl_error_rx_record_too_long). Have I completely missed something, or am I just a total moron? Thanks.

    Read the article

  • Web P2P video conference solution

    - by dtroy
    I'm looking for the best possible solution that will allow me to incorporate live video/audio conferencing between 2 users (only 2 at this point) into a Flash gaming platform. The video chat is not just an extra feature; it's the main one. I'm mainly looking at open-source implementations or something I'll be able to implement myself, but will consider commercial products if they are exactly what I need. Here are a few things I've looked at, but so far I haven't found any of them good enough: 1) Flash Player 10's P2P capabilities sound promising, but I am aware that Adobe has not released any information on the RTMFP protocol and that there is no commercial server which supports it at this point. 2) Streaming all the video/audio live through a Flash server (not P2P), but from my personal experience you don't get a smooth conversation; I think TokBox uses this method. 3) Java applets are a possible solution too (to perform the P2P), but I don't think it will be a nice and elegant solution to combine them in the game at this point (and they require the user to authorize them). BTW, I couldn't find any useful implementations, so if you know of any, I'll look into them. 4) Google Gmail Video Chat uses a custom (and proprietary) browser plug-in which does the P2P and streams the video/audio into the Flash player. This is a possible solution, but I'd rather not implement the entire P2P protocol stack plus a browser plug-in at this stage, and instead concentrate on other aspects of the game itself. I think they are using an XMPP-based protocol similar to Jingle, and they've released a Jingle library but without the video-conferencing implementation. EDIT: In response to Branden: I am aware of Adobe Stratus. Stratus is a beta, hosted rendezvous service that aids establishing communications between Flash Player endpoints (an RTMFP server). The current release of Stratus is a prerelease and is designed for evaluation purposes only. The service is not final. There is no guarantee that the service will continue to exist in the future, nor any information about future cost. That's why I don't think it can be used as a commercial solution, at least not yet. I'd appreciate your suggestions and advice. Thanks!

    Read the article
