Search Results

Search found 3772 results on 151 pages for 'music streaming'.

Page 48/151 | < Previous Page | 44 45 46 47 48 49 50 51 52 53 54 55  | Next Page >

  • Problems with Activity lifecycle during VideoView playback.

    - by Alex Volovoy
    Hi all, I've run into another problem with VideoView. When a video is playing and I put the device to sleep using the hard button, onPause is called, but it is followed by 03-17 11:26:33.779: WARN/ActivityManager(884): Activity pause timeout for HistoryRecord{4359f620 com.package/com.package.VideoViewActivity} and then onStart/onResume run again and the video starts playing. I've tried moving the code around between onStart/onStop, but that doesn't seem to make a difference. Sample code:

        public class VideoViewActivity extends Activity {
            private String path = "";
            private VideoView mVideoView;
            private static final String MEDIA_URL = "media_url";

            @Override
            public void onCreate(Bundle icicle) {
                super.onCreate(icicle);
                setContentView(R.layout.videoview);
                mVideoView = (VideoView) findViewById(R.id.surface_view);
                path = getIntent().getStringExtra(MEDIA_URL);
            }

            @Override
            public void onResume() {
                super.onResume();
                mVideoView.setVideoPath(path);
                mVideoView.setMediaController(new MediaController(this));
                mVideoView.requestFocus();
                mVideoView.start();
            }

            @Override
            public void onPause() {
                super.onPause();
                mVideoView.stopPlayback();
                mVideoView.setMediaController(null);
            }
        }
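    One workaround that is sometimes suggested for this kind of lifecycle problem (a minimal sketch, not from the original post, assuming setVideoPath/setMediaController are moved to onCreate, and untested against this exact pause-timeout warning) is to remember the playback position in onPause and restore it in onResume, pausing the player rather than tearing it down with stopPlayback():

        private int mSavedPosition = 0;

        @Override
        public void onPause() {
            super.onPause();
            // Remember where playback was and pause, instead of stopping the player.
            mSavedPosition = mVideoView.getCurrentPosition();
            mVideoView.pause();
        }

        @Override
        public void onResume() {
            super.onResume();
            // Resume from the saved position instead of restarting from zero.
            if (mSavedPosition > 0) {
                mVideoView.seekTo(mSavedPosition);
            }
            mVideoView.start();
        }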

    Read the article

  • ShoutCast over SSL

    - by Honus Wagner
    So I've gone ahead and set up my ShoutCast DNAS server and set up my DSP in Winamp on my host computer. The server listens on port 8000, so per some instructions I installed an output plugin for Winamp (Shoutcast DSP) and used port 8000 and the password to connect. The server accepts the connection. Now, what the heck do I do next? My host computer is SSL secured and the DNAS server is installed within the secure web directory (if that matters). My desired end result is that I want to listen to my ShoutCast setup at home (host computer) from any computer. I try browsing to my IP address and port 8000 (without using HTTPS) and it comes back with nothing. If I browse to https://my.server.com:8000, I get the error code ssl_error_rx_record_too_long. Have I completely missed something, or am I just a total moron? Thanks.

    Read the article

  • Web P2P video conference solution

    - by dtroy
    I'm looking for the best possible solution to incorporate live video/audio conferencing between 2 users (only 2 at this point) into a Flash gaming platform. The video chat is not just an extra feature, it's the main one. I'm mainly looking at open source implementations or something I'll be able to implement myself, but I will consider commercial products if they are exactly what I need. Here are a few things I've looked at, but so far none of them seems good enough: Flash Player 10's P2P capabilities sound promising, but I am aware that Adobe has not released any information on the RTMFP protocol and that there is no commercial server which supports it at this point. Streaming all the video/audio live through a Flash server (not P2P) is possible, but from my personal experience you don't get a smooth conversation; I think TokBox uses this method. Java applets are a possible solution too (to perform the P2P part), but I don't think it would be a nice and elegant solution to combine them with the game at this point (and they require the user to authorize them). BTW, I couldn't find any useful implementations, so if you know of any, I'll look into them. Google Gmail Video Chat uses a custom (and proprietary) browser plug-in which does the P2P and streams the video/audio into the Flash player. This is a possible solution, but I'd rather not implement the entire P2P protocol stack plus a browser plug-in at this stage, and instead concentrate on other aspects of the game itself. I think they are using an XMPP-based protocol similar to Jingle, and they've released a Jingle library, but without the video conferencing implementation. EDIT: In response to Branden: I am aware of Adobe Stratus. Stratus is a beta, hosted rendezvous service that aids establishing communications between Flash Player endpoints (an RTMFP server). The current release of Stratus is a prerelease and is designed for evaluation purposes only. The service is not final. There is no guarantee that the service will continue to exist in the future, and there is no information about future cost. That's why I don't think it can be used as a commercial solution, at least not yet. I'd appreciate your suggestions and advice. Thanks!

    Read the article

  • How to play .3gp videos on mobile using RTMP (FMS) and HTTP?

    - by Sunil Kumar
    Hi, I am not able to play video on a mobile device; the file is in a .3gp container and is H.263/AMR_NB encoded. I just want to play my website's videos on mobile devices as well, just like youtube.com. I want to use both RTMP and HTTP. My requirements are as follows: Which codec and container will be best? Should I use FLV to play video on a mobile device? Is RTSP required, or can RTMP be used? Are the NetStream and NetConnection methods in Flash Lite Player different from those in Flash Player? How do I play 3gp video using an RTMP stream, i.e. is ns.play("mp4:mobilevideo.3gp", 0, -1, true) OK, or is anything else required? For mobile browsers and computer browsers, can I use a single player, or do I have to make a different player for each? It would be better if I could do it with a single player for both mobile and computer browsers. Sample code would be appreciated for testing, if you can. I found the articles below, which mention that we can play video from a 3gp container on mobile as well. Article URLs: http://www.hsharma.com/tech/articles/flash-lite-30-video-formats-and-video-volume/ http://www.adobe.com/devnet/logged_in/dmotamedi_fms3.html Thanks, Sunil Kumar

    Read the article

  • Mobile Video Detection

    - by aaroninfidel
    Hi, I'm using DeviceAtlas to detect mobile phones. I was wondering if anyone had some good resources on standard codecs, the video dimensions that are typically used, and how to go about serving video to mobile devices. Thanks! -Aaron

    Read the article

  • video calling (center)

    - by rrejc
    We are starting to develop a new application and I'm searching for information/tips/guides on application architecture. The application should: read data from an external (USB) device; send the data to the remote server (over the internet); receive data from the remote server; perform a video call to the calling (support) center; receive a video call from the calling (support) center; and support touch screens. In addition, some of the data should also be visible through a web page. So I was thinking about the following. On the server side: use a database (probably MS SQL); use an ORM (NHibernate) to map the data from the DB to domain objects; create a layer with business logic in C#; create web (WCF) services (for the client application); and create an ASP.NET MVC application (for the web-page requirement) to enable viewing the data through the browser. On the client side I would use a WPF 4 application which will communicate with the external device and the WCF services on the server. So far so good. Now the problem begins. I have no idea how to create the video call (outgoing or incoming) part of the application. I believe there is no problem communicating with the microphone, speakers and camera from WPF/C#. But how do I communicate with the call center? What protocol and encoding should be used? I think that I will need to create some kind of server which will: keep a list of operators in the calling center and track which operators are occupied and which are free; keep a list of connected end users; receive incoming calls from end users and delegate each call to a free operator; and delegate calls from the calling center to the end user. Any info, link, anything on where to start would be much appreciated. Many thanks!

    Read the article

  • Understanding PTS and DTS in video frames

    - by theateist
    I had FPS issues when transcoding from AVI to MP4 (x264). Eventually the problem was in the PTS and DTS values, so lines 12-15 were added before the av_interleaved_write_frame call:

        1.  AVFormatContext* outContainer = NULL;
        2.  avformat_alloc_output_context2(&outContainer, NULL, "mp4", "c:\\test.mp4");
        3.  AVCodec *encoder = avcodec_find_encoder(AV_CODEC_ID_H264);
        4.  AVStream *outStream = avformat_new_stream(outContainer, encoder);
        5.  // outStream->codec initiation
        6.  // ...
        7.  avformat_write_header(outContainer, NULL);
        8.  // reading and decoding packet
        9.  // ...
        10. avcodec_encode_video2(outStream->codec, &encodedPacket, decodedFrame, &got_frame);
        11.
        12. if (encodedPacket.pts != AV_NOPTS_VALUE)
        13.     encodedPacket.pts = av_rescale_q(encodedPacket.pts, outStream->codec->time_base, outStream->time_base);
        14. if (encodedPacket.dts != AV_NOPTS_VALUE)
        15.     encodedPacket.dts = av_rescale_q(encodedPacket.dts, outStream->codec->time_base, outStream->time_base);
        16.
        17. av_interleaved_write_frame(outContainer, &encodedPacket);

    After reading many posts I still do not understand the following. outStream->codec->time_base = 1/25 and outStream->time_base = 1/12800. The first one was set by me, but I cannot figure out who set 12800, or why. I noticed that before line (7) outStream->time_base = 1/90000, and right after it, it changes to 1/12800 - why? When I transcode from AVI to AVI, i.e. change line (2) to avformat_alloc_output_context2(&outContainer, NULL, "avi", "c:\\test.avi"), outStream->time_base always remains 1/25 before and after line (7), unlike in the MP4 case - why? What is the difference between the time_base of outStream->codec and that of outStream? To calculate the PTS, av_rescale_q takes the two time_bases, cross-multiplies their fractions and then computes the PTS. Why does it do it this way? As I debugged, encodedPacket.pts increments by 1, so why change it if it already has a value? At the beginning the DTS value is -2, and after each rescaling it is still negative, but despite this the video plays correctly! Shouldn't it be positive?
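    As a purely arithmetic illustration of what the cross-multiplication does (a simplified sketch of the idea behind av_rescale_q, not FFmpeg's implementation, and it ignores overflow and rounding modes): rescaling from time base 1/25 to time base 1/12800 multiplies a timestamp by (1 * 12800) / (25 * 1) = 512, which is why a PTS that increments by 1 in the codec time base increments by 512 in the stream time base.

        /** Simplified sketch of av_rescale_q: a * bq / cq, ignoring overflow and rounding. */
        public final class RescaleDemo {
            static long rescale(long a, long bqNum, long bqDen, long cqNum, long cqDen) {
                // a * (bqNum/bqDen) / (cqNum/cqDen) == a * bqNum * cqDen / (bqDen * cqNum)
                return a * bqNum * cqDen / (bqDen * cqNum);
            }

            public static void main(String[] args) {
                // Codec time base 1/25, stream time base 1/12800 (the values from the question).
                for (long pts = 0; pts < 4; pts++) {
                    System.out.println(pts + " -> " + rescale(pts, 1, 25, 1, 12800));
                }
                // Prints: 0 -> 0, 1 -> 512, 2 -> 1024, 3 -> 1536
            }
        }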

    Read the article

  • Dimdim Change name

    - by islam
    I built Dimdim v4.5 on my PC and it works fine for me. Each time I want to start a meeting I type my PC's IP address like this: http://<my-ip-address>/dimdim I want to change the word dimdim to something else, like: http://<my-ip-address>/meeting Regards

    Read the article

  • Streaming a non-PCM WAV file to a Silverlight application

    - by Satumba
    Hi, I would like to allow users to play back recorded WAV files, stored on a server, in a Silverlight client application. I saw that there is a way to play a WAV file in Silverlight (here), but when I tried to implement it, I got an error playing the file because it is not in PCM format but encoded. The files that I'm trying to play are encoded with a special encoder, so I thought that the only way is to decode the WAV file on the server and stream it back to the client. The limitation is that the decoding process should occur in real time, because it is not reasonable to convert all the WAV files that exist. Is it possible to do this? Which streamer can I use? (Can Windows Media Services help here?) Does anybody have any experience with such a scenario? I appreciate your help.

    Read the article

  • BlackBerry buffered playback demo?

    - by Bohemian
    Can someone help me buffer an MP3 file from a server using the BlackBerry buffered playback demo app provided with the JDE? I have loaded it in the simulator, and my MDS is started, but I'm unable to play the audio. There is no error, but it doesn't play/load. The code looks all fine. Thanks
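    For comparison, a bare-bones way to stream an MP3 over HTTP on a BlackBerry (a minimal sketch rather than the buffered playback demo itself; the URL is a placeholder and the ";deviceside=false" transport suffix, assumed here to route the request through MDS, may need adjusting for your setup) looks roughly like this:

        import javax.microedition.media.Manager;
        import javax.microedition.media.Player;

        public final class SimpleMp3Playback {
            public static void play() {
                try {
                    // ";deviceside=false" is assumed to send the HTTP request via MDS.
                    Player p = Manager.createPlayer(
                            "http://example.com/sample.mp3;deviceside=false");
                    p.realize();
                    p.prefetch();
                    p.start();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }

    If a direct play like this works in the simulator, the problem is more likely in the demo's buffering/connection code or in the MDS routing than in the media APIs themselves.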

    Read the article

  • Wireshark doesn't recognise RTMP streams

    - by Andrew
    Hello! I found a few samples on the web about tracking RTMP (Real Time Messaging Protocol) with Wireshark, but it doesn't work for me. All RTMPT packets are rendered as basic TCP packets like this: 149 14.324999 85.115.xxx.xxx 192.168.1.20 TCP macromedia-fcs > 54557 [ACK] Seq=1 Ack=1452 Win=69 Len=0 I'm using Wireshark 1.2.8 with all protocols installed, on Windows Vista. What can I do to fix it? Thanks!

    Read the article

  • Creating a customized video using Flash and XML

    - by Aaron Ladage
    The problem: I have to create a Flash video (in CS3) that will query a MySQL database and display that data at certain points in the video. The bigger problem: I'm not a Flash/ActionScript developer, so this is all very foreign to me! I've divided this project into two parts: a.) dynamically generate an XML feed from the data using PHP (using an ID number passed in the URL's query string), and b.) be able to work with it in Flash. I've got the first part working, but am pretty lost in Flash. I can parse the XML, but I'm not sure how to set the data up as variables and attach it to a video's cue points. Can anyone point me in the direction of a good tutorial or offer some advice?

    Read the article

  • Stream (.NET) handling best-practices

    - by Jader Dias
    The question title includes the word "Stream" because the question below is a concrete example of a more general doubt I have about streams. I have a problem with two possible solutions and I want to know which is better: (1) I download a file, save it to disk (2 min), then read it and write the contents to the DB (+2 min); or (2) I download the file and write the contents directly to the DB (3 min). If the write to the DB fails, I'll have to download the file again in the second case, but not in the first. Which is best? Which would you use?

    Read the article

  • Capturing video from an IP camera

    - by Ruby
    I am trying to capture video from an IP camera into my application, and it's throwing this exception:

        com.sun.image.codec.jpeg.ImageFormatException: Not a JPEG file: starts with 0x0d 0x0a
            at sun.awt.image.codec.JPEGImageDecoderImpl.readJPEGStream(Native Method)
            at sun.awt.image.codec.JPEGImageDecoderImpl.decodeAsBufferedImage(Unknown Source)
            at test.AxisCamera1.readJPG(AxisCamera1.java:130)
            at test.AxisCamera1.readMJPGStream(AxisCamera1.java:121)
            at test.AxisCamera1.readStream(AxisCamera1.java:100)
            at test.AxisCamera1.run(AxisCamera1.java:171)
            at java.lang.Thread.run(Unknown Source)

    The exception is thrown at image = decoder.decodeAsBufferedImage(); Here is the code I am trying:

        private static final long serialVersionUID = 1L;
        public boolean useMJPGStream = true;
        public String jpgURL = "http://ip here/video.cgi/jpg/image.cgi?resolution=640x480";
        public String mjpgURL = "http://ip here/video.cgi/mjpg/video.cgi?resolution=640x480";
        DataInputStream dis;
        private BufferedImage image = null;
        public Dimension imageSize = null;
        public boolean connected = false;
        private boolean initCompleted = false;
        HttpURLConnection huc = null;
        Component parent;

        /** Creates a new instance of AxisCamera */
        public AxisCamera1(Component parent_) {
            parent = parent_;
        }

        public void connect() {
            try {
                URL u = new URL(useMJPGStream ? mjpgURL : jpgURL);
                huc = (HttpURLConnection) u.openConnection();
                // System.out.println(huc.getContentType());
                InputStream is = huc.getInputStream();
                connected = true;
                BufferedInputStream bis = new BufferedInputStream(is);
                dis = new DataInputStream(bis);
                if (!initCompleted)
                    initDisplay();
            } catch (IOException e) {
                // in case no connection exists, wait and try again,
                // instead of printing the error
                try {
                    huc.disconnect();
                    Thread.sleep(60);
                } catch (InterruptedException ie) {
                    huc.disconnect();
                    connect();
                }
                connect();
            } catch (Exception e) {
                ;
            }
        }

        public void initDisplay() { // set up the display
            if (useMJPGStream)
                readMJPGStream();
            else {
                readJPG();
                disconnect();
            }
            imageSize = new Dimension(image.getWidth(this), image.getHeight(this));
            setPreferredSize(imageSize);
            parent.setSize(imageSize);
            parent.validate();
            initCompleted = true;
        }

        public void disconnect() {
            try {
                if (connected) {
                    dis.close();
                    connected = false;
                }
            } catch (Exception e) {
                ;
            }
        }

        public void paint(Graphics g) { // used to set the image on the panel
            if (image != null)
                g.drawImage(image, 0, 0, this);
        }

        public void readStream() { // the basic method to continuously read the stream
            try {
                if (useMJPGStream) {
                    while (true) {
                        readMJPGStream();
                        parent.repaint();
                    }
                } else {
                    while (true) {
                        connect();
                        readJPG();
                        parent.repaint();
                        disconnect();
                    }
                }
            } catch (Exception e) {
                ;
            }
        }

        public void readMJPGStream() { // preprocess the mjpg stream to remove the mjpg encapsulation
            readLine(3, dis); // discard the first 3 lines
            readJPG();
            readLine(2, dis); // discard the last two lines
        }

        public void readJPG() { // read the embedded jpeg image
            try {
                JPEGImageDecoder decoder = JPEGCodec.createJPEGDecoder(dis);
                image = decoder.decodeAsBufferedImage();
            } catch (Exception e) {
                e.printStackTrace();
                disconnect();
            }
        }

        public void readLine(int n, DataInputStream dis) { // used to strip out the header lines
            for (int i = 0; i < n; i++) {
                readLine(dis);
            }
        }

        public void readLine(DataInputStream dis) {
            try {
                boolean end = false;
                String lineEnd = "\n"; // assumes that the end of the line is marked with this
                byte[] lineEndBytes = lineEnd.getBytes();
                byte[] byteBuf = new byte[lineEndBytes.length];
                while (!end) {
                    dis.read(byteBuf, 0, lineEndBytes.length);
                    String t = new String(byteBuf);
                    System.out.print(t); // uncomment if you want to see what the lines actually look like
                    if (t.equals(lineEnd))
                        end = true;
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        public void run() {
            System.out.println("in Run...................");
            connect();
            readStream();
        }

        @SuppressWarnings("deprecation")
        public static void main(String[] args) {
            JFrame jframe = new JFrame();
            jframe.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            AxisCamera1 axPanel = new AxisCamera1(jframe);
            new Thread(axPanel).start();
            jframe.getContentPane().add(axPanel);
            jframe.pack();
            jframe.show();
        }
        }

    Any suggestions what I am doing wrong here?
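    One likely culprit behind "Not a JPEG file: starts with 0x0d 0x0a" is that readLine(3, dis) / readLine(2, dis) assume a fixed number of header lines per MJPEG part, so if the camera's multipart framing differs, the decoder starts reading in the middle of the headers. A more robust sketch (an assumption, not a known fix for this particular camera: it presumes each part carries a Content-Length header, reuses the existing dis and image fields, and changes the method signature) reads the part headers until the blank line and then exactly Content-Length bytes of JPEG data:

        public void readMJPGStream() throws java.io.IOException {
            int contentLength = -1;
            String line;
            // Skip the part boundary and headers, remembering Content-Length on the way.
            while ((line = readHeaderLine(dis)) != null) {
                if (line.toLowerCase().startsWith("content-length:")) {
                    contentLength = Integer.parseInt(line.substring("content-length:".length()).trim());
                }
                if (line.length() == 0 && contentLength > 0) {
                    break; // blank line after the headers: the JPEG bytes follow
                }
            }
            if (contentLength > 0) {
                byte[] jpegBytes = new byte[contentLength];
                dis.readFully(jpegBytes); // read exactly one JPEG frame
                image = javax.imageio.ImageIO.read(new java.io.ByteArrayInputStream(jpegBytes));
            }
        }

        // Reads one CRLF-terminated header line; returns null at end of stream.
        private String readHeaderLine(DataInputStream in) throws java.io.IOException {
            StringBuilder sb = new StringBuilder();
            int c;
            while ((c = in.read()) != -1 && c != '\n') {
                if (c != '\r') {
                    sb.append((char) c);
                }
            }
            return (c == -1 && sb.length() == 0) ? null : sb.toString();
        }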

    Read the article

  • Trying to build automatic audio-conferencing capability into a WebApp

    - by Keller
    Hey all, I'm working with a team of relatively novice programmers, and we are trying to create a site that will have audio-conferencing capabilities such that whenever someone visits the page, they will immediately have audio-conferencing capabilities with everyone else on the page (5 people max). Can anyone point us in a general direction? Should we be looking into building a custom app, leveraging audio conferencing software, or trying to mimic a webex program? Would Adobe Stratus be useful in getting this kind of functionality? Does anyone have any ideas about how we would design something like this on a macro level? Sorry for the noobish question, but any guidance would be deeply appreciated. Thanks, Keller

    Read the article

  • Recording Audio through RTMP/Rails

    - by Lowgain
    I am in the process of building a rails/flex application which requires audio to be recorded and then stored in our amazon s3 account. I have found no alternative to using some form of RTMP server for recording audio through flash, but our hosting environment will not allow us to install anything like FMS, Red5, etc. Is there any existing Ruby/Rails RTMP solution that will allow audio recording? If not, is it possible for Rails to at least intercept the RTMP stream and then I can hope to reference red5 or something for parsing the data (long shot, I know)? The other alternative I can think of is hosting a red5 server on another host and communicating with our rails app once the saving/uploading is done, which is not preferred. Am I going to have any luck here?

    Read the article

  • help me pick the right iPhone audio class - MPMoviePlayer vs AVAudioPlayer vs MPMusicPlayer

    - by huevos de oro
    Does anyone know of a good tutorial on the distinction between MPMoviePlayer, AVAudioPlayer and MPMusicPlayer? I want to play audio from an MP3 file available at an external URL. Ideally it is played in an iPod-like audio view. I toyed with MPMoviePlayer, but it appears to be more suitable for video: when audio starts, a "movie playing" message displays, the controls disappear and a white QuickTime splash page displays. I would like the standard iPod audio controls to display all the time, and to customize the image behind them.

    Read the article

  • XMLStreamReader and a real stream

    - by Yuri Ushakov
    Update: there is no ready XML parser in the Java community which can do both NIO and XML parsing. This is the closest I found, and it's incomplete: http://wiki.fasterxml.com/AaltoHome I have the following code: InputStream input = ...; XMLInputFactory xmlInputFactory = XMLInputFactory.newInstance(); XMLStreamReader streamReader = xmlInputFactory.createXMLStreamReader(input, "UTF-8"); The question is, why does the method #createXMLStreamReader() expect to have an entire XML document in the input stream? Why is it called a "stream reader" if it can't seem to process a portion of XML data? For example, if I feed <root> <child> to it, it tells me I'm missing the closing tags, even before I begin iterating the stream reader itself. I suspect that I just don't know how to use an XMLStreamReader properly. I should be able to supply it with data in pieces, right? I need it because I'm processing an XML stream coming in from a network socket, and I don't want to load the whole source text into memory. Thank you for your help, Yuri.
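    For what it's worth, a plain StAX reader does pull events lazily from a blocking stream, so a partial document can be consumed as long as the underlying stream stays open; the "missing closing tags" error only surfaces once the stream actually ends. Below is a small self-contained sketch of that behaviour (how early the first events are delivered can vary with the StAX implementation's internal buffering, so treat it as an illustration rather than a guarantee):

        import java.io.PipedInputStream;
        import java.io.PipedOutputStream;
        import java.nio.charset.StandardCharsets;
        import javax.xml.stream.XMLInputFactory;
        import javax.xml.stream.XMLStreamConstants;
        import javax.xml.stream.XMLStreamReader;

        public final class PartialXmlDemo {
            public static void main(String[] args) throws Exception {
                PipedOutputStream out = new PipedOutputStream();
                PipedInputStream in = new PipedInputStream(out);

                // Writer thread: feed an incomplete document, then finish it a bit later.
                new Thread(() -> {
                    try {
                        out.write("<root><child>".getBytes(StandardCharsets.UTF_8));
                        out.flush();
                        Thread.sleep(5000); // more data arrives later
                        out.write("</child></root>".getBytes(StandardCharsets.UTF_8));
                        out.close();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }).start();

                XMLStreamReader reader =
                        XMLInputFactory.newInstance().createXMLStreamReader(in, "UTF-8");
                while (reader.hasNext()) {
                    // The START_ELEMENT events for <root> and <child> are typically
                    // delivered well before the closing tags have been written.
                    if (reader.next() == XMLStreamConstants.START_ELEMENT) {
                        System.out.println("start element: " + reader.getLocalName());
                    }
                }
                reader.close();
            }
        }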

    Read the article

  • Pros and cons of MPMoviePlayerController versus launching UIWebView to stream movie

    - by Nosredna
    I have a client who has video content for the web in Flash format. My task is to help them show the videos in an iPhone app. I realize that step one is to get these videos into the appropriate Quicktime format for the iPhone. Then I'm going to have to help the client figure out how or where to host these files. If that's tricky I assume they can be hosted at YouTube. My chief concern, though, is which approach to take to stream the video. What are the pros and cons of MPMoviePlayerController versus launching UIWebView with the URL of the stream? Is there any difference? Is one of them more or less forgiving? Is one of them a better user experience? Any gotchas I might expect to run into? I'm assuming playing video is pretty easy on the iPhone. Is it reasonable to try both and have one available as a fallback, or would that be a waste of time? I'm trying to schedule this out a bit, so I'd love to hear real-world experiences from anyone who's done this.

    Read the article

  • Collecting high-volume video viewing data

    - by DanK
    I want to add tracking to our Flash-based media player so that we can provide analytics that show what sections of videos are being watched (at the moment, we just register a view when a video starts playing) For example, if a viewer watches the first 30 seconds of a video and then clicks away to something else, we want the data to reflect that. Likewise, if someone watches the first 10 seconds, then scrubs the timeline to the last minute of the video and watches that, we want to register viewing on the parts watched and not the middle section. My first thought was to collect up the viewing data in the player and send it all to the server at the end of a viewing session. Unfortunately, Flash does not seem to have an event that you can hook into when a viewer clicks away from the page the movie is on (probably a good thing - it would be open to abuse) So, it looks like we're going to have to make regular requests to the server as the video is playing. This is obviously going to lead to a high volume of requests when there are large numbers of simultaneous viewers. The simple approach of dumping all these 'heartbeat' events from clients to a database feels like it will quickly become unmanageable so I'm wondering whether I should be taking an approach where viewing sessions are cached in memory and flushed to database when they become inactive (based on a timeout). That way, the data could be stored as time spans rather than individual heartbeats. So, to the question - what is the best way to approach dealing with this kind of high-volume viewing data? Are there any good existing architectures/patterns? Thanks, Dan.

    Read the article

  • JWPlayer plays videos at 3x speed on the first run, then works fine in subsequent runs on the same p

    - by Josiah Kiehl
    Go here: http://nano.materials.drexel.edu/research/videolibrary For some bizarre reason, the videos will play at 3x speed on the first run through, but then will play at normal speed each subsequent play. This doesn't happen all the time, and it's not always the same video(s) that do it. I'm utterly baffled. I've reconverted the videos from m4p to flv (using BitComet's converter) several times, double checking the settings each time through with no change to the behavior. Anyone have a clue what's going on?

    Read the article

  • Is there any live video stream editing open source project with API for my needs?

    - by Ole Jak
    I need an open source project with an API capable of: reading a live video stream (the stream codec can be anything the API can read - I can provide practically any live-streamable format); giving me the latest image data for some processing (like brightness/contrast or more exotic filtering); and receiving the data I've changed and streaming it on to some http://localhost:port/ in some format. I need it to be easily accessible from C# (even better, written in C#).

    Read the article

< Previous Page | 44 45 46 47 48 49 50 51 52 53 54 55  | Next Page >