Search Results

Search found 7288 results on 292 pages for 'streaming media'.

Page 70/292

  • Loading a page into memory in Rails

    - by titaniumdecoy
    My Rails app produces XML when I load /reports/generate_report.xml. On a separate page, I want to read this XML into a variable and save it to the database. How can I do this? Can I somehow stream the response from the /reports/generate_report.xml URI into a variable, or is there a better way to do it, given that the XML is produced by the same web app?

    Read the article

  • Is there any live video stream editing open source project for my needs?

    - by Ole Jak
    I need an open source project with an API capable of reading a live video stream (the codec can be anything the API supports - I can provide practically any live-streamable format), giving me the latest image data for some processing (like brightness/contrast or more exotic filtering), able to accept the data I've changed, and then streaming that data out again on some http://localhost:port/ address in a suitable format. I need it to be easily accessible from C# (or, even better, written in C#).

    Read the article

  • getting started with libmms

    - by Vnuce
    Actually, the title explains it all... I want to read a stream, but have no idea where to start. I've searched the web for documentation, tutorials, whatever, with no luck. Any help using this library would be much appreciated.

    Read the article

  • How can I send a live video stream to a remote server from my phone?

    - by poc
    Hello, I have a problem streaming video to a server in real time from my phone. That is, I want my phone to act as an IP camera, so the server can watch live video from the phone. I have googled many solutions, but none of them solves my problem. I use MediaRecorder to record; it can save the video file to the SD card correctly. Then I referred to this page and used the following method:

        skt = new Socket(InetAddress.getByName(hostname), port);
        pfd = ParcelFileDescriptor.fromSocket(skt);
        mediaRecorder.setOutputFile(pfd.getFileDescriptor());

    Now it seems I can send the video stream while recording. However, I wrote a receiver-side program to receive the video stream from Android, and it doesn't work. Is there an error somewhere? I can receive the file, but I cannot open the video file. I guess the problem may be caused by the file format? Here is an outline of my code. On the Android side:

        Socket skt = new Socket(hostIP, port);
        ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(skt);
        ....
        mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mediaRecorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
        mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        mediaRecorder.setOutputFile(pfd.getFileDescriptor());
        .....
        mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
        mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
        .....
        mediaRecorder.start();

    On the receiver side (my ACER notebook):

        // anyway, I don't think the file extension matters here
        File video = new File(strDate + ".3gpp");
        FileOutputStream fos;
        try {
            fos = new FileOutputStream(video);
            byte[] data = new byte[1024];
            int count = -1;
            // fin is the InputStream obtained from the accepted socket connection
            while ((count = fin.read(data, 0, 1024)) != -1) {
                fos.write(data, 0, count);
                fos.flush();
            }
            fos.close();
            fin.close();
        } catch (IOException e) {
            e.printStackTrace();
        }

    I have been confused about this for a long time. Thanks in advance.
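    For reference, a minimal stand-alone receiver along the lines described above might look like the sketch below (the port number and output file name are placeholders). One common explanation for the unplayable result is that MediaRecorder's MPEG_4 output normally needs a seekable file: the container index (the moov atom) is written by seeking back when recording stops, so bytes captured from a socket are often not directly playable even when they arrive intact.

        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class VideoReceiver {
            public static void main(String[] args) throws IOException {
                int port = 5000;                          // placeholder port
                try (ServerSocket server = new ServerSocket(port);
                     Socket client = server.accept();     // the phone connects here
                     InputStream in = client.getInputStream();
                     FileOutputStream out = new FileOutputStream("capture.mp4")) {
                    byte[] buffer = new byte[8192];
                    int count;
                    while ((count = in.read(buffer)) != -1) {
                        out.write(buffer, 0, count);      // raw byte-for-byte copy to disk
                    }
                }
            }
        }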

    Read the article

  • JAVA - Download PDF file from Webserver

    - by Augusto Picciani
    I need to download a PDF file from a web server to my PC and save it locally. I used HttpClient to connect to the web server and get the content body:

        HttpEntity entity = response.getEntity();
        InputStream in = entity.getContent();
        String stream = CharStreams.toString(new InputStreamReader(in));
        int size = stream.length();
        System.out.println("html page length: " + stream.length());
        System.out.println(stream);
        SaveToFile(stream);

    Then I save the content to a file:

        // check CRLF (I don't know if I need to do this)
        String[] fix = stream.split("\r\n");
        File file = new File("C:\\Users\\augusto\\Desktop\\progetti web\\test\\test2.pdf");
        PrintWriter out = new PrintWriter(new FileWriter(file));
        for (int i = 0; i < fix.length; i++) {
            out.print(fix[i]);
            out.print("\n");
        }
        out.close();

    I also tried saving the String content to a file directly:

        OutputStream out = new FileOutputStream("pathPdfFile");
        out.write(stream.getBytes());
        out.close();

    But the result is always the same: I can open the PDF file, but I see white pages only. Is the mistake related to the charset encoding of the PDF's stream and endstream sections? Does the PDF content between stream and endstream need to be manipulated in some other way?
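    The symptoms above are typical of binary data being routed through a String: the PDF bytes are re-encoded by InputStreamReader and String.getBytes(), which corrupts them. A minimal sketch of a binary-safe copy, assuming Apache HttpClient 4.x (the URL and output path are illustrative), could look like this:

        import java.io.FileOutputStream;
        import java.io.InputStream;
        import java.io.OutputStream;

        import org.apache.http.HttpEntity;
        import org.apache.http.client.methods.CloseableHttpResponse;
        import org.apache.http.client.methods.HttpGet;
        import org.apache.http.impl.client.CloseableHttpClient;
        import org.apache.http.impl.client.HttpClients;

        public class PdfDownload {
            public static void main(String[] args) throws Exception {
                String url = "http://example.com/some.pdf";    // illustrative URL
                try (CloseableHttpClient client = HttpClients.createDefault();
                     CloseableHttpResponse response = client.execute(new HttpGet(url))) {
                    HttpEntity entity = response.getEntity();
                    try (InputStream in = entity.getContent();
                         OutputStream out = new FileOutputStream("test2.pdf")) {
                        byte[] buffer = new byte[8192];
                        int count;
                        while ((count = in.read(buffer)) != -1) {
                            out.write(buffer, 0, count);       // copy raw bytes, no String conversion
                        }
                    }
                }
            }
        }

    The same idea can be written even more compactly with entity.writeTo(out); the essential point is that the bytes never pass through a character encoding.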

    Read the article

  • Play Shoutcast MP3 radio stream with Python?

    - by Zachary Brown
    I have managed to create an online radio station using Shoutcast and Sam Broadcaster. Now I want to build my own player for that radio station. I am not sure where to begin; I have googled, but no luck. I am using Python 2.6 on Microsoft Windows. I have managed to capture the stream and save it as an MP3 on the hard disk; I'm just not sure what to do with it next. I tried playing back the file, but it always raises errors. This is the code I have so far:

        import urllib

        target = open("broadcast.mp3", "wb")
        conn = urllib.urlopen("http://78.159.104.175:80")
        while True:
            target.write(conn.read(5200))

    Any help would be greatly appreciated!

    Read the article

  • Is there any live video stream editing opensource project for my needs?

    - by Ole Jak
    So I need some open source project with an API capable of reading a live video stream (the codec can be anything the API supports - I can provide practically any live-streamable format), giving me the latest image data for some processing (like brightness/contrast or more exotic filtering), able to receive the data I've changed, and then streaming that data on to some http://localhost:port/ address in a suitable format. I need it to be easily accessible from C# (or, better yet, written in C#).

    Read the article

  • how to do p2p with flash? [closed]

    - by Female Gay
    Possible duplicates: "how to do p2p with flash?" and "Does Flash10 + p2p really work?" It's not difficult to tell what is being asked here. This question is not ambiguous, not vague, not incomplete, and not rhetorical, and it can be reasonably answered in its current form. So, good luck!

    Read the article

  • What is the most efficient way to store a mapping "key -> event stream"?

    - by jkff
    Suppose there are tens of thousands of keys, where each key corresponds to a stream of events. I'd like to support the following operations:

    push(key, timestamp, event) - pushes an event onto the event queue for key, marked with the given timestamp. It is guaranteed that event timestamps for a particular key are pushed in sorted or almost sorted order.

    tail(key, timestamp) - gets all events for key since the given timestamp. Usually the timestamp requests for a given key are almost monotonically increasing, almost in sync with the pushes for the same key.

    This has to be persistent (although it is not absolutely necessary to persist pushes immediately or to keep tails strictly in sync with pushes), so I'm going to use some kind of database. What is the optimal kind of database structure for this task? Would it be better to use a relational database, a key-value store, or something else?
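    As a purely in-memory illustration of the access pattern (it ignores the persistence requirement, and the class and type names are made up for the example), one sketch in Java could keep a sorted map of timestamp to events per key:

        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.ConcurrentHashMap;
        import java.util.concurrent.ConcurrentSkipListMap;

        // Hypothetical in-memory model of the push/tail operations described above.
        public class EventStreams<E> {
            private final ConcurrentHashMap<String, ConcurrentSkipListMap<Long, List<E>>> streams =
                    new ConcurrentHashMap<>();

            // push(key, timestamp, event): append an event under its timestamp
            public void push(String key, long timestamp, E event) {
                streams.computeIfAbsent(key, k -> new ConcurrentSkipListMap<>())
                       .computeIfAbsent(timestamp, t -> new ArrayList<>())
                       .add(event);    // note: the inner list is not thread-safe; single-writer sketch
            }

            // tail(key, timestamp): all events for the key at or after the given timestamp
            public List<E> tail(String key, long timestamp) {
                List<E> result = new ArrayList<>();
                ConcurrentSkipListMap<Long, List<E>> stream = streams.get(key);
                if (stream != null) {
                    stream.tailMap(timestamp, true).values().forEach(result::addAll);
                }
                return result;
            }
        }

    A persistent version would back the per-key sorted map with a database whose primary key is (key, timestamp), which is why a store with efficient ordered range scans tends to fit this workload.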

    Read the article

  • Java vs Flash for webcam access

    - by Alfredo Palhares
    I am going to build a video chat website, but coming from PHP and Python for the web, I have no experience with video streaming. What do you recommend, Java or Flash? Which is more flexible? I am even thinking of writing a C++ server application for stream control with a PHP frontend, since this is going to be a high-traffic website and performance is a must. Can you point me in some direction? Any documentation? Frameworks?

    Read the article

  • Record the timestamps of slide changes during a live Powerpoint presentation?

    - by StackedCrooked
    I am planning to implement a lecture capture solution. One of the requirements is to record both the presenter and the slideshow. The presenter is recorded with a video camera, obviously, and the slideshow will probably be captured using a tool like Camtasia. During playback three components are visible: the presenter, the slides and a table of contents. Clicking a chapter title in the TOC causes the video to navigate to the corresponding section. This means that a mapping must be made between chapter titles and their timestamps in the video. Usually a change of topic is accompanied by a slide change in the PowerPoint presentation, so the timestamps could be deduced from the slide changes. However, this requires me to detect slide changes during the live presentation, and I don't know how to do that. Does anyone here know how to detect slide changes? Is there a PowerPoint API where I can connect event handlers or something like that? I'd greatly appreciate your help!

    Edit: This issue is no longer relevant for my current work, so this question will not be updated by me. However, you are still free to help others by posting your answers and insights here.

    Read the article

  • How to build a video broadcaster that can handle more than 20,000 viewers

    - by Gaurav Srivastava
    I want to broadcast video from a webcam over the internet. The problem is that the video will be viewed live by more than 20,000 people (expected). I have only a little experience with Red5 broadcasting: I did some broadcasting using Red5 and Flash, and it works fine for one or two viewers, i.e. it is great for personal chat or video conferencing applications. But as the number of viewers increases, the delay in the broadcast also increases; I am seeing an additional delay of about 0.5 seconds for every new user who joins the broadcast. Can anyone suggest better technologies with which I can build this live broadcast? I don't want to use http://www.ustream.com; I want to create such a tool of my own, but that is always the last resort.

    Read the article

  • Adding a red5 app in a multiuser website

    - by Zakaria
    Hi everybody, I have an MVC PHP website where users can publish their public information: http://www.example.com/foobar/profile. Besides this project, and based on some Red5 samples, I have an application (built with Flex) that sends audio: rtmp://server/sendAudio (very basic, but it works). For each subscriber of my website I want to create an admin area where they can send an audio stream: http://admin.example.com/foobar. And when someone visits their public profile, they can listen to the streamed audio: http://www.example.com/foobar/profile. How can I use my Red5/Flash app dynamically with my PHP website so that each of my users can broadcast their own channel? Do you have some experience to share? Thank you, Regards.

    Read the article

  • Corba sequence<octet> a lot slower than using a socket

    - by Totonga
    I have a CORBA-related question. In my Java app I use

        typedef sequence<octet> Data;

    Now I played around with this Data vector. If I read the CORBA specification correctly, sequence<octet> will be converted to either xs:base64Binary or xs:hexBinary. It should be an opaque type, so it should not use any marshalling. I tried different IDL styles:

        void Get(out Data d);
        Data Get();

    but what I see is that moving the data using CORBA is a lot slower than using a socket directly. I am fine with a little overhead, but it looks to me as if the data is still being marshalled. Do I need to somehow configure my ORB to suppress the marshalling, or did I miss something?

    Read the article

  • How to stream semi-live audio over internet

    - by Thomas Tempelmann
    I want to write something like Skype: I have a constant audio stream on one computer, and I want to recompress it in a format that's suitable for a latent internet connection, receive it on the other end, and play it. Let's also assume that the internet connection is fairly modern and fast, i.e. DSL or better, with no slow connections over phone lines and such. The computers involved will also be rather modern (dual-core Intel CPUs at 2 GHz or more). I know how to handle the audio on the machines; what I don't know is how to transmit the audio in an efficient way. The challenges are:

    I'd like to get good audio quality across the line.

    The stream should be received without drops. It may, however, be received with a little delay (a one-second delay is acceptable). I imagine that the transport software could first determine the average (and maximum) latency, then start the stream and tell the receiver to wait for that maximum latency before starting to play the audio. With that, if the latency doesn't get any higher, the entire stream will be playable on the other side without stutter or drops.

    If, due to unexpected IP latencies or blockages, the stream does get cut off, I want to be able to notice this so that I can take action (e.g. abort the stream) and eventually start a new transmission.

    What are my options if I want to use ready-made software for the compression and transmission? I have no intention of writing my own audio compression engine, really. On the other hand, I plan to sell the solution in a vertical market, meaning I can afford a few dollars of license fees per copy, but not hundreds.

    I guess the simplest solution would be to open a TCP stream, send a few packets back and forth to determine their round-trip time (or even use UDP for that), use the result as the guide for my maximum-latency value, and then simply send the audio data in its raw form (uncompressed 16-bit stereo), along with a timing code, over the TCP connection. The receiver reads the data and plays it with the pre-determined delay. That might just work with the type of fast connection I expect. I just wonder if there are better solutions for reaching this goal, with better performance (lower latency) and less data (compression).

    BTW, I first plan to implement this on OS X, but might want to do it on Windows, too, if it proves successful.
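    As a rough illustration of the "simplest solution" sketched in the last paragraph - raw 16-bit stereo PCM frames, each prefixed with a timing code, pushed over a TCP socket - a minimal Java sender might look like the following. The host, port and capture setup are placeholders, and real code would also implement the latency probe and the receiver's playback delay described above:

        import java.io.DataOutputStream;
        import java.net.Socket;
        import javax.sound.sampled.AudioFormat;
        import javax.sound.sampled.AudioSystem;
        import javax.sound.sampled.TargetDataLine;

        public class RawAudioSender {
            public static void main(String[] args) throws Exception {
                // 16-bit stereo PCM at 44.1 kHz, as described above
                AudioFormat format = new AudioFormat(44100f, 16, 2, true, false);
                TargetDataLine mic = AudioSystem.getTargetDataLine(format);
                mic.open(format);
                mic.start();

                try (Socket socket = new Socket("receiver.example.com", 9000); // placeholder host/port
                     DataOutputStream out = new DataOutputStream(socket.getOutputStream())) {
                    byte[] frame = new byte[4410 * 4];             // roughly 100 ms of stereo 16-bit audio
                    while (true) {
                        int read = mic.read(frame, 0, frame.length);
                        out.writeLong(System.currentTimeMillis()); // timing code for this frame
                        out.writeInt(read);                        // frame length in bytes
                        out.write(frame, 0, read);                 // raw PCM payload
                    }
                }
            }
        }

    The receiver would read the same framing, buffer for the pre-determined maximum latency, and feed the PCM data to its audio output; dropping in a codec later would only change what goes into the payload.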

    Read the article

  • Stopping/Removing an embedded player.

    - by Rajat
    Hi, I am working on a web page where I have to include an embedded video. The video is hosted on another domain. I am able to embed the video and have it autoplay once the web page is loaded. However, I have a requirement to remove the div displaying the video once it has finished playing, and to display some text in its place. The problem is that I can make it autoplay via the autostart variable in the embed tag, but how do I know that the video has ended? The hosting company only provides an embed tag; they do not have any player APIs to use. One way (or rather a workaround) I can think of is to start an event listener in the background, watch for the total running time of the video, and remove the content when that time is reached. But the problem is that if the user pauses the video, the div would still be deleted. I am new to Flash. Are there some standard variables or actions that we can pass as flashvars to a SWF file to stop a running player or to find out the state of the player? (Note: we only get an embed tag from the video hosting site, so we do not own that code, and they do not have much documentation to help me out.) Thanks for your help.

    Read the article

  • How do you stream M4V video on the web without using Flash?

    - by Alex
    I'm building a web site that needs to stream video and be friendly to handheld devices (especially the iPhone). Some handhelds don't support Flash, so I'm avoiding the use of a Flash player. How does YouTube stream its videos so that they play on both desktops and iPhones? I'm looking for a player, or multiple players, which can somehow be activated based on the user's device. Your help and guidance are much appreciated. Thanks.

    Read the article
