Search Results

Search found 7178 results on 288 pages for 'audio playing'.

Page 52/288 | < Previous Page | 48 49 50 51 52 53 54 55 56 57 58 59  | Next Page >

  • Playing Ogg Sound in Android

    - by baba tenor
    In my application, I am trying to play a sound file in Ogg format, stored in the raw folder of my application's res directory. When I press the button that calls the function below, the app just freezes with the button pressed and does not respond. In the end, I have to terminate the application from Eclipse. There is nothing about an error or exception in Logcat. In debug mode, it enters the create() function and never comes back. What am I doing wrong?

        private void playbeep() {
            mPlayer = MediaPlayer.create(this, R.raw.beep);
            mPlayer.start();
            mPlayer.release();
        }
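
    One possible culprit (my assumption, not confirmed by the question's logs): release() is called immediately after start(), which frees the MediaPlayer while it is still supposed to be playing. A minimal sketch that defers the release until playback finishes, assuming the method lives in the same Activity:

        // import android.media.MediaPlayer; assumed at the top of the file
        private void playBeep() {
            MediaPlayer player = MediaPlayer.create(this, R.raw.beep);
            player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
                @Override
                public void onCompletion(MediaPlayer mp) {
                    mp.release(); // free the player only after the sound has finished
                }
            });
            player.start();
        }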

    Read the article

  • Playing craps, asking and printing

    - by Angelo Mejia
    How do I ask how many games of craps someone wants to play, and print the number of wins out of those n games? Also, how do I build a table in the main method using the methods I already have? The table should look like this, but filled in with the results (percent wins, one row per number of games, one column per experiment):

        Percent Wins
        Games      Experiment Number
                   1  2  3  4  5  6  7  8  9  10
        100
        1000
        10000
        100000
        1000000

        public class CrapsAnalysis {

            public static int rollDie(int n) {
                return (int) (Math.random() * n) + 1;
            }

            public static int rollDice() {
                int die1;
                int die2;
                die1 = rollDie(6);
                die2 = rollDie(6);
                return die1 + die2;
            }

            public static boolean playOneGame() {
                int newDice;          // repeated dice rolls
                int roll;             // first roll of the dice
                int playerPoint = 0;  // player point if no win or loss on first roll
                newDice = rollDice();
                roll = rollDice();
                if (roll == 7 || roll == 11)
                    return true;
                else if (roll == 2 || roll == 3 || roll == 12)
                    return false;
                else {
                    playerPoint = roll;
                    newDice = rollDice();
                    do {
                        newDice = rollDice();
                    } while (newDice != playerPoint && newDice != 7); // && here (the original || never terminates)
                    if (newDice == 7)
                        return false;
                    else
                        return true;
                }
            }

            public static int playGames(int n) {
                int numWon = 0;
                for (int i = 1; i <= n; i++)
                    if (playOneGame())
                        numWon++;
                return numWon;
            }

            public static void main(String[] args) {
                // Don't know how to ask for and print out the number of wins
                // for the n games played
            }
        }
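
    A minimal sketch of the missing main method (my own example; the prompt wording and the percent-wins formula are assumptions, and java.util.Scanner is used for console input):

        import java.util.Scanner; // goes at the top of CrapsAnalysis.java

        public static void main(String[] args) {
            Scanner in = new Scanner(System.in);
            System.out.print("How many games of craps do you want to play? ");
            int n = in.nextInt();

            int wins = playGames(n);            // run the simulation
            double percent = 100.0 * wins / n;  // percentage of games won

            System.out.println("Games played: " + n);
            System.out.println("Games won:    " + wins);
            System.out.printf("Percent wins: %.2f%%%n", percent);
        }

    The table from the question could then be produced by looping over the game counts (100, 1000, ..., 1000000) and, for each, calling playGames ten times and printing the ten percentages on one row.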

    Read the article

  • Playing with Strings

    - by HARSHITH
    Given a string and a non-empty word string, return a string made of each char just before and just after every appearance of the word in the string. Ignore cases where there is no char before or after the word, and a char may be included twice if it is between two words.

        wordEnds("abcXY123XYijk", "XY") → "c13i"
        wordEnds("XY123XY", "XY") → "13"
        wordEnds("XY1XY", "XY") → "11"
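
    A sketch of one straightforward implementation (my own, not taken from the question), walking the occurrences of the word with indexOf:

        public static String wordEnds(String str, String word) {
            StringBuilder sb = new StringBuilder();
            int i = str.indexOf(word);
            while (i != -1) {
                if (i > 0) sb.append(str.charAt(i - 1));                 // char just before the word
                int after = i + word.length();
                if (after < str.length()) sb.append(str.charAt(after));  // char just after the word
                i = str.indexOf(word, i + 1);                            // next occurrence
            }
            return sb.toString();
        }

    This reproduces the three examples above, including the doubled "1" in the last one, because a char sitting between two words is collected once as an "after" and once as a "before".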

    Read the article

  • ie8 playing funny with list-style-position: inside

    - by LeeR
    Ok, so the problem here: when using list-style-position:inside in IE8, the first line is indented but every line after that is not, so the wrapped lines appear under the bullet. That part is fine, but when I use a list with that CSS applied and an a tag inside the li, the text automatically gets pushed to the second line and the first line is empty. When I remove the a tag from the li, it jumps back up. Any idea why this might be? Is this a bug in the IE8 world, or do I just need to double-check my CSS? Any insights would be much appreciated. As asked, here is some code:

        <div id="sub_nav">
          <ul>
            ...
            <li><a class="active_page" href="#">Liposculpture</a>
              <ul>
                <li><a href="#">What is Liposculpture?</a></li>
                <li><a href="#">About Liposculpture surgery</a></li>
                <li><a href="#" class="active_sub">After Liposculpture surgery</a></li>
                <li><a href="#">Post Op Instructions</a></li>
                <li><a href="#">Liposculpture Side Effects</a></li>
                <li><a href="#">Liposuction Introduction to</a></li>
                <li><a href="#">Tumescent Liposculpture</a></li>
              </ul>
            </li>
            ...
          </ul>
        </div>

    For the CSS, I will try to show it as best I can:

        #sub_nav li { width: 200px; padding: 4px 0; border-bottom: 1px #CCC solid; }
        #sub_nav li a { text-decoration: none; color: #555; padding: 7px 15px 7px 15px; display: block; }
        #sub_nav li ul li { list-style-position: inside; list-style-type: disc; font: 11px Arial; padding-left: 15px; color: #FFF; border-bottom: none; }
        #sub_nav li ul li a { padding: 0; margin: 0; text-indent: 0; }

    Hope this helps.

    Read the article

  • Playing FLV unexpectedly stops, but why?

    - by Josamoto
    I am trying to play an FLV video that is about 90 MB in size, using the following snippet:

        private function playVideo(url:String):void {
            var customClient:Object = new Object();
            customClient.onMetaData = metaDataHandler;

            var nc:NetConnection = new NetConnection();
            nc.connect(null);

            var ns:NetStream = new NetStream(nc);
            ns.client = customClient;
            ns.play(url);

            var myVideo:Video = new Video();
            myVideo.attachNetStream(ns);
            addChild(myVideo);

            function metaDataHandler(infoObject:Object):void {
                myVideo.width = 640;
                myVideo.x = stage.stageWidth / 2 - 640 / 2;
                myVideo.height = 400;
                myVideo.y = 230;
            }
        }

    The video is not going to be streamed across the internet; in fact, the application will run locally and the videos will also be loaded locally from disc. When I start my application the video plays, but at random points playback completely freezes, without any errors being thrown at all. Does anybody have any idea why this might happen?

    Read the article

  • Continuously playing a SWF on a static website

    - by JCHASE11
    Hello. I have a music player SWF embedded on an HTML page. Is there any way to have the music play continuously, even as different HTML pages are loaded? When a link is clicked, the page is refreshed, which also restarts the SWF (and the music). If the site were AJAX-driven this wouldn't be a problem, but all my pages are static. I suppose I could put the entire body in an iframe, but there has to be a better option. I am certainly open to the idea of using AJAX here, but I do not have much AJAX experience. Any ideas?

    Read the article

  • Playing songs in HTML using a hyperlink

    - by Sachindra
    This is the code I'm using to play songs:

        <a href="http://southgreenvillecoc1.org/BeEncouragedByWhatYouKnow4-25-2010am.wma"
           style="text-decoration:none; color:#404040;" target="_blank">“Be Encouraged by What You Know”</a>

    What do I need to do to play the song? The link gets redirected to a different server. Is there anything I need to embed?

    Read the article

  • Playing a .wav sound file in JPanel/JFrame using javax (Swing)

    - by JavaIceCream
    I need a code example of how to use a file path on the hard drive to play a .wav sound file from a Swing GUI. I don't need it to show a play, pause, or stop button. I just want it to play when I select the 'Sound' option from the 'Files' menu in my window (I already know how to do that part, no need to explain it). So basically: how do I play a .wav sound file from a file path (e.g. c:/cake/thereisnone.wav) inside a JFrame, and how can I easily apply methods to that sound afterwards? Also, if anyone knows how to apply methods to a BufferedImage in a JFrame, that would be helpful too. Thank you very much everyone!
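
    A minimal sketch using javax.sound.sampled (the path below is just the hypothetical one from the question). The Clip plays asynchronously, and keeping the returned reference lets you call stop(), close(), or setFramePosition() on it later:

        import java.io.File;
        import javax.sound.sampled.AudioInputStream;
        import javax.sound.sampled.AudioSystem;
        import javax.sound.sampled.Clip;

        public class WavPlayer {
            // Opens the .wav file at the given path and starts playing it.
            public static Clip playWav(String path) throws Exception {
                AudioInputStream stream = AudioSystem.getAudioInputStream(new File(path));
                Clip clip = AudioSystem.getClip();
                clip.open(stream);
                clip.start();   // returns immediately; playback runs in the background
                return clip;
            }
        }

        // e.g. from a menu handler in the JFrame:
        // Clip sound = WavPlayer.playWav("c:/cake/thereisnone.wav");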

    Read the article

  • C++ online Role Playing Game (RPG)

    - by David
    So I've been learning C++ and SDL to make some basic 2D games. I want to create a game sort of like World of Warcraft, but a 2D version. I want it to be online and use a database or something to store data like the amount of gold, HP, etc. I was wondering, though: if I do this in SDL, would it still work online, or would the user have to download SDL themselves to play? I just want a game like this that I can play with some friends, just for learning purposes. I was also looking at DirectX, because pretty much everyone has that on Windows. Any help is appreciated, thanks!

    Read the article

  • Problem playing MP4 video in Safari

    - by Codiator
    I have a link to a video file on a server. When I open Safari through

        [[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"http://mydomain/category/2.mp4"]];

    the simulator pops up an alert saying "Cannot Play Movie. Server is not correctly configured". I also tried using the MediaPlayer framework (MPMoviePlayerController) and loading it with the URL "http://mydomain/category/2.mp4", but it didn't play either. When I paste the same URL into Safari on Mac OS X 10.5 Leopard, it plays smoothly. What may be the reason for this?

    Read the article

  • Playing with selects and javascript

    - by Tom
    My code is:

        <select name='main'>
          <option>Animals</option>
          <option>Food</option>
          <option>Cars</option>
        </select>
        <select name='other'>
          <option>Rats</option>
          <option>Cats</option>
          <option>Oranges</option>
          <option>Audi</option>
        </select>

    How do I filter my second select so it only shows the items I want? E.g. if I choose Animals, my select would be:

        <select name='other'>
          <option>Rats</option>
          <option>Cats</option>
        </select>

    and if I choose Food, my select would be:

        <select name='other'>
          <option>Oranges</option>
        </select>

    Well, I hope you get the idea. Thanks.

    Read the article

  • VideoView Not Playing, Error(1,-1)

    - by Jesse J
    I originally thought that the video format was wrong, but after trying .mov, .3gp, and .mp4 with H.264, I'm wondering if something is wrong with my code?

        public class IntroActivity extends Activity {
            VideoView videoHolder;

            /** Called when the activity is first created. */
            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
            }

            @Override
            protected void onStart() {
                getWindow().setFormat(PixelFormat.TRANSLUCENT);
                videoHolder = new VideoView(this);
                //videoHolder = (VideoView) findViewById(R.id.myvideoview);
                Uri video = Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.menu);
                videoHolder.setVideoURI(video);
                videoHolder.start();
                videoHolder.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
                    public void onCompletion(MediaPlayer mp) {
                        videoHolder.start();
                    }
                });
                setContentView(videoHolder);
                super.onStart();
            }
        }
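
    Error (1, -1) is a generic MediaPlayer failure, so the raw codes are worth logging. A small sketch of my own (not part of the question) that attaches an error listener to the same VideoView, assuming android.media.MediaPlayer and android.util.Log are imported:

        videoHolder.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                // "what" and "extra" narrow down container/codec problems
                Log.e("IntroActivity", "VideoView error: what=" + what + " extra=" + extra);
                return true; // true = handled, suppress the default error dialog
            }
        });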

    Read the article

  • How to send audio data from Java Applet to Rails controller

    - by cooldude
    Hi, I have to send audio data, in a byte array obtained by recording in a Java applet on the client side, to a Rails server controller in order to save it. What encoding parameters should be used on the applet side, and in what form should the audio data be sent (String or byte array) so that Rails receives the data correctly and I can then save it to a file? Currently the audio file created by the Rails controller does not play; mplayer reports the following error: LAVF_header: av_open_input_stream() failed. Here is the Java code:

        package networksocket;

        import java.util.logging.Level;
        import java.util.logging.Logger;
        import javax.swing.JApplet;
        import java.net.*;
        import java.io.*;
        import java.awt.event.*;
        import java.awt.*;
        import java.sql.*;
        import javax.swing.*;
        import javax.swing.border.*;
        import java.awt.*;
        import java.util.Properties;
        import javax.swing.plaf.basic.BasicSplitPaneUI.BasicHorizontalLayoutManager;
        import sun.awt.HorizBagLayout;
        import sun.awt.VerticalBagLayout;
        import sun.misc.BASE64Encoder;

        /**
         * @author mukand
         */
        public class Urlconnection extends JApplet implements ActionListener {

            // Initialization method that will be called after the applet is loaded into the browser.
            public BufferedInputStream in;
            public BufferedOutputStream out;
            public String line;
            public FileOutputStream file;
            public int bytesread;
            public int toread = 1024;
            byte b[] = new byte[toread];
            public String f = "FINISH";
            public String match;
            public File fileopen;
            public JTextArea jTextArea;
            public Button refreshButton;
            public HttpURLConnection urlConn;
            public URL url;
            OutputStreamWriter wr;
            BufferedReader rd;

            @Override
            public void init() {
                // TODO start asynchronous download of heavy resources
                //textField = new TextField("START");
                //getContentPane().add(textField);
                JPanel p = new JPanel();
                jTextArea = new JTextArea(1500, 1500);
                p.setLayout(new GridLayout(1, 1, 1, 1));
                p.add(new JLabel("Server Details"));
                p.add(jTextArea);
                Container content = getContentPane();
                content.setLayout(new GridBagLayout()); // used to center the panel
                content.add(p);
                jTextArea.setLineWrap(true);
                refreshButton = new java.awt.Button("Refresh");
                refreshButton.reshape(287, 49, 71, 23);
                refreshButton.setFont(new Font("Dialog", Font.PLAIN, 12));
                refreshButton.addActionListener(this);
                add(refreshButton);
                Properties properties = System.getProperties();
                properties.put("http.proxyHost", "netmon.iitb.ac.in");
                properties.put("http.proxyPort", "80");
            }

            @Override
            public void actionPerformed(ActionEvent e) {
                try {
                    url = new URL("http://localhost:3000/audio/audiorecieve");
                    urlConn = (HttpURLConnection) url.openConnection();
                    //String login = "mukandagarwal:rammstein$";
                    //String encodedLogin = new BASE64Encoder().encodeBuffer(login.getBytes());
                    //urlConn.setRequestProperty("Proxy-Authorization", login);
                    urlConn.setRequestMethod("POST");
                    //urlConn.setRequestProperty("Content-Type", "application/octet-stream");
                    //urlConn.setRequestProperty("Content-Type", "audio/mpeg");
                    //urlConn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
                    //urlConn.setRequestProperty("Content-Length", "" + Integer.toString(urlParameters.getBytes().length));
                    urlConn.setRequestProperty("Content-Language", "UTF-8");
                    urlConn.setDoOutput(true);
                    urlConn.setDoInput(true);
                    byte bread[] = new byte[2048];
                    int iread;
                    char c;
                    String data = URLEncoder.encode("key1", "UTF-8") + "=";
                    //String data = "key1=";
                    FileInputStream fileread = new FileInputStream("//home//mukand//Hellion.ogg"); //Dogs.mp3 //Desktop//mausam1.mp3
                    while ((iread = fileread.read(bread)) != -1) {
                        //data += (new String());
                        /*for (int i = 0; i < iread; i++) {
                            //c = (char) bread[i];
                            System.out.println(bread[i]);
                        }*/
                        data += URLEncoder.encode(new String(bread, iread), "UTF-8"); //new String(new String(bread)); //data += new String(bread, iread);
                    }
                    //urlConn.setRequestProperty("Content-Length", Integer.toString(data.getBytes().length));
                    System.out.println(data);
                    //data += URLEncoder.encode("mukand", "UTF-8");
                    //data += "&" + URLEncoder.encode("key2", "UTF-8") + "=" + URLEncoder.encode("value2", "UTF-8");
                    //data = "key1=";
                    wr = new OutputStreamWriter(urlConn.getOutputStream());
                    //if ((iread = fileread.read(bread)) != -1)
                    //    wr.write(bread, 0, iread);
                    wr.write(data);
                    wr.flush();
                    fileread.close();
                    jTextArea.append("Send");
                    // Get the response
                    rd = new BufferedReader(new InputStreamReader(urlConn.getInputStream()));
                    while ((line = rd.readLine()) != null) {
                        jTextArea.append(line);
                    }
                    wr.close();
                    rd.close();
                    //jTextArea.append("click");
                } catch (MalformedURLException ex) {
                    Logger.getLogger(Urlconnection.class.getName()).log(Level.SEVERE, null, ex);
                } catch (IOException ex) {
                    Logger.getLogger(Urlconnection.class.getName()).log(Level.SEVERE, null, ex);
                }
            }

            @Override
            public void start() { }

            @Override
            public void stop() { }

            @Override
            public void destroy() { }

            // TODO override start(), stop() and destroy() methods
        }

    Here is the Rails controller action for receiving the data:

        def audiorecieve
          puts "///////////////////////////////////////******RECIEVED*******////"
          puts params[:key1] #+ " " + params[:key2]
          data = params[:key1]
          #request.env('RAW_POST_DATA')
          file = File.new("audiodata.ogg", 'w')
          file.write(data)
          file.flush
          file.close
          puts "////**************DONE***********//////////////////////"
        end

    Please reply quickly.

    Read the article

  • Audio onprogress in Chrome not working

    - by user351709
    Hi, I am having a problem getting the onprogress event for the audio tag working in Chrome; it seems to work in Firefox. http://www.scottandrew.com/pub/html5audioplayer/ works in Chrome, but there is no progress bar update. When I copy the code, change the src to a .wav file, and run it in Firefox, it works perfectly.

        <style type="text/css">
          #content { clear:both; width:60%; }
          .player_control { float:left; margin-right:5px; height: 20px; }
          #player { height:22px; }
          #duration { width:400px; height:15px; border: 2px solid #50b; }
          #duration_background { width:400px; height:15px; background-color:#ddd; }
          #duration_bar { width:0px; height:13px; background-color:#bbd; }
          #loader { width:0px; height:2px; }
          .style1 { height: 35px; }
        </style>

        <script type="text/javascript">
          var audio_duration;
          var audio_player;

          function pageLoaded() {
            audio_player = $("#aplayer").get(0);
            //get the duration
            audio_duration = audio_player.duration;
            $('#totalTime').text(formatTimeSeconds(audio_player.duration));
            //set the volume
          }

          function update() {
            //get the duration of the player
            dur = audio_player.duration;
            time = audio_player.currentTime;
            fraction = time / dur;
            percent = (fraction * 100);
            wrapper = document.getElementById("duration_background");
            new_width = wrapper.offsetWidth * fraction;
            document.getElementById("duration_bar").style.width = new_width + "px";
            $('#currentTime').text(formatTimeSeconds(audio_player.currentTime));
            $('#totalTime').text(formatTimeSeconds(audio_player.duration));
          }

          function formatTimeSeconds(time) {
            var minutes = Math.floor(time / 60);
            var seconds = "0" + (Math.floor(time) - (minutes * 60)).toString();
            if (isNaN(minutes) || isNaN(seconds)) {
              return "0:00";
            }
            var Strseconds = seconds.substr(seconds.length - 2);
            return minutes + ":" + Strseconds;
          }

          function playClicked(element) {
            //get the state of the player
            if (audio_player.paused) {
              audio_player.play();
              newdisplay = "||";
            } else {
              audio_player.pause();
              newdisplay = ">";
            }
            $('#totalTime').text(formatTimeSeconds(audio_player.duration));
            element.value = newdisplay;
          }

          function trackEnded() {
            //reset the playControl to 'play'
            document.getElementById("playControl").value = ">";
          }

          function durationClicked(event) {
            //get the position of the event
            clientX = event.clientX;
            left = event.currentTarget.offsetLeft;
            clickoffset = clientX - left;
            percent = clickoffset / event.currentTarget.offsetWidth;
            duration_seek = percent * audio_duration;
            document.getElementById("aplayer").currentTime = duration_seek;
          }

          function Progress(evt) {
            $('#progress').val(Math.round(evt.loaded / evt.total * 100));
            var width = $('#duration_background').css('width')
            $('#loader').css('width', evt.loaded / evt.total * width.replace("px", ""));
          }

          function getPosition(name) {
            var obj = document.getElementById(name);
            var topValue = 0, leftValue = 0;
            while (obj) {
              leftValue += obj.offsetLeft;
              obj = obj.offsetParent;
            }
            finalvalue = leftValue;
            return finalvalue;
          }

          function SetValues() {
            var xPos = xMousePos;
            var divPos = getPosition("duration_background");
            var divWidth = xPos - divPos;
            var Totalwidth = $('#duration_background').css('width').replace("px", "")
            audio_player.currentTime = divWidth / Totalwidth * audio_duration;
            $('#duration_bar').css('width', divWidth);
          }
        </script>
        </head>

        <script type="text/javascript" src="js/MousePosition.js"></script>
        <body onLoad="pageLoaded();">
          <table>
            <tr>
              <td valign="bottom"><input id="playButton" type="button" onClick="playClicked(this);" value=">"/></td>
              <td colspan="2" class="style1" valign="bottom">
                <div id='player'>
                  <div id="duration" class='player_control'>
                    <div id="duration_background" onClick="SetValues();">
                      <div id="loader" style="background-color: #00FF00; width: 0px;"></div>
                      <div id="duration_bar" class="duration_bar"></div>
                    </div>
                  </div>
                </div>
              </td>
            </tr>
            <tr>
              <td> </td>
              <td><span id="currentTime">0:00</span></td>
              <td align="right"><span id="totalTime">0:00</span></td>
            </tr>
          </table>
          <audio id='aplayer' src='<%=getDownloadLink() %>' type="audio/ogg; codecs=vorbis"
                 onProgress="Progress(event);" onTimeUpdate="update();" onEnded="trackEnded();">
            <b>Your browser does not support the <code>audio</code> element.</b>
          </audio>
        </body>

    Read the article

  • Android RTSP coding problem

    - by NetApex
    I have Googled my butt off trying to find whether there is a surefire way to make RTSP work. I have a radio station that I listen to that streams via RTSP. Of course, by default Android doesn't want to play it. If I pop the URL into yourmuze.fm and create a station there, it lets me stream it to my phone. After checking how it works, I found that it streams to the phone via RTSP! So obviously something is amiss. What makes one stream work and another not? This is the stream I am attempting: rtsp://wms2.christiannetcast.com/yes-fm. It is an audio stream, so I would be thrilled to have most people's problem of "it only does audio and not video." When yourmuze.fm streams, DDMS states it brings up MovieView to play the audio, if that helps at all.
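
    For reference, a minimal sketch of my own that hands an RTSP URL straight to MediaPlayer (the URL is the one from the question; whether it actually plays still depends on the stream using a codec and transport Android supports, which is likely the real issue here):

        // import android.media.MediaPlayer; import android.util.Log; import java.io.IOException;
        try {
            MediaPlayer player = new MediaPlayer();
            player.setDataSource("rtsp://wms2.christiannetcast.com/yes-fm");
            player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mp) {
                    mp.start(); // start once the stream has been negotiated
                }
            });
            player.prepareAsync(); // prepare asynchronously for network sources
        } catch (IOException e) {
            Log.e("RtspTest", "Could not set RTSP data source", e);
        }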

    Read the article

  • Embedding wav files in AS3 Flash/Flex project?

    - by aaaidan
    The Flash IDE is capable of embedding many types of uncompressed sound files, including WAV, and offers optional compression when publishing. However, the [Embed] tag only seems to allow embedding of MP3 files. Is it truly impossible to embed an uncompressed WAV file, or am I missing some magic, undocumented mimeType? I was hoping for something like:

        [Embed source="../../audio/wibble.wav" mimeType="audio/wav"]

    ...but I get "no transcoder registered for mimeType 'audio/wav'". It's possible to embed a WAV (or another format) as an octet-stream and parse it at runtime, but that's pretty heavy-handed, I think. I'm surprised that even though the Flash IDE can embed uncompressed sound data, [Embed] cannot, given that the SWF spec can contain uncompressed sound data. Any takers?

    Read the article

  • Streaming server required with JW Player?

    - by Aaron
    Currently, a site I developed plays MP3 files directly in JW Player, using the file attribute and public URLs to the MP3 file. This is now an issue with the client for legal reasons, and they need to stream the audio files so that users can't open up their cache and nab the files directly after downloading. The JW Player site has a bunch of examples for streaming video, but nothing for audio. Is it possible to stream audio files with JW Player, and do we have to pay a lot of money for a streaming provider? Is it possible to do on the local PHP server?

    Read the article

  • How to do a sample rate conversion in Windows (and OSX)

    - by Paperflyer
    I am about to write an audio file converter for my side job at the university. As part of this, I need sample rate conversion. However, my professor said that it would be pretty hard to write a sample rate converter that is both of good quality and fast. In my research on the subject, I found some functions in the OS X CoreAudio framework that can do sample rate conversion (AudioConverter.h). After all, an OS has to have some facilities to do that for its own audio stack. Do you know of a similar method for C/C++ and Windows that is either part of the OS or open source? I am pretty sure that this functionality exists within DirectX Audio (XAudio2?), but I seem to be unable to find a reference to it in the MSDN library.
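
    For reference, the "easy but low quality" approach the professor is warning about is plain linear interpolation; a sketch of my own (written in Java only for brevity) shows the idea. A good converter replaces the interpolation with band-limited (e.g. windowed-sinc) filtering to avoid aliasing:

        // Resamples one channel of PCM data from srcRate to dstRate by linear interpolation.
        public static float[] resample(float[] in, int srcRate, int dstRate) {
            int outLen = (int) ((long) in.length * dstRate / srcRate);
            float[] out = new float[outLen];
            double step = (double) srcRate / dstRate; // source samples advanced per output sample
            for (int i = 0; i < outLen; i++) {
                double pos = i * step;
                int i0 = (int) pos;
                int i1 = Math.min(i0 + 1, in.length - 1);
                double frac = pos - i0;
                out[i] = (float) ((1.0 - frac) * in[i0] + frac * in[i1]);
            }
            return out;
        }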

    Read the article

  • Can anyone give me a sample DSP script in C/C++

    - by Andrew
    I'm working on an (audio) DSP project and am just wondering if there are any sample (open source) DSP examples written in C or C++ for my MSP430 chip. I just want something as a guideline so I can write my own program using the ADC and DAC on my board for sampling. This is my board, the MSP430F5438 Experimenter Board: http://focus.ti.com/docs/toolsw/folders/print/msp-exp430f5438.html — from what I heard it can run DSP code via the USB connection with the computer. I'm using CCS (Code Composer Studio, from TI) and Octave/MATLAB. Any DSP example code or sites that will help me create my own would be appreciated. What I'm trying to do: take a partial (sampled) audio track -- Nyquist-rate sampling -- over- and undersampling -- reconstruction of the audio track.

    Read the article

  • Determining the magnitude of a certain frequency on the iPhone

    - by eagle
    I'm wondering what's the easiest/best way to determine the magnitude of a given frequency in a sound. It's my understanding that an FFT function will return the magnitudes of all frequencies in a signal. I'm wondering if there is any shortcut I could use if I'm only concerned about one specific frequency. I'll be using the iPhone mic to record the audio. My guess is that I'll be using Audio Queue Services for recording, since I don't need to record the audio to a file. I'm using SDK 4.0, so I can use any of the functions defined in the Accelerate framework (e.g. FFT functions) if needed. Update: I updated the question to be clearer, as per Conrad's suggestion.
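
    Not mentioned in the question, but the usual shortcut for a single frequency bin is the Goertzel algorithm, which avoids computing a full FFT. A sketch (written in Java here purely to show the math; on the iPhone it would be a few lines of C over the recorded buffer):

        // Returns the magnitude of one frequency in a block of samples (Goertzel algorithm).
        public static double goertzelMagnitude(double[] samples, double targetFreq, double sampleRate) {
            double omega = 2.0 * Math.PI * targetFreq / sampleRate;
            double coeff = 2.0 * Math.cos(omega);
            double s1 = 0.0, s2 = 0.0;
            for (double x : samples) {
                double s0 = x + coeff * s1 - s2;
                s2 = s1;
                s1 = s0;
            }
            double power = s1 * s1 + s2 * s2 - coeff * s1 * s2; // squared magnitude of the bin
            return Math.sqrt(power);
        }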

    Read the article

  • feature extraction from acoustic signals

    - by Dolphin
    Hi everyone, it's been a while. I found APIs in Java for extracting features from acoustic audio files and from symbolic files, separately. But now I have a problem mapping from low-level WAV audio features to high-level MIDI features, i.e. I need to write the extracted WAV audio features out in MIDI format. I cannot think of anything even close to it. Can someone please provide me some insight into how I can approach this? I would greatly appreciate your responses. Thanks in advance.

    Read the article

  • How to stream a WAV file?

    - by jonasb
    I'm writing an app where I record audio and upload the audio file over the web. In order to speed up the upload I want to start uploading before I've finished recording. The file I'm creating is a WAV file. My plan was to use multiple data chunks. So instead of the normal encoding (RIFF, fmt , data) I’m using (RIFF, fmt , data, data, ..., data). The first issue is that the RIFF header wants the total length of the whole file, but that is of course not known when streaming the audio (I’m now using an arbitrary number). The other problem is that I'm not sure if it's valid since Audacity doesn't recognise the file, and Windows Media Player opens the file but plays only a very small part. I've been reading WAV specs but haven’t found an answer. Any suggestions?
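
    For reference, a sketch of my own (in Java) of the canonical 44-byte PCM WAV header. The two size fields a streaming writer cannot know up front are the RIFF chunk size at byte offset 4 and the data chunk size at offset 40; if they hold arbitrary values, players that trust them rather than reading to end of file will stop early or reject the file:

        import java.io.DataOutputStream;
        import java.io.IOException;
        import java.io.OutputStream;

        // Writes a standard PCM WAV header; dataLen is the size of the sample data in bytes.
        static void writeWavHeader(OutputStream os, int sampleRate, int channels,
                                   int bitsPerSample, int dataLen) throws IOException {
            DataOutputStream out = new DataOutputStream(os);
            int byteRate = sampleRate * channels * bitsPerSample / 8;
            out.writeBytes("RIFF");
            out.writeInt(Integer.reverseBytes(36 + dataLen));      // RIFF chunk size (little-endian)
            out.writeBytes("WAVE");
            out.writeBytes("fmt ");
            out.writeInt(Integer.reverseBytes(16));                // fmt chunk size
            out.writeShort(Short.reverseBytes((short) 1));         // audio format: PCM
            out.writeShort(Short.reverseBytes((short) channels));
            out.writeInt(Integer.reverseBytes(sampleRate));
            out.writeInt(Integer.reverseBytes(byteRate));
            out.writeShort(Short.reverseBytes((short) (channels * bitsPerSample / 8))); // block align
            out.writeShort(Short.reverseBytes((short) bitsPerSample));
            out.writeBytes("data");
            out.writeInt(Integer.reverseBytes(dataLen));           // data chunk size
        }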

    Read the article

  • How to use files/streams as source/sink in PulseAudio

    - by Nilesh
    I'm a PulseAudio noob, and I'm not sure I'm even using the correct terminology. I've seen that PulseAudio can perform echo cancellation, but it needs a source and a sink to filter from, plus a new source and sink. I can provide my mic and my audio out as the source and sink, right? Now, here's my situation: I have two video streams, say RTMP streams, or consider two FLV files. At any given moment, stream X is the input stream coming from another computer's webcam + mic, and stream Y is the output stream that I'm sending (coming from my computer's webcam + mic). Question: back to the first paragraph. Here's the thing: I don't want to use my mic and my audio out; instead, I want to use these two "input" and "output" streams as my source and sink, so to speak (I may use Xuggler to extract just the audio from X and Y). It may be a strange question, and I have my reasons for doing this strange thing: I need to experiment and verify the results to see.

    Read the article
