Search Results

Search found 5304 results on 213 pages for 'audio streaming'.

Page 60/213 | < Previous Page | 56 57 58 59 60 61 62 63 64 65 66 67  | Next Page >

  • How to make VLC play .vlm config file in "With no interface mode"?

    - by Ole Jak
    How do I make VLC play a .vlm config file in "with no interface" mode on Windows? I have a .vlm config file that should stream audio from the mic to localhost, so no VLC UI is needed. If I tell Windows to "play this .vlm file with VLC", it works correctly: it starts the server where I need it and streams the data. But how do I do the same thing manually from cmd (assume we can call vlc.exe as vlc and that we are in the folder containing vlc.exe and vlcConfig.vlm)?
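
    A minimal sketch of what such a command line might look like, assuming vlc.exe is on the PATH (or in the current directory) and vlcConfig.vlm sits next to it; the dummy interface suppresses the UI and --vlm-conf loads the VLM configuration at startup:

        vlc -I dummy --vlm-conf vlcConfig.vlm

    On Windows, adding --dummy-quiet should also keep the dummy interface from opening a console window, though the exact behaviour may vary between VLC versions.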

    Read the article

  • Converting audio files (.3gp) to video with album cover and uploading to YouTube

    - by Samuh
    I have an audio file in .3gp format on my Android device which I wish to upload to YouTube. I know that YouTube is a video upload site and that I need to convert this sound file to a video. I just want a single image to be displayed the whole time the audio is playing. Google tells me there are a number of tools that can help, but I want to do this from Java code on my Android device. Please help. Thanks.
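
    For reference, this is the kind of conversion the desktop tools typically perform; a hedged sketch with ffmpeg, assuming an ffmpeg build is available (on Android this usually means bundling an ffmpeg binary or using a Java/JNI wrapper library; cover.jpg and the file names are placeholders):

        ffmpeg -loop 1 -i cover.jpg -i input.3gp -c:v libx264 -c:a aac -shortest output.mp4

    Here -loop 1 repeats the still image for the duration of the audio, and -shortest ends the video when the audio ends.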

    Read the article

  • Windows 7 sound control is not flexible.

    - by jon
    I would like to be able to listen to music on both of two audio output devices, but Windows 7 seems to only allow me to select one or the other as the Default device. When device A is the Default device, device B is muted; and vice versa. This seems to be stunningly inflexible. Since Windows 7 is unable to do this, can anyone recommend any add-on software that would control the hardware more flexibly and thoughtfully?

    Read the article

  • Audio Player Royalty Free Music (dynamic audio)?

    - by Surya sasidhar
    Hi, I am using the Royalty Free Music player for playing audio. It plays fine, but I need to play files dynamically; that is, the audio path will come from a database. How can I write the code for that? This is the Royalty Free Music player code:

        var so = new SWFObject("playerSingle.swf", "mymovie", "192", "67", "7", "#FFFFFF");
        so.addVariable("autoPlay", "yes");
        so.addVariable("soundPath", "song.mp3");
        so.addVariable("overColor", "#000044");
        so.addVariable("playerSkin", "1");
        so.write("flashPlayer");

    The code above sits inside a script tag in the page source. How can I pass dynamic audio files to it? Thank you. This is the link to the player: http://www.premiumbeat.com/flash_resources/free_flash_music_player/single_track_flash_mp3_player.php
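
    A minimal sketch of one way the sound path could be made dynamic, assuming the page emits the file URL from the database into a JavaScript variable before the player code runs (audioUrl and the example path are hypothetical placeholders):

        var audioUrl = "/uploads/123.mp3"; // value written out by the server-side code from the database
        var so = new SWFObject("playerSingle.swf", "mymovie", "192", "67", "7", "#FFFFFF");
        so.addVariable("autoPlay", "yes");
        so.addVariable("soundPath", audioUrl);
        so.write("flashPlayer");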

    Read the article

  • Use an iPhone as a Bluetooth headset for a Mac?

    - by Phillip Oldham
    Is there any way, such as an iPhone app, to connect my iPhone to my iMac via Bluetooth so that the iMac pushes all audio through the iPhone? Specifically, what I'm looking to do is watch movies on my iMac with the sound played through my iPhone and, in turn, the earbuds.

    Read the article

  • Comparing pitches with digital audio

    - by user2250569
    I am working on an application that will compare musical notes with digital audio. My first idea was to analyze a WAV file (or sound in real time) with a polyphonic pitch-detection algorithm, extract the notes and chords, and then compare them with the notes in a dataset. I went through a lot of pages and it seems to be a lot of hard work, because the existing implementations and algorithms mainly (or only) focus on monophonic sound. Now I have the idea to do this the opposite way. In the dataset I have, for example, the note A4, or a better example, the chord A4 B4 H4. My idea is to synthesize some waveform (or whatever, I don't know what) from this note or chord and then compare it with the piece of digital audio. Is this a good idea? Is it an easier or a harder solution? If it is workable, can you recommend how to do it?
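
    For the synthesis step, the note-to-frequency mapping is the standard equal-temperament formula: a note n semitones above A4 (440 Hz) has frequency f = 440 × 2^(n/12), so for example B4 = 440 × 2^(2/12) ≈ 493.9 Hz, and a chord can be approximated by summing sine waves at the frequencies of its component notes.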

    Read the article

  • Running a small IPTV station

    - by nixterrimus
    I'm looking to run an IPTV station for my dorm. I know I can serve multicast, so that's not a problem. The station will serve podcasts and other CC-licensed content. The target endpoint is XBMC, a media center. So far I know that I need to serve an RTP stream over UDP carrying MPEG-4 AVC video (main or high profile) with AAC (or AC3?) audio. I've had some luck using VLC with VLM to stream, but it seems limited. What are my other options? Everything has to run on Linux, hopefully open source. How can I use playlists rather than live streams? What are my software options?
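
    As a rough illustration of the playlist case, a hedged VLC one-liner that transcodes a playlist to H.264/AAC and sends it as an RTP/TS multicast stream (the multicast address, port and playlist path are placeholders, and the transcode options would need tuning):

        cvlc --loop playlist.m3u --sout '#transcode{vcodec=h264,acodec=mp4a,ab=128}:rtp{dst=239.255.12.42,port=5004,mux=ts}'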

    Read the article

  • How to get better video quality in Lync?

    - by sinned
    I want to use a ConferenceCam from Logitech to stream talks live via Lync. When I view the RAW webcam image via VLC, the quality is very good (but the latency is high because of buffering). However, when I stream it using Lync, the video gets blurry. Is there a way to ensure QoS in Lync or otherwise improve the video quality to (near-)native? I would rather have some dropped frames than a lower resolution where I can't read the slides. In my setup, I use Lync with an Office365-E3 contract, so I have no Lync-Server in my network. I thought about replacing Lync completely with VLC, but I first want to try Lync because VLC will probably cause firewall issues. Also, I haven't looked up the VLC parameters for less buffering, faster encoding, a bit lower resolution (natively it's more than HD) and streaming.

    Read the article

  • Using MythTV to Stream Satellite Signal to multiple users

    - by Ammar
    I'm planning on setting up a small media center to achieve the following: I have multiple users who want to watch satellite channels; I want them to be able to change the channel remotely; I'm going to buy an external DVB tuner to capture the signal; I'll use some software to stream whichever channel is selected; I want two different channels available at the same time, so I assume I will need two DVB cards; and users will use VLC, Windows Media Player or whatever to view the channels. How can I achieve this? I heard MythTV can do it? I tried to do some research but couldn't find enough information. Note: there's no copyright issue here; I'm streaming Free-To-Air (FTA) channels.

    Read the article

  • Is there a way to browse a media server (MediaTomb) with a media player (VLC)?

    - by twig
    I currently have an EEE PC set up with Linux Mint as my media server, using MediaTomb. I use VLC as my media player on another Windows computer to watch videos off the media server. It works fine, but the current process is: open a browser and navigate to the folder, find the file I want to play and copy its URL, then paste the URL into VLC and watch. This is fine for me on the PC, but it is a little troublesome for my parents to grasp (or for me to use on the phone). Ideally I'd like to: open VLC, browse to the file (from within VLC), and click/select it to play. If there is any solution similar to this, please let me know. I'm willing to change the software on both server and client to accommodate it (although it somewhat depends on which formats the server supports). Side note: I've tried searching online for this but I find a lot of jargon such as "media server/centre", media streaming, DLNA, UPnP, and feel that some people are either using the terms interchangeably or incorrectly.

    Read the article

  • Proven and Scalable Comet Server

    - by demetriusnunes
    What is the most proven, scalable Comet server solution out there that can handle up to 100,000 real-life connections per node using HTTP streaming (not long-poll)? It must be free, preferably an open-source project. We've already tried Meteor (Perl), with no success: Meteor was able to scale only up to 20,000 connections per node. We are looking right now at these options: APE (C++), Orbited (Python), Grizzly (GlassFish), Cometd (Jetty). Any big success stories with any of these?

    Read the article

  • The specified module (mod_h264_streaming) could not be found (Apache2)?

    - by rphello101
    I'm trying to get mod_h264_streaming to work with my Apache2 server. I downloaded a precompiled version of the module from here. I read here that all I have to do is extract the file to my modules folder, which I did, and add the following to httpd.conf, which I also did:

        LoadModule h264_streaming_module modules/mod_h264_streaming.so
        AddHandler h264-streaming.extensions .mp4

    However, I get this error when I restart Apache:

        Syntax error on line 173 of C:/Program Files (x86)/Apache Group/Apache2/conf/httpd.conf:
        Cannot load C:/Program Files (x86)/Apache Group/Apache2/modules/mod_h264_streaming.so into server: The specified module could not be found.
        Note the errors or messages above, and press the <ESC> key to exit. 26...

    Even though the file exists right here: C:\Program Files (x86)\Apache Group\Apache2\modules\mod_h264_streaming.so. Can anyone tell me what I'm doing wrong?
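
    On Windows, "The specified module could not be found" often refers to a DLL that the module itself depends on rather than to the module file, so the path may not be the real problem; still, as a hedged first check, referencing the module with its full quoted path is valid httpd.conf syntax (a sketch, not a guaranteed fix):

        LoadModule h264_streaming_module "C:/Program Files (x86)/Apache Group/Apache2/modules/mod_h264_streaming.so"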

    Read the article

  • Download and Convert live stream video

    - by IcySnow
    I want to download live streaming video from some websites, and it seems Internet Download Manager can handle the job. However, the video I want is just a small part of the live stream, and the stream itself never stops; hence IDM will just keep downloading all day and night if I don't stop it myself. The problem is that the downloaded file (stored in the temporary folder) has a .stream extension. Media Player Classic can open it perfectly, but it would be very inconvenient to keep such an extension since I don't think I can carry it around and play it on another computer. I tried some video converters but all of them failed because the format is not supported. So my questions are: Is there a program specially made to download live RTMP video? IDM works, but the format is inconvenient. How can I convert a .stream file to another container, say AVI?
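
    A hedged sketch of one way such a capture might be converted, assuming the .stream file is really just a raw FLV or MPEG-TS capture that ffmpeg can probe (file names are placeholders); the first command remuxes without re-encoding, and the second re-encodes if the straight copy fails:

        ffmpeg -i capture.stream -c copy output.mp4
        ffmpeg -i capture.stream -c:v libx264 -c:a aac output.mp4

    If an AVI container is specifically needed, the same re-encode can target output.avi (with an AVI-friendly audio codec such as MP3), but what works depends on what is actually inside the capture.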

    Read the article

  • Video Hosting External or Internal?

    - by user69334
    I have a client who wants to offer videos for downloading or streaming on his website. We already provide hosting for him, which is wonderful: it's reliable and fast and offers unlimited space and databases, BUT it only offers 10 GB of bandwidth. The videos could easily be placed on his server (unlimited space), but the bandwidth is a real problem. Now I am wondering: if he purchased external hosting for his videos and people wanted to download them, would this still eat up all his available bandwidth in no time because the download requests go via his site, or is there a way to circumvent this?

    Read the article

  • Lagging digital TV over Ethernet

    - by Steve
    I have an HDHomeRun TV-over-Ethernet device, which connects the aerial to my router, and from there the router connects to my PC over about 15 m of 100 Mbps Ethernet cable. The TV output lags every second. It does not do so on a computer much closer to the router. It seems odd to me that the network rate is around 7 Mbps on a 100 Mbps cable. I am not downloading or streaming anything else on the affected computer. Is this lag caused by the speed of the cable, the length of the cable, or interference on the cable? I am considering swapping the Ethernet cable for shielded Ethernet cable.

    Read the article

  • Windows app that can stream video over local net with remote browsing

    - by Nifle
    What I want: an app that runs on a Windows 7 computer and can stream movies to the other computers in the house. I want to be able to browse the movies on my "server" from the other computers and start streaming. I'm aware that VLC can do some (all?) of this, but it looks too complicated for my 6-year-old to use. Ideally I'd like something like "Air Video Server", but for Mac/Windows computers, with a native client for browsing the movies on the PC running as the server.

    Read the article

  • Content delivery: Alternatives to SHOUTcast

    - by polemon
    I've been using Icecast and SHOUTcast for several years now to deliver audio and video content. I wonder what alternatives I have to those two, especially when streaming video. On the client side, what software can I choose from to stream live to those servers? Making the streams available with Flash would be great, but that's maybe another story. We're still using EdCast, which is kind of dead by now; the SHOUTcast DSP plugin is not an option, as it tends to crash, etc. Are there any alternatives to that when live-streaming video? I'm using Liquidsoap for content generation, mixing, etc. It should work with Icecast, Liquidsoap and, if possible, SHOUTcast.

    Read the article

  • ffmpeg error while segmenting

    - by Tommy Ng
    I'm using ffmpeg and segmenter on Ubuntu 10.04 to create a transport stream from FLV/H.264 video files and then segment the TS for iPad streaming. Some TS files produce an error with segmenter:

        Output #0, mpegts, to '29':
            Stream #0.0: Video: 0x0000, yuv420p, 480x360, q=2-31, 90k tbn, 25 tbc
            Stream #0.1: Audio: 0x0000, 0 channels, s16
        [mpegts @ 0x11f4ac0] sample rate not set
        Could not write mpegts header to first output file

    My ffmpeg command for creating the TS file:

        ffmpeg -i 1.flv -f mpegts -acodec libfaac -ar 48000 -ab 64k -s 480x360 -vcodec libx264 -b 192k -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate 192k -bufsize 192k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 480:360 -g 30 -async 2 -y 1.ts

    My segmenter command:

        segmenter 1.ts 10 1 1.m3u8 path/to/streams/

    Read the article

  • How to host my own cloud so that videos are viewable via desktop web browser?

    - by jake9115
    I want to host my own cloud storage solution, something like Dropbox but entirely dependent on my own central machine. This way things are more secure if set up correctly, and there are no artificial storage limitations or pay-walls. Something similar to ownCloud: http://owncloud.org/ There is one important feature I want to have: the ability to stream movies in a web browser, from my personal cloud to anywhere in the world. In the past I tried this with a NAS: I mapped XBMC to the NAS via SFTP, and certain media types could stream in this manner. I've also used things like Plex. In this case, I am looking for a single solution for personal cloud storage and movie streaming from that cloud into a web browser. Does anyone know if this can be accomplished? Thanks for the suggestions!

    Read the article

  • How to synchronize tasks on multiple computers

    - by SysGen
    I would like to be able to run similar tasks on several computers that must be precisely synchronized. More specifically, I need 4 laptops to be synced, probably over the local network, and I need to use one of them to start a task (play a video) on all of them at the same time (different video files on different laptops). All of them are running Windows. Is there third-party software, or any easier way, to do this over the LAN without significant delay? I need virtually no response time: if a single video were streaming on all the laptops, people should not notice the delay.

    Read the article

  • What video format(s) should be used to serve Macs, PCs, and Mobile Devices?

    - by Jeffrey Blake
    In 2007, I started a site based on streaming and downloading poker strategy videos. At that point in time, the best solution I came up with for supporting users of Macs and PCs was to provide the videos in both WMV and FLV formats. Later we added an M4V version to support iPhones/iPods. Obviously, things have changed a bit since then. I would like to revisit our format decision to see if there is anything better we could offer, preferably with wider support across all devices (so that we can reduce the number of formats offered, if possible). Is FLV + WMV + M4V the best solution? Is there something else we should consider? What about Android devices?

    Read the article

  • VLC Caching levels

    - by Svish
    When I open the Preferences of VLC and go to Input & Codecs, there is a setting called Default Caching Level. I can choose between: Custom, Lowest latency, Low latency, Normal, High latency, Higher latency. I'm used to caching being set in seconds or something like that, so that more seconds (a bigger buffer) means less chance of a buffer underrun while streaming. What is latency here? What does it mean to set it lower or higher? In which cases should I go in which direction? If I'm struggling with buffer underruns, should I set it to lower or higher latency?
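
    As a point of reference, on the VLC command line the same knob is exposed directly in milliseconds of buffering rather than as named latency levels; a hedged sketch (the stream URL is a placeholder and 1000 ms is just an example value):

        vlc --network-caching=1000 http://example.com/stream

    A higher value buffers more data before playback starts, which trades extra delay for fewer underruns.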

    Read the article

  • Play and record streaming audio

    - by Igor
    I'm working on an iPhone app that should be able to play and record audio streaming data simultaneously. Is it actually possible? I'm trying to mix SpeakHere and AudioRecorder samples and getting an empty file with no audio data... Here is my .m code: import "AzRadioViewController.h" @implementation azRadioViewController static const CFOptionFlags kNetworkEvents = kCFStreamEventOpenCompleted | kCFStreamEventHasBytesAvailable | kCFStreamEventEndEncountered | kCFStreamEventErrorOccurred; void MyAudioQueueOutputCallback( void* inClientData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer, const AudioTimeStamp inStartTime, UInt32 inNumberPacketDescriptions, const AudioStreamPacketDescription inPacketDesc ) { NSLog(@"start MyAudioQueueOutputCallback"); MyData* myData = (MyData*)inClientData; NSLog(@"--- %i", inNumberPacketDescriptions); if(inNumberPacketDescriptions == 0 && myData-dataFormat.mBytesPerPacket != 0) { inNumberPacketDescriptions = inBuffer-mAudioDataByteSize / myData-dataFormat.mBytesPerPacket; } OSStatus status = AudioFileWritePackets(myData-audioFile, FALSE, inBuffer-mAudioDataByteSize, inPacketDesc, myData-currentPacket, &inNumberPacketDescriptions, inBuffer-mAudioData); if(status == 0) { myData-currentPacket += inNumberPacketDescriptions; } NSLog(@"status:%i curpac:%i pcdesct: %i", status, myData-currentPacket, inNumberPacketDescriptions); unsigned int bufIndex = MyFindQueueBuffer(myData, inBuffer); pthread_mutex_lock(&myData-mutex); myData-inuse[bufIndex] = false; pthread_cond_signal(&myData-cond); pthread_mutex_unlock(&myData-mutex); } OSStatus StartQueueIfNeeded(MyData* myData) { NSLog(@"start StartQueueIfNeeded"); OSStatus err = noErr; if (!myData-started) { err = AudioQueueStart(myData-queue, NULL); if (err) { PRINTERROR("AudioQueueStart"); myData-failed = true; return err; } myData-started = true; printf("started\n"); } return err; } OSStatus MyEnqueueBuffer(MyData* myData) { NSLog(@"start MyEnqueueBuffer"); OSStatus err = noErr; myData-inuse[myData-fillBufferIndex] = true; AudioQueueBufferRef fillBuf = myData-audioQueueBuffer[myData-fillBufferIndex]; fillBuf-mAudioDataByteSize = myData-bytesFilled; err = AudioQueueEnqueueBuffer(myData-queue, fillBuf, myData-packetsFilled, myData-packetDescs); if (err) { PRINTERROR("AudioQueueEnqueueBuffer"); myData-failed = true; return err; } StartQueueIfNeeded(myData); return err; } void WaitForFreeBuffer(MyData* myData) { NSLog(@"start WaitForFreeBuffer"); if (++myData-fillBufferIndex = kNumAQBufs) myData-fillBufferIndex = 0; myData-bytesFilled = 0; myData-packetsFilled = 0; printf("-lock\n"); pthread_mutex_lock(&myData-mutex); while (myData-inuse[myData-fillBufferIndex]) { printf("... 
WAITING ...\n"); pthread_cond_wait(&myData-cond, &myData-mutex); } pthread_mutex_unlock(&myData-mutex); printf("<-unlock\n"); } int MyFindQueueBuffer(MyData* myData, AudioQueueBufferRef inBuffer) { NSLog(@"start MyFindQueueBuffer"); for (unsigned int i = 0; i < kNumAQBufs; ++i) { if (inBuffer == myData-audioQueueBuffer[i]) return i; } return -1; } void MyAudioQueueIsRunningCallback( void* inClientData, AudioQueueRef inAQ, AudioQueuePropertyID inID) { NSLog(@"start MyAudioQueueIsRunningCallback"); MyData* myData = (MyData*)inClientData; UInt32 running; UInt32 size; OSStatus err = AudioQueueGetProperty(inAQ, kAudioQueueProperty_IsRunning, &running, &size); if (err) { PRINTERROR("get kAudioQueueProperty_IsRunning"); return; } if (!running) { pthread_mutex_lock(&myData-mutex); pthread_cond_signal(&myData-done); pthread_mutex_unlock(&myData-mutex); } } void MyPropertyListenerProc( void * inClientData, AudioFileStreamID inAudioFileStream, AudioFileStreamPropertyID inPropertyID, UInt32 * ioFlags) { NSLog(@"start MyPropertyListenerProc"); MyData* myData = (MyData*)inClientData; OSStatus err = noErr; printf("found property '%c%c%c%c'\n", (inPropertyID24)&255, (inPropertyID16)&255, (inPropertyID8)&255, inPropertyID&255); switch (inPropertyID) { case kAudioFileStreamProperty_ReadyToProducePackets : { AudioStreamBasicDescription asbd; UInt32 asbdSize = sizeof(asbd); err = AudioFileStreamGetProperty(inAudioFileStream, kAudioFileStreamProperty_DataFormat, &asbdSize, &asbd); if (err) { PRINTERROR("get kAudioFileStreamProperty_DataFormat"); myData-failed = true; break; } err = AudioQueueNewOutput(&asbd, MyAudioQueueOutputCallback, myData, NULL, NULL, 0, &myData-queue); if (err) { PRINTERROR("AudioQueueNewOutput"); myData-failed = true; break; } for (unsigned int i = 0; i < kNumAQBufs; ++i) { err = AudioQueueAllocateBuffer(myData-queue, kAQBufSize, &myData-audioQueueBuffer[i]); if (err) { PRINTERROR("AudioQueueAllocateBuffer"); myData-failed = true; break; } } UInt32 cookieSize; Boolean writable; err = AudioFileStreamGetPropertyInfo(inAudioFileStream, kAudioFileStreamProperty_MagicCookieData, &cookieSize, &writable); if (err) { PRINTERROR("info kAudioFileStreamProperty_MagicCookieData"); break; } printf("cookieSize %d\n", cookieSize); void* cookieData = calloc(1, cookieSize); err = AudioFileStreamGetProperty(inAudioFileStream, kAudioFileStreamProperty_MagicCookieData, &cookieSize, cookieData); if (err) { PRINTERROR("get kAudioFileStreamProperty_MagicCookieData"); free(cookieData); break; } err = AudioQueueSetProperty(myData-queue, kAudioQueueProperty_MagicCookie, cookieData, cookieSize); free(cookieData); if (err) { PRINTERROR("set kAudioQueueProperty_MagicCookie"); break; } err = AudioQueueAddPropertyListener(myData-queue, kAudioQueueProperty_IsRunning, MyAudioQueueIsRunningCallback, myData); if (err) { PRINTERROR("AudioQueueAddPropertyListener"); myData-failed = true; break; } break; } } } static void ReadStreamClientCallBack(CFReadStreamRef stream, CFStreamEventType type, void *clientCallBackInfo) { NSLog(@"start ReadStreamClientCallBack"); if(type == kCFStreamEventHasBytesAvailable) { UInt8 buffer[2048]; CFIndex bytesRead = CFReadStreamRead(stream, buffer, sizeof(buffer)); if (bytesRead < 0) { } else if (bytesRead) { OSStatus err = AudioFileStreamParseBytes(globalMyData-audioFileStream, bytesRead, buffer, 0); if (err) { PRINTERROR("AudioFileStreamParseBytes"); } } } } void MyPacketsProc(void * inClientData, UInt32 inNumberBytes, UInt32 inNumberPackets, const void * inInputData, 
AudioStreamPacketDescription inPacketDescriptions) { NSLog(@"start MyPacketsProc"); MyData myData = (MyData*)inClientData; printf("got data. bytes: %d packets: %d\n", inNumberBytes, inNumberPackets); for (int i = 0; i < inNumberPackets; ++i) { SInt64 packetOffset = inPacketDescriptions[i].mStartOffset; SInt64 packetSize = inPacketDescriptions[i].mDataByteSize; size_t bufSpaceRemaining = kAQBufSize - myData-bytesFilled; if (bufSpaceRemaining < packetSize) { MyEnqueueBuffer(myData); WaitForFreeBuffer(myData); } AudioQueueBufferRef fillBuf = myData-audioQueueBuffer[myData-fillBufferIndex]; memcpy((char*)fillBuf-mAudioData + myData-bytesFilled, (const char*)inInputData + packetOffset, packetSize); myData-packetDescs[myData-packetsFilled] = inPacketDescriptions[i]; myData-packetDescs[myData-packetsFilled].mStartOffset = myData-bytesFilled; myData-bytesFilled += packetSize; myData-packetsFilled += 1; size_t packetsDescsRemaining = kAQMaxPacketDescs - myData-packetsFilled; if (packetsDescsRemaining == 0) { MyEnqueueBuffer(myData); WaitForFreeBuffer(myData); } } } (IBAction)buttonPlayPressedid)sender { label.text = @"Buffering"; [self connectionStart]; } (IBAction)buttonSavePressedid)sender { NSLog(@"save"); AudioFileClose(myData.audioFile); AudioQueueDispose(myData.queue, TRUE); } bool getFilename(char* buffer,int maxBufferLength) { NSArray paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); NSString docDir = [paths objectAtIndex:0]; NSString* file = [docDir stringByAppendingString:@"/rec.caf"]; return [file getCString:buffer maxLength:maxBufferLength encoding:NSUTF8StringEncoding]; } -(void)connectionStart { @try { MyData* myData = (MyData*)calloc(1, sizeof(MyData)); globalMyData = myData; pthread_mutex_init(&myData-mutex, NULL); pthread_cond_init(&myData-cond, NULL); pthread_cond_init(&myData-done, NULL); NSLog(@"Start"); myData-dataFormat.mSampleRate = 16000.0f; myData-dataFormat.mFormatID = kAudioFormatLinearPCM; myData-dataFormat.mFramesPerPacket = 1; myData-dataFormat.mChannelsPerFrame = 1; myData-dataFormat.mBytesPerFrame = 2; myData-dataFormat.mBytesPerPacket = 2; myData-dataFormat.mBitsPerChannel = 16; myData-dataFormat.mReserved = 0; myData-dataFormat.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked; int i, bufferByteSize; UInt32 size; AudioQueueNewInput( &myData-dataFormat, MyAudioQueueOutputCallback, &myData, NULL /* run loop /, kCFRunLoopCommonModes / run loop mode /, 0 / flags */, &myData-queue); size = sizeof(&myData-dataFormat); AudioQueueGetProperty(&myData-queue, kAudioQueueProperty_StreamDescription, &myData-dataFormat, &size); CFURLRef fileURL; char path[256]; memset(path,0,sizeof(path)); getFilename(path,256); fileURL = CFURLCreateFromFileSystemRepresentation(NULL, (UInt8*)path, strlen(path), FALSE); AudioFileCreateWithURL(fileURL, kAudioFileCAFType, &myData-dataFormat, kAudioFileFlags_EraseFile, &myData-audioFile); OSStatus err = AudioFileStreamOpen(myData, MyPropertyListenerProc, MyPacketsProc, kAudioFileMP3Type, &myData-audioFileStream); if (err) { PRINTERROR("AudioFileStreamOpen"); return 1; } CFStreamClientContext ctxt = {0, self, NULL, NULL, NULL}; CFStringRef bodyData = CFSTR(""); // Usually used for POST data CFStringRef headerFieldName = CFSTR("X-My-Favorite-Field"); CFStringRef headerFieldValue = CFSTR("Dreams"); CFStringRef url = CFSTR(RADIO_LOCATION); CFURLRef myURL = CFURLCreateWithString(kCFAllocatorDefault, url, NULL); CFStringRef requestMethod = CFSTR("GET"); CFHTTPMessageRef myRequest = 
CFHTTPMessageCreateRequest(kCFAllocatorDefault, requestMethod, myURL, kCFHTTPVersion1_1); CFHTTPMessageSetBody(myRequest, bodyData); CFHTTPMessageSetHeaderFieldValue(myRequest, headerFieldName, headerFieldValue); CFReadStreamRef stream = CFReadStreamCreateForHTTPRequest(kCFAllocatorDefault, myRequest); if (!stream) { NSLog(@"Creating the stream failed"); return; } if (!CFReadStreamSetClient(stream, kNetworkEvents, ReadStreamClientCallBack, &ctxt)) { CFRelease(stream); NSLog(@"Setting the stream's client failed."); return; } CFReadStreamScheduleWithRunLoop(stream, CFRunLoopGetCurrent(), kCFRunLoopCommonModes); if (!CFReadStreamOpen(stream)) { CFReadStreamSetClient(stream, 0, NULL, NULL); CFReadStreamUnscheduleFromRunLoop(stream, CFRunLoopGetCurrent(), kCFRunLoopCommonModes); CFRelease(stream); NSLog(@"Opening the stream failed."); return; } } @catch (NSException *exception) { NSLog(@"main: Caught %@: %@", [exception name], [exception reason]); } } (void)viewDidLoad { [[UIApplication sharedApplication] setIdleTimerDisabled:YES]; [super viewDidLoad]; } (void)didReceiveMemoryWarning { [super didReceiveMemoryWarning]; } (void)viewDidUnload { } (void)dealloc { [super dealloc]; } @end

    Read the article

  • Problem with waveOutWrite and waveOutGetPosition deadlock

    - by MusiGenesis
    I'm working on an app that plays audio continuously using the waveOut... API from winmm.dll. The app uses "leapfrog" buffers, which are basically a bunch of arrays of samples that you dump into the audio queue. Windows plays them seamlessly in sequence, and as each buffer completes Windows calls a callback function. Inside this function, I load the next set of samples into the buffer, process them however, and then dump the buffer back into the audio queue. In this way, the audio plays indefinitely. For animation purposes, I'm trying to incorporate waveOutGetPosition into the application (since the "buffer done" callbacks are irregular enough to cause jerky animation). waveOutGetPosition returns the current position of playback, so it's hyper-precise. The problem is that in my application, making calls to waveOutGetPosition eventually causes the application to lock up - the sound stops and the call never returns. I've boiled things down to a simple app that demonstrates the problem. You can run the app here: http://www.musigenesis.com/SO/waveOut%20demo.exe If you just hear a tiny bit of piano over and over, it's working. It's just meant to demonstrate the problem. The source code for this project is here: http://www.musigenesis.com/SO/WaveOutDemo.zip The first button runs the app in leapfrog mode without making the calls to waveOutGetPosition. If you click this, the app will play forever without breaking (the X button will close it and shut it off). The second button starts the leapfrogger and also starts a forms timer that calls the waveOutGetPosition and displays the current position. Click this and the app will run for a short while and then lock up. On my laptop, it usually locks up in 15-30 seconds; at most it's taken a minute. I have no idea how to fix this, so any help or suggestions would be most welcome. I've found very few posts on this issue, but it seems that there is a potential deadlock, either from multiple calls to waveOutGetPosition or from calls to that and waveOutWrite that occur at the same time. It's possible that I'm calling this too frequently for the system to handle.

    Read the article

  • i2s0: transmitter underrun (0)

    - by tbarbe
    We're doing some audio stuff, and I keep seeing this in the Organizer console:

        Sun May 2 20:16:48 unknown kernel[0] : i2s0: transmitter underrun (0)

    Are these transmitter underruns bad? I think it's just when we're shutting down audio input... but could a few of these cause issues later on?

    Read the article

< Previous Page | 56 57 58 59 60 61 62 63 64 65 66 67  | Next Page >