Search Results

Search found 12365 results on 495 pages for 'core audio'.


  • SoundPool.load() and FileDescriptor from file

    - by Hans
    I tried using the load function of SoundPool that takes a FileDescriptor, because I wanted to be able to set the offset and length. The file is not stored in the resources but on the storage card. Even though neither the load nor the play function of SoundPool throws any exception or prints anything to the console, the sound is not played. Using the same code but passing the file path string to load instead works perfectly. This is how I tried the loading (start equals 0 and length is the length of the file in milliseconds):

        FileInputStream fileIS = new FileInputStream(new File(mFile));
        mStreamID = mSoundPool.load(fileIS.getFD(), start, length, 0);
        mPlayingStreamID = mSoundPool.play(mStreamID, 1f, 1f, 1, 0, 1f);

    If I use this instead, it works:

        mStreamID = mSoundPool.load(mFile, 0);
        mPlayingStreamID = mSoundPool.play(mStreamID, 1f, 1f, 1, 0, 1f);

    Any ideas, anyone? Thanks


  • CorePlot - Set y-Axis range to include all data points

    - by zdestiny
    I have implemented a Core Plot graph in my iOS application; however, I am unable to dynamically size the y-axis based on the data points in the set. Some datasets range from 10-50, while others range from 400-500. In both cases, I would like to have the y-origin (0) visible. I have tried the scaleToFitPlots method:

        [graph.defaultPlotSpace scaleToFitPlots:[graph allPlots]];

    but this has no effect on the graphs; they still show the default y-axis range of 0 to 1. The only way I can change the y-axis is to do it manually, like this:

        ((CPTXYPlotSpace *)graph.defaultPlotSpace).yRange =
            [CPTPlotRange plotRangeWithLocation:CPTDecimalFromFloat(0.0f)
                                         length:CPTDecimalFromFloat(500.0f)];

    However, this doesn't work well for graphs with smaller ranges (10-50), as you can imagine. Is there any method to retrieve the highest y-value from my dataset so that I can manually set the maximum y-axis to that value? Or is there a better alternative?
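    If the plotted values are already held as NSNumbers in an array, one way to get the maximum is KVC's @max collection operator. A minimal sketch, assuming a hypothetical dataPoints array and an arbitrary 10% headroom above the largest value:

        NSNumber *maxY = [dataPoints valueForKeyPath:@"@max.self"]; // dataPoints: your values as NSNumbers (hypothetical)
        CPTXYPlotSpace *plotSpace = (CPTXYPlotSpace *)graph.defaultPlotSpace;
        plotSpace.yRange = [CPTPlotRange plotRangeWithLocation:CPTDecimalFromFloat(0.0f)
                                                        length:CPTDecimalFromFloat([maxY floatValue] * 1.1f)];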


  • After I apply custom logic, the next UI action crashes my app

    - by DanF
    I've got a team (eveningRoster) that I'm making a button to add employees to. The team is really a relationship to that night's event, but it's represented with an array controller. I wanted to make sure an employee does not already belong to the team before adding, so I added a method to MyDocument to check first. It seems to work and the NSLog output completes, but after I've added a member, the next time I click anything, the program crashes. Any guesses why? Here's the code:

        -(IBAction)playsTonight:(id)sender {
            NSArray *selection = [fullRoster selectedObjects];
            NSArray *existing = [eveningRoster arrangedObjects];
            // result will be set within each loop iteration.
            BOOL result;
            // noDuplicates will stay YES until a duplicate is found.
            BOOL noDuplicates = YES;
            // For the loop:
            int count;
            for (count = 0; count < [selection count]; count++) {
                result = [existing containsObject:[selection objectAtIndex:count]];
                if (result == YES) {
                    NSLog(@"Duplicate found!");
                    noDuplicates = NO;
                }
            }
            if (noDuplicates == YES) {
                [eveningRoster addObjects:[fullRoster selectedObjects]];
                NSLog(@"selected objects added.");
                [eveningTable reloadData];
                NSLog(@"Table reloaded.");
            }
            [selection release];
            [existing release];
            return;
        }
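    One detail worth a close look (an observation based on Cocoa memory-management rules, not from the original thread): selectedObjects and arrangedObjects return arrays the caller does not own, so the two release calls at the end over-release them, and a crash on the next UI action is the classic symptom. A minimal sketch of the same check without the releases, assuming manual reference counting:

        - (IBAction)playsTonight:(id)sender {
            NSArray *selection = [fullRoster selectedObjects];
            NSArray *existing  = [eveningRoster arrangedObjects];
            BOOL noDuplicates = YES;
            for (id employee in selection) {
                if ([existing containsObject:employee]) {
                    NSLog(@"Duplicate found!");
                    noDuplicates = NO;
                }
            }
            if (noDuplicates) {
                [eveningRoster addObjects:selection];
                [eveningTable reloadData];
            }
            // No release: these arrays were never retained, copied, or alloc'd here.
        }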


  • Connecting CoreData to my App on an iPhone Device

    - by Lauren Quantrell
    I apologize ahead of time for what I'm sure is a complete newbie lapse. Running my iPhone app in the iPhone Simulator - no problem. But I loaded the app onto an iPhone device for the first time, and it appears as if the SQLite database I'm using (accessed via an NSManagedObjectContext) isn't connected or didn't upload. The app installs, but with no data. How do I get it all to upload and work on the device? I appreciate any help. lq
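    If the database was pre-populated on the Mac, it has to ship inside the app bundle and be copied into the Documents directory on first launch before the persistent store is opened; the Simulator's file system can mask a missing copy step. A minimal sketch of that copy, assuming a hypothetical store file named MyApp.sqlite:

        NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                 NSUserDomainMask, YES) lastObject];
        NSString *storePath = [docsDir stringByAppendingPathComponent:@"MyApp.sqlite"];
        NSFileManager *fm = [NSFileManager defaultManager];
        if (![fm fileExistsAtPath:storePath]) {
            // First launch: copy the seeded database out of the bundle.
            NSString *bundled = [[NSBundle mainBundle] pathForResource:@"MyApp" ofType:@"sqlite"];
            if (bundled) {
                [fm copyItemAtPath:bundled toPath:storePath error:NULL];
            }
        }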


  • AudioRecord problems with non-HTC devices

    - by Marc
    I'm having trouble using AudioRecord. An example using some code derived from the splmeter project:

        private static final int FREQUENCY = 8000;
        private static final int CHANNEL = AudioFormat.CHANNEL_CONFIGURATION_MONO;
        private static final int ENCODING = AudioFormat.ENCODING_PCM_16BIT;
        private int BUFFSIZE = 50;
        private AudioRecord recordInstance = null;
        ...
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
        recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC, FREQUENCY, CHANNEL, ENCODING, 8000);
        recordInstance.startRecording();
        short[] tempBuffer = new short[BUFFSIZE];
        int retval = 0;
        while (this.isRunning) {
            for (int i = 0; i < BUFFSIZE - 1; i++) {
                tempBuffer[i] = 0;
            }
            retval = recordInstance.read(tempBuffer, 0, BUFFSIZE);
            ... // process the data
        }

    This works on the HTC Dream and the HTC Magic perfectly, without any log warnings or errors, but causes problems on the emulators and on a Nexus One device. On the Nexus One, it simply never returns useful data. I cannot provide any other useful information, as I'm having a remote friend do the testing. On the emulators (Android 1.5, 2.1 and 2.2), I get weird errors from the AudioFlinger and buffer overflows in the AudioRecordThread. I also get a major slowdown in UI responsiveness (even though the recording takes place in a separate thread from the UI). Is there something apparent that I'm doing incorrectly? Do I have to do anything special for the Nexus One hardware?


  • Migrating from a single entity to an abstract parent entity with child entities, NSEntityMigrationPolicy not called

    - by Jimmy Selgen Nielsen
    Hi. I'm trying to upgrade my current application to use an abstract parent entity with specialized sub-entities. I've created a custom NSEntityMigrationPolicy, and in the mapping model I've set the Custom Policy to the name of my class. I'm initializing my persistent store like this, which should be fairly standard:

        NSError *error = nil;
        persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc]
            initWithManagedObjectModel:[self managedObjectModel]];
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption, nil];
        if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                      configuration:nil
                                                                URL:storeUrl
                                                            options:options
                                                              error:&error]) {
            NSLog(@"Error adding persistent store : %@", [error description]);
            NSAssert(error == nil, [error localizedDescription]);
        }

    When I run the app I get the following error:

        Terminating app due to uncaught exception 'NSInternalInconsistencyException',
        reason: 'The operation couldn't be completed. (Cocoa error 134140.)'

    [error userInfo] contains "reason=Can't find mapping model for migration". I've verified that version 1 of the data model will open, and if I set NSInferMappingModelAutomaticallyOption I get a migration, although my entities are not migrated correctly (as expected). I've verified that the mapping model (.cdm) is in the application bundle, but somehow it refuses to find it. I've also set breakpoints and NSLog() statements in the custom migration policy, and none of it runs, with or without NSInferMappingModelAutomaticallyOption. Any hints as to why it seems unable to find the mapping model?
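    One way to narrow this down is to ask Core Data for the mapping model directly; if the call below returns nil, the problem is the bundle/model pairing rather than the migration machinery. A minimal sketch, assuming sourceModel and destinationModel are loaded elsewhere (both names are placeholders):

        NSMappingModel *mapping = [NSMappingModel
            mappingModelFromBundles:[NSArray arrayWithObject:[NSBundle mainBundle]]
                     forSourceModel:sourceModel
                   destinationModel:destinationModel];
        if (mapping == nil) {
            // The mapping model's source/destination entity version hashes
            // don't match these two models, or the .cdm isn't in this bundle.
            NSLog(@"No mapping model found for this source/destination pair");
        }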


  • CoreData: Same predicate (IN) returns different fetched results after a Save operation

    - by Jason Lee
    I have the code below:

        NSArray *existedTasks = [[TaskBizDB sharedInstance] fetchTasksWatchedByMeOfProject:projectId];
        [context save:&error];
        existedTasks = [[TaskBizDB sharedInstance] fetchTasksWatchedByMeOfProject:projectId];
        NSArray *allTasks = [[TaskBizDB sharedInstance] fetchTasksOfProject:projectId];

    The first line returns two objects; the second line saves the context; the third line returns just one object, which is one of the 'two objects' above; and the last line returns six objects, including the 'two objects' returned by the first line. The fetch interface works like this:

        WXModel *model = [WXModel modelWithEntity:NSStringFromClass([WQPKTeamTask class])];
        NSPredicate *predicate = [NSPredicate predicateWithFormat:
            @"(%@ IN personWatchers) AND (projectId == %d)", currentLoginUser, projectId];
        [model setPredicate:predicate];
        NSArray *fetchedTasks = [model fetch];
        if (fetchedTasks.count == 0)
            return nil;
        return fetchedTasks;

    What confuses me is that, with the same fetch request, different results are returned just after a save. Here is more detail. The 'two objects' returned by the first line are:

        <WQPKTeamTask: 0x1b92fcc0> (entity: WQPKTeamTask; id: 0x1b9300f0 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WQPKTeamTask/p9>; data: {
            projectId = 372004;
            taskId = 338001;
            personWatchers = ("0xf0bf440 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WWPerson/p1>");
        })
        <WQPKTeamTask: 0xf3f6130> (entity: WQPKTeamTask; id: 0xf3cb8d0 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WQPKTeamTask/p11>; data: {
            projectId = 372004;
            taskId = 340006;
            personWatchers = ("0xf0bf440 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WWPerson/p1>");
        })

    And the only object returned by the third line is:

        <WQPKTeamTask: 0x1b92fcc0> (entity: WQPKTeamTask; id: 0x1b9300f0 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WQPKTeamTask/p9>; data: {
            projectId = 372004;
            taskId = 338001;
            personWatchers = ("0xf0bf440 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WWPerson/p1>");
        })

    Printing description of allTasks:

        <_PFArray 0xf30b9a0>(
            <WQPKTeamTask: 0xf3ab9d0> (entity: WQPKTeamTask; id: 0xf3cda40 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WQPKTeamTask/p6>; data: <fault>),
            <WQPKTeamTask: 0xf315720> (entity: WQPKTeamTask; id: 0xf3c23a0 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WQPKTeamTask/p7>; data: <fault>),
            <WQPKTeamTask: 0xf3a1ed0> (entity: WQPKTeamTask; id: 0xf3cda30 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WQPKTeamTask/p8>; data: <fault>),
            <WQPKTeamTask: 0x1b92fcc0> (entity: WQPKTeamTask; id: 0x1b9300f0 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WQPKTeamTask/p9>; data: {
                projectId = 372004;
                taskId = 338001;
                personWatchers = ("0xf0bf440 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WWPerson/p1>");
            }),
            <WQPKTeamTask: 0xf325e50> (entity: WQPKTeamTask; id: 0xf343820 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WQPKTeamTask/p10>; data: <fault>),
            <WQPKTeamTask: 0xf3f6130> (entity: WQPKTeamTask; id: 0xf3cb8d0 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WQPKTeamTask/p11>; data: {
                projectId = 372004;
                taskId = 340006;
                personWatchers = ("0xf0bf440 <x-coredata://CFFD3F8B-E613-4DE8-85AA-4D6DD08E88C5/WWPerson/p1>");
            })
        )

    UPDATE 1: If I call the same interface fetchTasksWatchedByMeOfProject: in:

        #pragma mark - NSFetchedResultsController Delegate
        - (void)controllerDidChangeContent:(NSFetchedResultsController *)controller {

    I get the 'two objects' as well.

    UPDATE 2: I've tried:

        NSPredicate *predicate = [NSPredicate predicateWithFormat:
            @"(ANY personWatchers == %@) AND (projectId == %d)", currentLoginUser, projectId];
        NSPredicate *predicate = [NSPredicate predicateWithFormat:
            @"(ANY personWatchers.personId == %@) AND (projectId == %d)", currentLoginUserId, projectId];

    Still the same result.

    UPDATE 3: I've checked the save:&error call; error is nil.
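    One diagnostic worth trying (an assumption that the in-memory row cache has gone stale, not a confirmed fix for this case): refresh the previously fetched objects right after the save, so the next fetch re-faults them from the store instead of reusing in-memory state:

        for (NSManagedObject *task in existedTasks) {
            [context refreshObject:task mergeChanges:NO];
        }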


  • Cancel a UIView animation?

    - by Phil Nash
    Is it possible to cancel a UIView animation while it is in progress? Or would I have to drop to the CA level? i.e., I've done something like this (maybe setting an end-animation action too):

        [UIView beginAnimations:nil context:NULL];
        [UIView setAnimationDuration:duration];
        [UIView setAnimationCurve:UIViewAnimationCurveLinear];
        // other animation properties
        // set view properties
        [UIView commitAnimations];

    But before the animation completes and I get the animation-ended event, I want to cancel it (cut it short). Is this possible? Googling around finds a few people asking the same question with no answers - and one or two people speculating that it can't be done.
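    For what it's worth, dropping to the layer level is one known way to cut an in-flight animation short; a minimal sketch (requires linking QuartzCore):

        #import <QuartzCore/QuartzCore.h>

        - (void)cancelAnimationsOnView:(UIView *)view {
            // Removes all in-flight animations from the view's backing layer;
            // the view jumps straight to the animation's end values.
            [view.layer removeAllAnimations];
        }

    Note that the view lands on the end state rather than freezing where it was; holding it at the current position takes extra work with the layer's presentationLayer.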


  • OpenAL device, buffer and context relationship

    - by Markus
    I'm trying to create an object-oriented model to wrap OpenAL and have a little problem understanding the devices, buffers and contexts. From what I can see in the Programmer's Guide, there are multiple devices, each of which can have multiple contexts as well as multiple buffers. Each context has a listener, and the alListener*() functions all operate on the listener of the active context. (Meaning that I have to make another context active first if I wanted to change its listener, if I got that right.) So far, so good. What irritates me though is that I need to pass a device to the alcCreateContext() function, but none to alGenBuffers(). How does this work then? When I open multiple devices, on which device are the buffers created? Are the buffers shared between all devices? What happens to the buffers if I close all open devices? (Or is there something I missed?)
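    A sketch of my reading of the ALC model (an assumption drawn from how the API is shaped, worth verifying against your implementation): alGenBuffers() creates buffers against the device of whichever context is current at that moment, so buffers can be shared by every context on that device, but not across devices, and closing the device invalidates them:

        #include <OpenAL/al.h>
        #include <OpenAL/alc.h>

        ALCdevice  *device = alcOpenDevice(NULL);             // default device
        ALCcontext *ctxA = alcCreateContext(device, NULL);
        ALCcontext *ctxB = alcCreateContext(device, NULL);

        alcMakeContextCurrent(ctxA);
        ALuint buffer;
        alGenBuffers(1, &buffer);     // buffer belongs to 'device', not to ctxA

        alcMakeContextCurrent(ctxB);
        ALuint source;
        alGenSources(1, &source);     // sources, by contrast, are per-context
        alSourcei(source, AL_BUFFER, buffer);  // OK: same underlying device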


  • Properly maintain sorted state of Array/Set

    - by Jeff
    I'm trying to get data out of my MOC, create some new objects based on those objects, and put it all back together while keeping my sort order. The securities come out of the MOC in the proper order, and everything seems fine until the assignment to the game at the bottom from setWithArray. The documentation says that setWithArray: removes duplicate objects, if there are any. I wonder if that's messing up my data, but I don't see a good alternative. The data is ultimately being pulled out into a UITableView. When I add items to the game manually, they stay sorted, so I don't think the breaking of the sort is beyond the scope of what I've included here.

        NSError *error;
        NSArray *allTheSecurities = [managedObjectContext executeFetchRequest:request error:&error];
        if (allTheSecurities == nil) {
            // Handle the error.
        }
        [request release];

        NSLog(@"Enumerate...");
        NSEnumerator *enumerator = [allTheSecurities objectEnumerator];
        id anObject;
        NSMutableArray *portfolioStocks = [[NSMutableArray alloc] init];
        while (anObject = [enumerator nextObject]) {
            NSLog(@"Iteration... %@", [anObject name]);
            NSLog(@"Build a stock...");
            PortfolioStocks *this_stock = (PortfolioStocks *)[NSEntityDescription
                insertNewObjectForEntityForName:@"PortfolioStocks"
                         inManagedObjectContext:context];
            NSLog(@"Set a value...");
            [this_stock setSecurity:(Security *)anObject];
            [this_stock setQuantity:[NSNumber numberWithInt:0]];
            NSLog(@"Add to portfolioStocks...");
            [portfolioStocks addObject:this_stock];
        }
        // Sorted properly up to here!
        NSLog(@"Add to portfolio...");
        [game setPortfolio:[NSSet setWithArray:portfolioStocks]]; // <-- This is where it's not sorted anymore.
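    Worth noting (a general Cocoa fact rather than anything from this post): NSSet has no order at all, so a to-many relationship backed by NSSet cannot preserve insertion order; the usual pattern is to re-sort when reading the set back out, e.g. for the table view. A minimal sketch, assuming a hypothetical security.name key path as the sort key:

        NSSortDescriptor *byName = [[[NSSortDescriptor alloc] initWithKey:@"security.name"
                                                                ascending:YES] autorelease];
        NSArray *sortedStocks = [[game.portfolio allObjects]
            sortedArrayUsingDescriptors:[NSArray arrayWithObject:byName]];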


  • [Cocoa] Temporarily disable NSArrayController filterPredicate, or consult the ManagedObjectContext?

    - by ndg
    I have an NSArrayController which is bound to a class in my managed object context. At runtime, the NSArrayController can have a number of different filter predicates applied. At certain intervals, I want to iterate through my NSArrayController's contents regardless of the filter predicate applied to it. To do this, I set the filterPredicate to nil and then reinstate it after having iterated through the array:

        NSPredicate *predicate = [myArrayController filterPredicate];
        [myArrayController setFilterPredicate:nil];
        for (MyManagedObject *object in [myArrayController arrangedObjects]) {
            // ...
        }
        [myArrayController setFilterPredicate:predicate];

    This seems to work, but I'm wondering if it's best practice? Should I instead be polling my managed object context manually?
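    The direct-fetch alternative the question alludes to would look roughly like this (a sketch, assuming the entity is named MyManagedObject and moc is the same managed object context the controller is bound to); it leaves the controller and its filterPredicate untouched:

        NSFetchRequest *request = [[[NSFetchRequest alloc] init] autorelease];
        [request setEntity:[NSEntityDescription entityForName:@"MyManagedObject"
                                       inManagedObjectContext:moc]];
        NSError *error = nil;
        NSArray *allObjects = [moc executeFetchRequest:request error:&error];
        for (MyManagedObject *object in allObjects) {
            // ... same per-object work, unfiltered
        }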


  • Unit Testing Model Classes that derive from NSManagedObject

    - by Matt Baker
    So...I'm trying to get unit tests set up in my iPhone app, but I'm having some issues. I'm trying to test my model classes, but they inherit directly from NSManagedObject. I'm sure this is the problem, but I don't know how to get around it. Everything builds and runs as expected, but I get this error when calling any method on the class I'm testing:

        Unknown.m:0:0 unrecognized selector sent to instance 0xc2b120

    If I follow this structure (http://chanson.livejournal.com/115621.html) to create my object in my tests, I end up with another error entirely, but it still doesn't help me. Basically my question is this: how can I test a class that inherits from NSManagedObject?
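    One common approach (a sketch of the usual in-memory-store pattern, not anything specific to this project) is to stand up a full Core Data stack in the test's setUp using NSInMemoryStoreType, so the entity-to-class mapping exists before any NSManagedObject subclass is exercised; MyModelClass here is a placeholder for your entity:

        NSBundle *bundle = [NSBundle bundleForClass:[self class]];
        NSManagedObjectModel *model =
            [NSManagedObjectModel mergedModelFromBundles:[NSArray arrayWithObject:bundle]];
        NSPersistentStoreCoordinator *psc =
            [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:model];
        [psc addPersistentStoreWithType:NSInMemoryStoreType
                          configuration:nil URL:nil options:nil error:NULL];
        NSManagedObjectContext *moc = [[NSManagedObjectContext alloc] init];
        [moc setPersistentStoreCoordinator:psc];

        // Now the subclass can be instantiated the supported way:
        MyModelClass *object = [NSEntityDescription
            insertNewObjectForEntityForName:@"MyModelClass"
                     inManagedObjectContext:moc];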


  • iOS 5 - Core Data SQLite DB losing data after killing app

    - by Brian Boyle
    I'm using Core Data with a SQLite DB to persist data in my app. However, each time I kill my app I lose any data that was saved in the DB. I'm pretty sure it's because the .sqlite file for my DB is just being replaced by a fresh one each time my app starts, but I can't find any code that would just use the existing one that's there. It would be great if anyone could point me towards some code that could handle this for me. Cheers, B

        - (NSPersistentStoreCoordinator *)persistentStoreCoordinator
        {
            if (__persistentStoreCoordinator != nil) {
                return __persistentStoreCoordinator;
            }
            NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption,
                [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption, nil];
            NSURL *storeURL = [[self applicationDocumentsDirectory]
                URLByAppendingPathComponent:@"FlickrCoreData.sqlite"];
            NSError *error = nil;
            __persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc]
                initWithManagedObjectModel:[self managedObjectModel]];
            if (![__persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                            configuration:nil
                                                                      URL:storeURL
                                                                  options:options
                                                                    error:&error]) {
                NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
                abort();
            }
            return __persistentStoreCoordinator;
        }
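    This coordinator code doesn't delete or replace anything by itself, so a more likely gap (an assumption, since the save path isn't shown) is that the managed object context is never saved before the app is killed; saving when the app backgrounds covers the kill-from-task-switcher case:

        - (void)applicationDidEnterBackground:(UIApplication *)application
        {
            NSError *error = nil;
            NSManagedObjectContext *moc = self.managedObjectContext;
            if (moc != nil && [moc hasChanges] && ![moc save:&error]) {
                NSLog(@"Save failed: %@, %@", error, [error userInfo]);
            }
        }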


  • Is it a good choice to move a sublayer around a view using UIPanGestureRecognizer?

    - by Pranjal Bikash Das
    I have a CALayer, and as a sublayer to this CALayer I have added an imageLayer which contains an image with a resolution of 276x183. I added a UIPanGestureRecognizer to the main view and calculate the coordinates of the CALayer as follows:

        - (void)panned:(UIPanGestureRecognizer *)sender {
            subLayer.frame = CGRectMake([sender locationInView:self.view].x - 138,
                                        [sender locationInView:self.view].y - 92,
                                        276, 183);
        }

    In viewDidLoad I have:

        subLayer.backgroundColor = [UIColor whiteColor].CGColor;
        subLayer.frame = CGRectMake(22, 33, 276, 183);
        imageLayer.contents = (id)[UIImage imageNamed:@"A.jpg"].CGImage;
        imageLayer.frame = subLayer.bounds;
        imageLayer.masksToBounds = YES;
        imageLayer.cornerRadius = 15.0;
        subLayer.shadowColor = [UIColor blackColor].CGColor;
        subLayer.cornerRadius = 15.0;
        subLayer.shadowOpacity = 0.8;
        subLayer.shadowOffset = CGSizeMake(0, 3);
        [subLayer addSublayer:imageLayer];
        [self.view.layer addSublayer:subLayer];

    It gives the desired output, but is a bit slow in the simulator. I have not yet tested it on a device. So my question is: is it OK to move a CALayer containing an image?
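    Two things commonly help with exactly this setup (general Core Animation advice, offered as a sketch rather than a confirmed fix): wrap the move in a CATransaction with actions disabled so the implicit 0.25 s animation doesn't lag behind the pan, and give the shadow an explicit path so it isn't recomputed from the layer contents on every frame:

        // In the pan handler: since the default anchorPoint is the center,
        // setting position centers the layer on the finger, like the -138/-92 math.
        - (void)panned:(UIPanGestureRecognizer *)sender {
            [CATransaction begin];
            [CATransaction setDisableActions:YES];  // suppress the implicit animation
            subLayer.position = [sender locationInView:self.view];
            [CATransaction commit];
        }

        // Once, after configuring the shadow in viewDidLoad:
        subLayer.shadowPath = [UIBezierPath bezierPathWithRoundedRect:subLayer.bounds
                                                         cornerRadius:15.0].CGPath;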


  • Leak in managedObjectContext save:

    - by Kamchatka
    Hi, I have the following piece of code, and when I use Instruments/Object Allocations it tells me there is a leak there (which goes down to sqlite3MemMalloc). Is there something I should release?

        if (![managedObjectContext save:&error]) {
            NSLog(@"Error while saving.");
        }

    The save works well and doesn't trigger an error.


  • Using Finch for the first time: how do I play mp3, ogg or other formats (wav files too big)?

    - by Allisone
    My *.wav files work as expected. But wav files are too big, so I want to play *.mp3 or *.ogg, and that doesn't work. I use these lines of code, found in the Finch demo project:

        engine = [[Finch alloc] init];
        sitar = [[Sound alloc] initWithFile:RSRC(@"sitar.wav")];
        [sitar play];

    I only change sitar.wav to my .mp3 filename. Note 1: it needn't be mp3 or ogg; any file format not as huge as wav would be fine, but which? Note 2: I didn't know how to play sound at all, so I searched and found Finch here on Stack Overflow. It looks easy, so I would like to use it, but if you know some other easy way to play those sound files (ambient and effect sounds with a compressed codec) I would also switch to that other technique.
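    For the compressed ambient track, one widely used alternative (a sketch using the standard AVFoundation API, not Finch) is AVAudioPlayer, which handles mp3, AAC/m4a and other formats the platform can decode; the file name here is a placeholder:

        #import <AVFoundation/AVFoundation.h>

        NSURL *url = [[NSBundle mainBundle] URLForResource:@"ambient"
                                             withExtension:@"mp3"]; // hypothetical file
        NSError *error = nil;
        AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url
                                                                       error:&error];
        player.numberOfLoops = -1;   // loop forever, useful for ambient audio
        [player prepareToPlay];
        [player play];

    A common split is AVAudioPlayer for the long compressed background track and an OpenAL-based engine like Finch for short, latency-sensitive effects.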


  • CGGradient in a CGPath

    - by catlan
    Is it possible to draw a gradient inside a path on the iPhone? I'm looking for a replacement for the Mac OS X method -drawInBezierPath:(NSBezierPath *)path relativeCenterPosition:(NSPoint)relativeCenterPosition of NSGradient.
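    A sketch of the usual Core Graphics equivalent: clip the context to the path, then draw a radial gradient. The colors, center and radius below are placeholders, and 'path' stands for your CGPathRef:

        CGContextRef ctx = UIGraphicsGetCurrentContext();
        CGContextSaveGState(ctx);
        CGContextAddPath(ctx, path);
        CGContextClip(ctx);                       // gradient is confined to the path

        CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
        CGFloat components[8] = { 1, 1, 1, 1,     // white
                                  0, 0, 0, 1 };   // black
        CGGradientRef gradient =
            CGGradientCreateWithColorComponents(space, components, NULL, 2);

        CGRect box = CGContextGetClipBoundingBox(ctx);
        CGPoint center = CGPointMake(CGRectGetMidX(box), CGRectGetMidY(box));
        CGContextDrawRadialGradient(ctx, gradient, center, 0,
                                    center, CGRectGetWidth(box) / 2,
                                    kCGGradientDrawsAfterEndLocation);

        CGGradientRelease(gradient);
        CGColorSpaceRelease(space);
        CGContextRestoreGState(ctx);

    Moving 'center' off-middle approximates NSGradient's relativeCenterPosition parameter.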


  • Android - can't read TXT files from SD card on a real machine?

    - by JustMe
    Hello! When I run the code below in the Android emulator (1.5) it works well - the TextSwitcher shows the first 80 chars from each txt file in /sdcard/documents/ - but when I run it on my Samsung Galaxy i7500 (1.6) there is no content in the TextSwitcher, although LogCat does show the file names of the txt files. My code:

        public void getTxtFiles() {
            // Scan /sdcard/documents and put .txt files in the array TxtFiles[]
            String path = Environment.getExternalStorageDirectory().toString() + "/documents/";
            String files;
            File folder = new File(path);
            if (folder.exists() == false) {
                if (!folder.mkdirs()) {
                    Log.e("TAG", "Create dir in sdcard failed");
                    return;
                }
            } else {
                File listOfFiles[] = folder.listFiles();
                for (int i = 0; i < listOfFiles.length; i++) {
                    if (listOfFiles[i].isFile()) {
                        files = listOfFiles[i].getName();
                        if (files.endsWith(".txt") || files.endsWith(".TXT")) {
                            if ((files.length() - 1) > i) {
                                resizeArray(TxtFiles, files.length() + 10);
                            }
                            TxtFiles[i] = listOfFiles[i];
                            System.out.println(TxtFiles[i]);
                        }
                    }
                }
            }
        }

        private void updateCounter(int Pozicija) {
            if (Pozicija < 0) {
                Toast.makeText(getApplicationContext(), R.string.LastTxt, 5).show();
                mCounter++;
            } else if (TxtFiles[mCounter] != null) {
                TextToShow = getContents(TxtFiles[mCounter]);
                if (TextToShow.length() > 80) TextToShow = TextToShow.substring(0, 80);
                mSwitcher.setText(TextToShow);
                System.out.println(Pozicija);
            } else mCounter--;
        }

        static public String getContents(File aFile) {
            // ...checks on aFile are elided
            StringBuilder contents = new StringBuilder();
            try {
                // Use buffering, reading one line at a time.
                // FileReader always assumes the default encoding is OK!
                BufferedReader input = new BufferedReader(new FileReader(aFile));
                try {
                    String line = null; // not declared within while loop
                    /*
                     * readLine is a bit quirky:
                     * it returns the content of a line MINUS the newline.
                     * it returns null only for the END of the stream.
                     * it returns an empty String if two newlines appear in a row.
                     */
                    while ((line = input.readLine()) != null) {
                        contents.append(line);
                        contents.append(System.getProperty("line.separator"));
                    }
                } finally {
                    input.close();
                }
            } catch (IOException ex) {
                ex.printStackTrace();
            }
            return contents.toString();
        }

    And I am able to write the contents of those files through LogCat! Any ideas?


  • How to embed an mp3 into an .exe file and play it?

    - by afriza
    I'm used to embedding WAV files into the .exe and playing them with PlaySound(). However, this method causes the .exe to become pretty big. Is it possible to do the same with MP3 files, and how? I have taken a look at DirectShow, but it seems to be able to play from files only. I am developing for Windows Mobile 6 Series.


  • Java M4A atom tagging free space issue

    - by Brett
    Hey, I've been trying to read and write iTunes-style M4A atoms, and while I've successfully done the reading part, I've come to a bit of a halt with the free-space atoms. I figured I should be able to edit and shift the padding around to accommodate writing an atom with more data than it originally had. I've been stuck on this for about a day now, trying to figure out how to determine the closest free-space atom with enough size to accommodate the new data. So far I have:

        private freeAtom acquireFreeSpaceAtom(long position) {
            long atomStart = Long.MAX_VALUE;
            freeAtom atom = null;
            for (freeAtom a : freeSpace) {
                if (Math.abs(position - atomStart) > Math.abs(position - a.getAtomStart()))
                    atomStart = (atom = a).getAtomStart();
            }
            return atom;
        }

    That code only takes into account the closest free-space atom and completely disregards the fact that it should also be greater than or equal to a certain size; I can't quite figure out how to check for both closeness and size efficiently.


  • Is there an easy way to stream an m3u on the iPhone?

    - by marty
    I can have a UIWebView with the .m3u file opened, which shows the web view with a play button displayed, and that automatically goes to the QuickTime player and starts playing the stream. But when I press the Done button, it goes back to the UIWebView with a little play button in the middle, and from there you can go back to the previous screen (it was selected from a table view). I just want it to automatically load the QuickTime player in the view. How can I do that?
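    One way to skip the UIWebView hop entirely is to hand the stream URL straight to MPMoviePlayerController (a sketch; whether it plays depends on the formats inside the m3u, and the URL here is a placeholder):

        #import <MediaPlayer/MediaPlayer.h>

        NSURL *streamURL = [NSURL URLWithString:@"http://example.com/stream.m3u"];
        MPMoviePlayerController *player =
            [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
        [player play];   // the player UI is presented directly, no web view step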


  • Is it possible to programmatically edit a sound file based on frequency?

    - by K-RAN
    Just wondering if it's possible to go through a flac, mp3, wav, etc. file and edit portions, or the entire file, by removing sections based on a specific frequency range? So, for example, I have a recording of a friend reciting a poem with a few percussion instruments in the background. Could I write a C program that goes through the entire file and cuts out everything except the vocals (the fundamental frequency of the human voice ranges from about 85-255 Hz, from what I've been reading)? Thanks in advance for any ideas!

