Search Results

Search found 4165 results on 167 pages for 'pulse audio'.


  • How to change default audio input device programmatically

    - by f34r
    I am looking for a way to set or change the default audio input device from inside my application. I have several different recording devices, and it is very annoying to open the Control Panel every time just to change the default recording device. I have looked around and have not found anything that addresses the problem. The application is written in C# and targets Windows Vista / Windows 7.
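
    For context, here is a minimal sketch of enumerating the active recording (capture) endpoints with NAudio's CoreAudioApi wrapper; NAudio is an assumption here, since the question does not name a library. Listing devices is the easy part: actually changing the system default on Vista/7 has no documented API and is usually done through the undocumented IPolicyConfig COM interface, so that step is outside this sketch.

        using System;
        using NAudio.CoreAudioApi;

        class ListCaptureDevices
        {
            static void Main()
            {
                // Enumerate the active recording (capture) endpoints the user could pick as default.
                var enumerator = new MMDeviceEnumerator();
                foreach (var device in enumerator.EnumerateAudioEndPoints(DataFlow.Capture, DeviceState.Active))
                {
                    Console.WriteLine("{0}: {1}", device.ID, device.FriendlyName);
                }
            }
        }

    With the device IDs in hand, switching the default still means either sending the user to the Sound control panel or calling that undocumented interface.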

    Read the article

  • Synchronizing Java Visualizer Audio and Visual

    - by Matt
    I've run into a problem creating a visualizer for .mp3 files in Java. My goal is a visualization that runs in time with the .mp3 being played. At the moment I can either visualize an .mp3 or play it, but not both at once. I am using libraries that may make this trickier than necessary. Currently I:

    1. Read in the .mp3 as a FileInputStream.
    2. Either (a) convert the FileInputStream into a Bitstream and run the visualizer, or (b) pass the FileInputStream to a library play method, which converts it into a Bitstream, decodes it, and plays it.

    I am using the JLayer library to decode and play the .mp3. My question is: how do I synchronize the two actions so that they run at the same time and line up, so my visualization corresponds to the changing frequencies? This implies they should finish at the same time as well.

    Read the article

  • AxWindowsMediaPlayer does not play audio/video from a URL?

    - by Madhup
    Hi, I am using AxWindowsMediaPlayer to play files from a URL, but each time I pass it a URL it shows the message, "Either the file is corrupted or the player does not support the file format you are playing." When I open the same URL in a browser, the file downloads, and the downloaded file plays fine in the media player. I cannot work out what the problem is: the same code plays the local file and the downloaded files, but not the file streamed from the URL, and the same code worked a few months ago for these URLs. Is this my fault, or could a server-side issue cause this? Please help me, I am in big trouble. Regards, Madhup
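
    Since the same code plays local and downloaded copies, one way to narrow it down (a sketch in plain .NET, with a placeholder URL) is to look at what the server actually returns for the failing URL. The Windows Media Player control can be sensitive to the Content-Type and redirects it receives when streaming, so comparing these headers against a URL that does work may point at a server-side cause.

        using System;
        using System.Net;

        class UrlProbe
        {
            static void Main()
            {
                // Placeholder URL: substitute the stream URL that fails inside the player.
                var request = (HttpWebRequest)WebRequest.Create("http://example.com/clip.wma");
                request.Method = "HEAD";  // fall back to GET if the server rejects HEAD
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    // A wrong Content-Type or an unexpected redirect can break in-player streaming
                    // even though a browser download of the same URL works fine.
                    Console.WriteLine("Status: {0}", response.StatusCode);
                    Console.WriteLine("Content-Type: {0}", response.ContentType);
                    Console.WriteLine("Final URL: {0}", response.ResponseUri);
                }
            }
        }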

    Read the article

  • How to release audio properly? (AVAudioPlayer)

    - by Aluminum
    Hello everyone! I need help with my iOS application. I want to know if I'm releasing the AVAudioPlayer correctly.

    MyViewController.h:

        #import <UIKit/UIKit.h>

        @interface MyViewController : UIViewController {
            NSString *Path;
        }
        - (IBAction)Playsound;
        @end

    MyViewController.m:

        #import <AVFoundation/AVAudioPlayer.h>
        #import "MyViewController.h"

        @implementation MyViewController

        AVAudioPlayer *Media;

        - (IBAction)Playsound {
            Path = [[NSBundle mainBundle] pathForResource:@"Sound" ofType:@"wav"];
            Media = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:Path] error:NULL];
            [Media play];
        }

        - (void)dealloc {
            [Media release];
            [super viewDidUnload];
        }

        @end

    Read the article

  • Non-intrusive notification without audio?

    - by acidzombie24
    I have a C# app that registers a custom protocol. When you click BLAH://djfhgjfdghjkd in a browser, it launches my app. You can click multiple links, and each link adds a note to the app. How can I let the user know that the click actually went through? Right now I show a console window for about one second (it pops up and goes away as fast as possible), which felt better than a hidden console, since otherwise you are unsure whether it worked. But that second adds up when you are rapidly clicking many notes/links, and the console gets in the way. What can I do that is noticeable? I'm thinking of a semi-transparent box that appears but lets clicks pass through it. Maybe there is a better way? I also wouldn't know where to start with transparent windows or click-through windows.
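
    A minimal WinForms sketch of the click-through idea, assuming .NET/WinForms and a made-up class name (NoteFlashForm): a borderless, semi-transparent, topmost window whose extended styles (WS_EX_LAYERED, WS_EX_TRANSPARENT, WS_EX_NOACTIVATE) let mouse clicks fall through to whatever is underneath, closed by a timer after a short flash.

        using System;
        using System.Drawing;
        using System.Windows.Forms;

        // Hypothetical notification window: semi-transparent, never takes focus,
        // and lets clicks pass through to the window below it.
        public class NoteFlashForm : Form
        {
            const int WS_EX_TRANSPARENT = 0x00000020;  // mouse input falls through
            const int WS_EX_LAYERED     = 0x00080000;
            const int WS_EX_NOACTIVATE  = 0x08000000;  // never steals focus

            public NoteFlashForm(string message)
            {
                FormBorderStyle = FormBorderStyle.None;
                StartPosition = FormStartPosition.Manual;
                Bounds = new Rectangle(50, 50, 250, 60);
                TopMost = true;
                ShowInTaskbar = false;
                Opacity = 0.6;
                BackColor = Color.Black;

                var label = new Label
                {
                    Text = message,
                    ForeColor = Color.White,
                    Dock = DockStyle.Fill,
                    TextAlign = ContentAlignment.MiddleCenter
                };
                Controls.Add(label);

                // Close itself after a short flash so rapid clicks don't pile up windows.
                var timer = new Timer { Interval = 400 };
                timer.Tick += (s, e) => Close();
                timer.Start();
            }

            protected override CreateParams CreateParams
            {
                get
                {
                    var cp = base.CreateParams;
                    cp.ExStyle |= WS_EX_LAYERED | WS_EX_TRANSPARENT | WS_EX_NOACTIVATE;
                    return cp;
                }
            }
        }

    Calling new NoteFlashForm("Note added").Show() from the protocol handler should give visible feedback without stealing focus or blocking the next click.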

    Read the article

  • Absolute audio synchronization

    - by user1780526
    I would like to synchronize my computer with an external camcorder recording so that I know exactly (to the millisecond) when certain recorded events happen relative to other sensors logged by the computer. One idea is to play back short sound pulses or chirps from the computer every second and let the camcorder's microphone pick them up. But a simple cron job playing a sound clip is not precise enough. I was thinking of using something like GStreamer, but how do I get it to play a clip at precisely a given time according to the system clock?

    Read the article

  • aplay -l says no soundcards found; alsaconf says no supported cards; yet /proc/asound contains cards

    - by nimasmi
    I am trying to get HDMI output using a Gainward Nvidia 210 512 MB on Ubuntu 10.04 Lucid Lynx. I have upgraded alsa-driver, alsa-lib and alsa-utils to 1.0.24 by building from source, thanks to this blog post. Some relevant output:

        user@box:~$ lspci | grep Audio
        00:05.0 Audio device: nVidia Corporation MCP61 High Definition Audio (rev a2)
        01:09.0 Multimedia video controller: Conexant Systems, Inc. CX23880/1/2/3 PCI Video and Audio Decoder (rev 05)
        01:09.2 Multimedia controller: Conexant Systems, Inc. CX23880/1/2/3 PCI Video and Audio Decoder [MPEG Port] (rev 05)
        01:09.4 Multimedia controller: Conexant Systems, Inc. CX23880/1/2/3 PCI Video and Audio Decoder [IR Port] (rev 05)
        02:00.1 Audio device: nVidia Corporation High Definition Audio Controller (rev a1)

        user@box:~$ cat /proc/asound/version
        Advanced Linux Sound Architecture Driver Version 1.0.24.
        Compiled on Sep 15 2012 for kernel 2.6.32-42-generic (SMP).

        user@box:~$ ls /proc/asound
        card0  cards  hwdep  NVidia  oss  seq  version
        card1  devices  modules  NVidia_1  pcm  timers

        user@box:~$ aplay -l
        aplay: device_list:240: no soundcards found...

        user@box:~$ sudo /sbin/alsa-utils start
         * Setting up ALSA...
         * warning: 'alsactl restore' failed with error message 'alsactl: set_control:1403: Cannot write control '2:0:0:IEC958 Playback Default:0' : Operation not permitted'...
        amixer: Invalid command!
        ...done.

    Any help appreciated. PS: my video card is connected only through the PCI-E slot. I assume there is no extra audio connection required.

    Read the article

  • NAudio demos not working anymore

    - by Kurru
    I just tried to run the NAudio demos and I'm getting a weird error:

        System.BadImageFormatException: Could not load file or assembly 'NAudio, Version=1.3.8.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. An attempt was made to load a program with an incorrect format.
        File name: 'NAudio, Version=1.3.8.0, Culture=neutral, PublicKeyToken=null'
           at NAudioWpfDemo.AudioGraph..ctor()
           at NAudioWpfDemo.ControlPanelViewModel..ctor(IWaveFormRenderer waveFormRenderer, SpectrumAnalyser analyzer) in C:\Users\Admin\Downloads\NAudio-1.3\NAudio-1-3\Source Code\NAudioWpfDemo\ControlPanelViewModel.cs:line 23
           at NAudioWpfDemo.MainWindow..ctor() in C:\Users\Admin\Downloads\NAudio-1.3\NAudio-1-3\Source Code\NAudioWpfDemo\MainWindow.xaml.cs:line 15

        WRN: Assembly binding logging is turned OFF.
        To enable assembly bind failure logging, set the registry value [HKLM\Software\Microsoft\Fusion!EnableLog] (DWORD) to 1.
        Note: There is some performance penalty associated with assembly bind failure logging.
        To turn this feature off, remove the registry value [HKLM\Software\Microsoft\Fusion!EnableLog].

    Since I last used the NAudio demos I have moved from 32-bit Windows XP to 64-bit Windows 7. Would that cause this issue? It's very annoying, as I was about to try my hand at audio in C# again.
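
    For what it's worth, a BadImageFormatException like this is typically a 32/64-bit mismatch: a 64-bit process trying to load an assembly (or a native dependency underneath it) built for x86 only. A quick, hedged diagnostic is to check how the demo process is actually running and, if it reports 64-bit, rebuild the demo with Platform Target set to x86 in the project settings:

        using System;

        static class BitnessCheck
        {
            static void Main()
            {
                // A 64-bit process cannot load x86-only images; if this prints "64-bit process",
                // forcing the demo project to build as x86 is the usual fix.
                Console.WriteLine(IntPtr.Size == 8 ? "64-bit process" : "32-bit process");
            }
        }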

    Read the article

  • Silverlight MediaElement Position Property Weirdness

    - by BarrettJ
    I have a MediaElement that reports its position incorrectly, but consistently so. When it gets to the last second of the audio (and it is always the last second, regardless of whether the sound is two seconds long or ten), it stops updating its position until playback finishes. Example output:

        Playback Progress: 0/3.99 - 0
        Playback Progress: 0.01/3.99 - 0
        Playback Progress: 0.03/3.99 - 0
        Playback Progress: 0.06/3.99 - 1
        Playback Progress: 0.07/3.99 - 1
        Playback Progress: 0.08/3.99 - 2
        Playback Progress: 0.11/3.99 - 2
        Playback Progress: 0.14/3.99 - 3
        Playback Progress: 0.19/3.99 - 4
        Playback Progress: 0.23/3.99 - 5
        Playback Progress: 0.25/3.99 - 6
        Playback Progress: 0.28/3.99 - 7
        Playback Progress: 0.3/3.99 - 7
        [SNIP]
        Playback Progress: 2.8/3.99 - 70
        Playback Progress: 2.83/3.99 - 70
        Playback Progress: 2.88/3.99 - 72
        Playback Progress: 2.9/3.99 - 72
        Playback Progress: 2.91/3.99 - 72
        Playback Progress: 2.92/3.99 - 73
        Playback Progress: 2.99/3.99 - 74
        Playback Progress: 3/3.99 - 75
        Playback Progress: 3/3.99 - 75
        Playback Progress: 3/3.99 - 75
        Playback Progress: 3/3.99 - 75
        Playback Progress: 3/3.99 - 75
        Playback Progress: 3/3.99 - 75
        Playback Progress: 3/3.99 - 75
        Playback Progress: 3/3.99 - 75
        Playback Progress: 3/3.99 - 75
        Playback Progress: 3.99/3.99 - 100

    That is the output of:

        WriteLine("Playback Progress: " + Position + "/" + LengthInSeconds + " - " + (int)((Position / LengthInSeconds) * 100));

        public double Position
        {
            get { return my_media_element != null ? my_media_element.Position.TotalSeconds : 0; }
        }

        public double LengthInSeconds
        {
            get { return my_media_element != null ? my_media_element.NaturalDuration.TimeSpan.TotalSeconds : 0; }
        }

    Does anyone have any ideas why this is occurring?
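
    If the stall near the end turns out to be inherent to how often MediaElement refreshes Position, one hedged workaround (a sketch only, with hypothetical names such as SmoothedPosition and OnTimerTick, and assuming the Position/LengthInSeconds properties shown above plus a DispatcherTimer driving the tick while playback is running) is to interpolate between reported positions using wall-clock time, clamped to the natural duration:

        // Sketch: smooth over coarse MediaElement.Position updates by advancing the
        // last reported position with wall-clock time until a new value arrives.
        private double lastReportedPosition;
        private DateTime lastReportTime = DateTime.UtcNow;

        private void OnTimerTick(object sender, EventArgs e)   // e.g. a DispatcherTimer at ~50 ms
        {
            double reported = my_media_element.Position.TotalSeconds;
            if (reported != lastReportedPosition)
            {
                lastReportedPosition = reported;
                lastReportTime = DateTime.UtcNow;
            }
        }

        public double SmoothedPosition
        {
            get
            {
                // Only meaningful while playing; clamp so the estimate never exceeds the duration.
                double estimated = lastReportedPosition + (DateTime.UtcNow - lastReportTime).TotalSeconds;
                return Math.Min(estimated, LengthInSeconds);
            }
        }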

    Read the article

  • How to get a volume measurement of an iPhone recording in dB, with a limit of at least 120 dB

    - by Cyber
    Hello, I am trying to make a simple volume meter for the iPhone, with the volume displayed in dB. Following this tutorial, I only get measurements up to 78 dB. I've read that this is because the dBFS range of a 16-bit recording is only 96 dB. I tried modifying this piece of code in the init function:

        dataFormat.mSampleRate = 44100.0f;
        dataFormat.mFormatID = kAudioFormatLinearPCM;
        dataFormat.mFramesPerPacket = 1;
        dataFormat.mChannelsPerFrame = 1;
        dataFormat.mBytesPerFrame = 2;
        dataFormat.mBytesPerPacket = 2;
        dataFormat.mBitsPerChannel = 16;
        dataFormat.mReserved = 0;

    I changed mBitsPerChannel, hoping to increase the bit depth of the recording:

        dataFormat.mBitsPerChannel = 32;

    With that set to 32, the mAveragePower function only ever returns 0. So how can I measure a larger decibel range? All my code is practically the same as in the tutorial linked above. Thanks in advance, Thomas

    Read the article

  • Can't play wav file from JavaScript in Firefox for Mac

    - by Mike Royle
    I have the following HTML file that plays a wav file when the user hovers over the 'Play' anchor tag. It works perfectly in IE, Chrome, Firefox, Opera and Safari on both Windows and Mac, except for Firefox on the Mac, which does not play the file.

        <html>
        <head>
          <title></title>
          <script>
            function PlayAudio() {
              var s = document.getElementById("soundFile");
              s.Play();
            }
          </script>
        </head>
        <body>
          <embed src="MySound.wav" enablejavascript="true" type="audio/wav" autostart="false" width="0" height="0" id="soundFile" />
          <a href="#" onmouseover="PlayAudio()">Play</a>
        </body>
        </html>

    If the autostart attribute of the embed tag is set to true, the wav file plays as expected in Firefox for Mac, but not on mouseover of the anchor tag. Any ideas?

    Read the article

  • Debug NAudio MP3 reading difference?

    - by Conrad Albrecht
    My code using NAudio to read one particular MP3 gets different results from several commercial apps. Specifically, my NAudio-based code finds ~1.4 sec of silence at the beginning of this MP3 before "audible audio" (a drum pickup) starts, whereas other apps (Windows Media Player, RealPlayer, WavePad) show ~2.5 sec of silence before that same drum pickup. The particular MP3 is "Like A Rolling Stone" downloaded from Amazon.com. I tested several other MP3s and none show a similar difference between my code and the other apps. Most MP3s don't start with such a long silence, so I suspect that's the source of the difference. The debugging problems:

    1. I can't find a way to even prove that the other apps are right and NAudio/me is wrong, i.e. to compare my code's results block by block against a "known good reference implementation"; therefore I can't even precisely define the "error" I need to debug.
    2. Since my code reads thousands of samples during those 1.4 sec with no obvious errors, I can't think how to narrow down where/when in the input stream to look for a bug.
    3. The heart of the NAudio code is a P/Invoke call to acmStreamConvert(), a Windows "black box" call that I can't see how to error-check.

    Can anyone think of any tricks or techniques to debug this?
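
    One way to at least quantify the discrepancy (a sketch, assuming NAudio's Mp3FileReader and a made-up "audible" threshold) is to decode the file and measure how much leading audio stays below the threshold, then run the same scan over a WAV exported from one of the other players and compare the two numbers:

        using System;
        using NAudio.Wave;

        class LeadingSilence
        {
            static void Main(string[] args)
            {
                // Decode the MP3 and report the time of the first sample above a small threshold.
                using (var reader = new Mp3FileReader(args[0]))
                {
                    int bytesPerSample = 2;                        // 16-bit PCM out of the decoder
                    int channels = reader.WaveFormat.Channels;
                    long samplePosition = 0;
                    byte[] buffer = new byte[reader.WaveFormat.AverageBytesPerSecond];
                    int read;
                    while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        for (int i = 0; i + 1 < read; i += bytesPerSample)
                        {
                            short sample = BitConverter.ToInt16(buffer, i);
                            if (Math.Abs((int)sample) > 512)       // hypothetical "audible" threshold
                            {
                                double seconds = (double)samplePosition /
                                                 (reader.WaveFormat.SampleRate * channels);
                                Console.WriteLine("First audible sample at ~{0:F3} s", seconds);
                                return;
                            }
                            samplePosition++;
                        }
                    }
                    Console.WriteLine("No sample exceeded the threshold.");
                }
            }
        }

    Running the same scan over a WAV saved from one of the other players (via WaveFileReader) would show whether the decoders disagree about the samples themselves or only about where "audible" starts.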

    Read the article

  • How can I implement a volume meter for a song currently playing? (iPhone OS 3.1.3)

    - by Adam
    Hi, I'm very new to Core Audio and I'd like some help coding up a little volume meter, like a dB meter, for whatever is being output through the headphones or built-in speaker. I have the following code, and I have been trying to work through Apple's "SpeakHere" sample project, but it's a nightmare to get through without knowing how it works first. Could anyone shed some light? Here's the code I have so far:

        - (void)displayWaveForm {
            while (musicIsPlaying == YES) {
                NSLog(@"%f", sizeof(AudioQueueLevelMeterState));
            }
        }

        - (IBAction)playMusic {
            if (musicIsPlaying == NO) {
                NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/track7.wav", [[NSBundle mainBundle] resourcePath]]];
                NSError *error;
                music = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
                music.numberOfLoops = -1;
                music.volume = 0.5;
                [music play];
                musicIsPlaying = YES;
                [self displayWaveForm];
            } else {
                [music pause];
                musicIsPlaying = NO;
            }
        }

    Read the article

  • How would I downsample a .wav file and then reconstruct it using Nyquist? (in MATLAB)

    - by Andrew
    This is all done in MATLAB 2010. My objective is to show the results of undersampling, sampling at the Nyquist rate, and oversampling. First I need to downsample the .wav file to get an incomplete or partial data stream that I can then reconstruct. Here is the flow of what I am going to be doing:

        analog signal -> sampling analog filter -> ADC -> resample down -> resample up -> DAC -> reconstruction analog filter

    What needs to be achieved: f is the frequency in Hz (cycles per second), and the sampling period is Ts = 1/(2f). Example problem: for a highest frequency of 1000 Hz, Ts = 1/(2*1000) = 1/2000 s = 5e-4 s per sample, i.e. a sample every 0.5 ms. This is my first signal processing project using MATLAB. What I have so far:

        % Fs = sampling frequency (44100 Hz, the sampling frequency of a CD)
        [test, fs] = wavread('test.wav');   % load the .wav file
        left = test(:,1);

        % Plot of the .wav signal, time vs. strength
        time = (1/44100) * length(left);
        t = linspace(0, time, length(left));
        plot(t, left)
        xlabel('time (sec)');
        ylabel('relative signal strength')

        % This is where I would need to sample it at the different frequencies
        % (below, at, and above the Nyquist frequency), I think.

        soundsc(left, fs)   % plays the resulting audio, which is the same as the original
                            % (only at or above the Nyquist frequency, however)

    Can anyone tell me how to make this better, and how to do the sampling at various frequencies?

    Read the article

  • OpenAL not playing on Mac OS X 10.6

    - by Grimless
    I've been working on getting a basic audio engine running on my Mac using OpenAL. It seems relatively straightforward after working with OpenGL for a while. However, even though I believe I have everything in place, my sound will not play. Here is the order of things I am doing:

        // Create a new device
        ALCdevice* device = alcOpenDevice(NULL);
        // Create a new context with the device
        ALCcontext* context = alcCreateContext(device, NULL);
        // Make that context current
        alcMakeContextCurrent(context);

        // Do lots of loading stuff to bring in an AIFF...
        voodooAIFF = myAIFFLoader("name");

        // Then use that data
        ALuint buf;
        alGenBuffers(1, &buf);
        // Check for errors, but none happen...

        // Bind buffer data
        alBufferData(buf, voodooAIFF.format, voodooAIFF.data, voodooAIFF.sizeInBytes, voodooAIFF.frequency);
        // Check for errors, none here either...

        // Create source
        ALuint src;
        alGenSources(1, &src);
        // Error check again, no errors.

        // Bind source to buffer
        alSourcei(src, AL_BUFFER, buf);
        // Set reference distance
        alSourcei(sourceID, AL_REFERENCE_DISTANCE, 1);
        // Set source attributes including gain and pitch to 1 (direction set to 0,0,0)
        // Check for errors, nothing...

        // Set up listener attributes.
        // Check for errors, no errors.

        // Begin playing.
        alSourcePlay(src);

    Observe silence... Any insight into which steps I am missing here?

    Read the article

  • Difficulty porting raw PCM output code from Java to Android AudioTrack API.

    - by IndigoParadox
    I'm attempting to port an application that plays chiptune (NSF, SPC, etc.) music files from Java SE to Android. The Android API seems to lack the javax.sound.sampled classes this application uses to output raw PCM audio. The closest analog I've found in the API is AudioTrack, so I've been wrestling with that. However, when I run one of my sample music files through my port-in-progress, all I get back is static. My suspicion is that the AudioTrack I've set up is at fault; I've tried various constructors, but it all comes out as static in the end. The DataLine setup in the original code is something like:

        AudioFormat audioFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED, 44100, 16, 2, 4, 44100, true);
        DataLine.Info lineInfo = new DataLine.Info(SourceDataLine.class, audioFormat);
        DataLine line = (SourceDataLine) AudioSystem.getLine(lineInfo);

    The constructor I'm using right now is:

        AudioTrack track = new AudioTrack(
                AudioManager.STREAM_MUSIC,
                44100,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT,
                AudioTrack.getMinBufferSize(44100,
                        AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                        AudioFormat.ENCODING_PCM_16BIT),
                AudioTrack.MODE_STREAM);

    I've replaced constants and variables so the snippets read as concisely as possible, but my basic question is whether there are any obvious problems in the assumptions I made when going from one format to the other.

    Read the article

  • Is there a best practice for concatenating MP3 Files, adjusting sample rates to match, while preserving original files?

    - by Scott
    Hello overflow community! Does anyone know of a "best practice" for concatenating mp3 files into new files while preserving the originals? I am working on a CentOS Linux machine, on the command line, and will eventually call the command line from a PHP script. I have been doing research and have come up with a process that I think could work; it combines general advice from different forums, blogs, and sources like this one. So here I go:

    1. Create a temporary folder.
    2. Loop through the files, creating a converted copy of each in a "raw" format (which one, I don't know; I didn't know "raw" files existed until recently, so I could use suggestions here).
    3. Store the paths of the temporary files in the temporary folder, loop through them to concatenate them, and put the new merged file into the final "processed" directory.
    4. Delete the contents of the temporary folder holding the temporary raw files.
    5. Convert the final file from "raw" back to mp3 and enjoy the finished result.

    I'm thinking this course of action might be best because I can't necessarily control the quality of the original "source" mp3s. The only other option I can think of is a script that performs a similar process as files are added to the system, keeping only files in the "proper" format and removing the original "erroneous" files. Hopefully you can see that I have put some thought into this and that I'm trying to leverage the collective knowledge of this community to choose the best direction. Perhaps there is a better path I could take? By concatenate, I mean joining the files together in sequence to create a new audio file from the "concatenated" files.

    Read the article

  • To display album art from the MediaStore in Android

    - by user1834724
    I'm not able to display album art from the MediaStore while listing albums; I'm getting the error below, and my activity code follows the log. Can anyone kindly help with this issue? Thanks in advance.

        Bad request for field slot 0,-1. numRows = 32, numColumns = 7
        01-02 02:48:16.789: D/AndroidRuntime(4963): Shutting down VM
        01-02 02:48:16.789: W/dalvikvm(4963): threadid=1: thread exiting with uncaught exception (group=0x4001e578)
        01-02 02:48:16.804: E/AndroidRuntime(4963): FATAL EXCEPTION: main
        01-02 02:48:16.804: E/AndroidRuntime(4963): java.lang.IllegalStateException: get field slot from row 0 col -1 failed

        public class AlbumbsListActivity extends Activity {

            private ListAdapter albumListAdapter;
            private HashMap<Integer, Integer> albumInfo;
            private HashMap<Integer, Integer> albumListInfo;
            private HashMap<Integer, String> albumListTitleInfo;
            private String audioMediaId;
            private static final String TAG = "AlbumsListActivity";
            Boolean showAlbumList = false;
            Boolean AlbumListTitle = false;
            ImageView album_art;

            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                setContentView(R.layout.albums_list_layout);
                Cursor cursor;
                ContentResolver cr = getApplicationContext().getContentResolver();
                if (getIntent().hasExtra(Util.ALBUM_ID)) {
                    int albumId = getIntent().getIntExtra(Util.ALBUM_ID, Util.MINUS_ONE);
                    String[] projection = new String[] { Albums._ID, Albums.ALBUM, Albums.ARTIST,
                            Albums.ALBUM_ART, Albums.NUMBER_OF_SONGS };
                    String selection = null;
                    String[] selectionArgs = null;
                    String sortOrder = Media.ALBUM + " ASC";
                    cursor = cr.query(Albums.EXTERNAL_CONTENT_URI, projection, selection,
                            selectionArgs, sortOrder);
                    /* final String[] ccols = new String[] { //MediaStore.Audio.Albums.,
                            MediaStore.Audio.Albums._ID, MediaStore.Audio.Albums.ALBUM,
                            MediaStore.Audio.Albums.ARTIST, MediaStore.Audio.Albums.ALBUM_ART,
                            MediaStore.Audio.Albums.NUMBER_OF_SONGS };
                    cursor = cr.query(MediaStore.Audio.Albums.getContentUri("external"), ccols,
                            null, null, MediaStore.Audio.Albums.DEFAULT_SORT_ORDER); */
                    showAlbumList = true;
                } else {
                    String order = MediaStore.Audio.Albums.ALBUM + " ASC";
                    String where = MediaStore.Audio.Albums.ALBUM;
                    cursor = managedQuery(Media.EXTERNAL_CONTENT_URI, DbUtil.projection, null, null, order);
                    showAlbumList = false;
                }
                albumInfo = new HashMap<Integer, Integer>();
                albumListInfo = new HashMap<Integer, Integer>();
                ListView listView = (ListView) findViewById(R.id.mylist_album);
                listView.setFastScrollEnabled(true);
                listView.setOnItemLongClickListener(new ItemLongClickListener());
                listView.setAdapter(new AlbumCursorAdapter(this, cursor, DbUtil.displayFields,
                        DbUtil.displayViews, showAlbumList));
                final Uri uri = MediaStore.Audio.Albums.EXTERNAL_CONTENT_URI;
                final Cursor albumListCursor = cr.query(uri, DbUtil.Albumprojection, null, null, null);
            }

            private class AlbumCursorAdapter extends SimpleCursorAdapter implements SectionIndexer {

                private final Context context;
                private final Cursor cursorValues;
                private Time musicTime;
                private Boolean isAlbumList;
                private MusicAlphabetIndexer mIndexer;
                private int mTitleIdx;

                public AlbumCursorAdapter(Context context, Cursor cursor, String[] from, int[] to,
                        Boolean isAlbumList) {
                    super(context, 0, cursor, from, to);
                    this.context = context;
                    this.cursorValues = cursor;
                    //musicTime = new Time();
                    this.isAlbumList = isAlbumList;
                }

                String albumName = "";
                String artistName = "";
                String numberofsongs = "";
                long albumid;

                @Override
                public View getView(int position, View convertView, ViewGroup parent) {
                    View rowView = convertView;
                    if (rowView == null) {
                        LayoutInflater inflater = (LayoutInflater) context
                                .getSystemService(Context.LAYOUT_INFLATER_SERVICE);
                        rowView = inflater.inflate(R.layout.row_album_layout, parent, false);
                    }
                    this.cursorValues.moveToPosition(position);
                    String title = "";
                    String artistName = "";
                    String albumName = "";
                    int count;
                    long albumid = 0;
                    String songDuration = "";
                    if (isAlbumList) {
                        albumInfo.put(position, Integer.parseInt(this.cursorValues.getString(
                                this.cursorValues.getColumnIndex(MediaStore.Audio.Albums._ID))));
                        artistName = this.cursorValues.getString(
                                this.cursorValues.getColumnIndex(MediaStore.Audio.Albums.ARTIST));
                        albumName = this.cursorValues.getString(
                                this.cursorValues.getColumnIndex(MediaStore.Audio.Albums.ALBUM));
                        albumid = Integer.parseInt(this.cursorValues.getString(
                                this.cursorValues.getColumnIndex(MediaStore.Audio.Albums.ALBUM_ID)));
                    } else {
                        albumInfo.put(position, Integer.parseInt(this.cursorValues.getString(
                                this.cursorValues.getColumnIndex(MediaStore.Audio.Media._ID))));
                        artistName = this.cursorValues.getString(
                                this.cursorValues.getColumnIndex(MediaStore.Audio.Media.ARTIST));
                        albumName = this.cursorValues.getString(
                                this.cursorValues.getColumnIndex(MediaStore.Audio.Media.ALBUM));
                        albumid = Integer.parseInt(this.cursorValues.getString(
                                this.cursorValues.getColumnIndex(MediaStore.Audio.Media.ALBUM_ID)));
                    }

                    //code for Alphabetical Indexer
                    mTitleIdx = cursorValues.getColumnIndex(MediaStore.Audio.Media.ALBUM);
                    mIndexer = new MusicAlphabetIndexer(cursorValues, mTitleIdx,
                            getResources().getString(R.string.fast_scroll_alphabet));
                    //end

                    TextView metaone = (TextView) rowView.findViewById(R.id.album_name);
                    TextView metatwo = (TextView) rowView.findViewById(R.id.artist_name);
                    ImageView metafour = (ImageView) rowView.findViewById(R.id.album_art);
                    TextView metathree = (TextView) rowView.findViewById(R.id.songs_count);
                    metaone.setText(albumName);
                    metatwo.setText(artistName);
                    (metafour)getAlbumArt(albumid);
                    System.out.println("albumid----------" + albumid);
                    metaThree.setText(DbUtil.makeTimeString(context, secs));
                    getAlbumArt(albumid);
                    }

                    TextView metaone = (TextView) rowView.findViewById(R.id.album_name);
                    TextView metatwo = (TextView) rowView.findViewById(R.id.artist_name);
                    album_art = (ImageView) rowView.findViewById(R.id.album_art);
                    //TextView metathree = (TextView) rowView.findViewById(R.id.songs_count);
                    metaone.setText(albumName);
                    metatwo.setText(artistName);
                    return rowView;
                }
            }

            String albumArtUri = "";

            private void getAlbumArt(long albumid) {
                Uri uri = ContentUris.withAppendedId(MediaStore.Audio.Albums.EXTERNAL_CONTENT_URI, albumid);
                System.out.println("hhhhhhhhhhh" + uri);
                Cursor cursor = getContentResolver().query(
                        ContentUris.withAppendedId(MediaStore.Audio.Albums.EXTERNAL_CONTENT_URI, albumid),
                        new String[] { MediaStore.Audio.AlbumColumns.ALBUM_ART }, null, null, null);
                if (cursor.moveToFirst()) {
                    albumArtUri = cursor.getString(0);
                }
                System.out.println("kkkkkkkkkkkkkkkkkkk :" + albumArtUri);
                cursor.close();
                if (albumArtUri != null) {
                    Options opts = new Options();
                    opts.inJustDecodeBounds = true;
                    Bitmap albumCoverBitmap = BitmapFactory.decodeFile(albumArtUri, opts);
                    opts.inJustDecodeBounds = false;
                    albumCoverBitmap = BitmapFactory.decodeFile(albumArtUri, opts);
                    if (albumCoverBitmap != null)
                        album_art.setImageBitmap(albumCoverBitmap);
                } else {
                    // TODO:
                    Options opts = new Options();
                    Bitmap albumCoverBitmap = BitmapFactory.decodeResource(getApplicationContext().getResources(),
                            R.drawable.albumart_mp_unknown_list, opts);
                    if (albumCoverBitmap != null)
                        album_art.setImageBitmap(albumCoverBitmap);
                }
            }
        }

    Read the article
