Search Results

Search found 180 results on 8 pages for 'sdcard'.

Page 3/8 | < Previous Page | 1 2 3 4 5 6 7 8  | Next Page >

  • Construct MediaStore uri for specific folder

    - by Mojo Risin
    Hi all. For example, if I have two directories /sdcard/Music/Music-1 and /sdcard/Music/Music-2, how can I construct a URI to get only the files in the Music-1 directory? I can use MediaStore.Audio.Media.EXTERNAL_CONTENT_URI to get the contents of all external storage, but how do I restrict the query to one specific directory?
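    A common approach here (not part of the original question) is to keep EXTERNAL_CONTENT_URI and filter on the DATA column, which holds the file path; a minimal sketch, assuming the Music-1 path from the question and the standard MediaStore audio columns:

        // Query all external audio, but restrict rows to files whose path starts
        // with the Music-1 directory; the "%" wildcard matches the file name.
        String folder = "/sdcard/Music/Music-1/%";
        Cursor cursor = getContentResolver().query(
                MediaStore.Audio.Media.EXTERNAL_CONTENT_URI,
                new String[] { MediaStore.Audio.Media._ID, MediaStore.Audio.Media.DATA },
                MediaStore.Audio.Media.DATA + " LIKE ?",
                new String[] { folder },
                MediaStore.Audio.Media.TITLE);
        try {
            while (cursor != null && cursor.moveToNext()) {
                long id = cursor.getLong(0);
                // Per-file content URI built from the row id.
                Uri uri = ContentUris.withAppendedId(
                        MediaStore.Audio.Media.EXTERNAL_CONTENT_URI, id);
            }
        } finally {
            if (cursor != null) cursor.close();
        }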

    Read the article

  • How to load jni from sd card on android 2.1?

    - by user263423
    I want to load a third-party JNI library at runtime. I've tried to load it directly from the sdcard, which failed as expected. I've then tried copying the library from the sdcard to /data/data/app/ and calling System.load(/data/data/libjni.so). That works on the HTC Hero, but fails on the HTC Legend with Android 2.1: it fails while executing the native code and writes an uninformative stack trace to the log. Is there any other way to do it?
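    For reference, the copy-then-load approach described above usually looks something like this sketch (file names are placeholders, not taken from the post); the sdcard itself is typically mounted noexec, which is why loading directly from it fails:

        // Copy the library from the sdcard into the app's private storage,
        // then load it from there.
        File dest = new File(getFilesDir(), "libjni.so");
        InputStream in = new FileInputStream("/sdcard/libjni.so");
        OutputStream out = new FileOutputStream(dest);
        byte[] buf = new byte[8192];
        int len;
        while ((len = in.read(buf)) != -1) {
            out.write(buf, 0, len);
        }
        out.close();
        in.close();
        System.load(dest.getAbsolutePath());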

    Read the article

  • how to send image to remote server using webservices in android only save to byte array

    - by satyamurthy
    I want to get an image from the sdcard and store it on a remote server. I am reading the image from the sdcard and converting it to a byte array via a Bitmap, but when I inspect the byte array it shows different values and does not match the byte array produced by the .NET image conversion. Can you please help if you have a solution? It is urgent. This is the code I am using; please suggest a fix:

        FileInputStream fin = new FileInputStream(new File("/sdcard/pictures/1.jpg"));
        BufferedInputStream bis = new BufferedInputStream(fin, 3000);
        byte[] data = new byte[bis.available()];
        bis.read(data, 0, data.length);
        byte[] data1 = new byte[data.length];
        for (int i = 0; i < data.length; i++) {
            System.out.print(data[i]);
            data1[i] = data[i];
        }
        System.out.println("5..................." + data1);
        Bitmap bitmap = BitmapFactory.decodeByteArray(data1, 0, data1.length);
        System.out.println("6..................." + data1.length);
        Log.v("hgfjohfjghjdfhgj", "" + bitmap);
        if (bitmap != null)
            image.setImageBitmap(bitmap);
        else
            Log.e("Bitmap ", " Not Created");
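    One observation, not from the original post: System.out.println on a byte[] prints only the array reference (e.g. [B@4358e3d0), not its contents, and available() is not a reliable way to size the buffer. A rough sketch that reads the file fully and produces something comparable on both ends, assuming android.util.Base64 (API 8+) is available:

        // Read the whole file regardless of what available() reports, then
        // encode it for transmission. Printing the array object only shows a
        // reference, never the byte values themselves.
        File file = new File("/sdcard/pictures/1.jpg");
        byte[] data = new byte[(int) file.length()];
        DataInputStream dis = new DataInputStream(new FileInputStream(file));
        dis.readFully(data);
        dis.close();
        String printable = Arrays.toString(data);                    // actual byte values, for comparison
        String base64 = Base64.encodeToString(data, Base64.DEFAULT); // text-safe form for a web service call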

    Read the article

  • how to send image to remote server using web services in android

    - by Aswan
    I want to get an image from the sdcard and store it on a remote server. I am reading the image from the sdcard and converting it to a byte array via a Bitmap, but when I inspect the byte array it shows different values and does not match the byte array produced by the .NET image conversion. Can you please help if you have a solution? It is urgent. This is the code I am using; please suggest a fix:

        FileInputStream fin = new FileInputStream(new File("/sdcard/pictures/1.jpg"));
        BufferedInputStream bis = new BufferedInputStream(fin, 3000);
        byte[] data = new byte[bis.available()];
        bis.read(data, 0, data.length);
        byte[] data1 = new byte[data.length];
        for (int i = 0; i < data.length; i++) {
            System.out.print(data[i]);
            data1[i] = data[i];
        }
        System.out.println("5..................." + data1);
        Bitmap bitmap = BitmapFactory.decodeByteArray(data1, 0, data1.length);
        System.out.println("6..................." + data1.length);
        Log.v("hgfjohfjghjdfhgj", "" + bitmap);
        if (bitmap != null)
            image.setImageBitmap(bitmap);
        else
            Log.e("Bitmap ", " Not Created");

    Read the article

  • UNIX - mount: only root can do that

    - by Travesty3
    I need to allow a non-root user to mount/unmount a device. I am a total noob when it comes to UNIX, so please dumb it down for me. I've been looking all over teh interwebz to find an answer and it seems everyone is giving the same one, which is to modify /etc/fstab to include that device with the 'user' option (or 'users', tried both). Cool, well I did that and it still says "mount: only root can do that". Here are the contents of my fstab:

        # /etc/fstab: static file system information.
        #
        # Use 'vol_id --uuid' to print the universally unique identifier for a
        # device; this may be used with UUID= as a more robust way to name devices
        # that works even if disks are added and removed. See fstab(5).
        #
        #
        proc /proc proc defaults 0 0
        # / was on /dev/mapper/minicc-root during installation
        UUID=1a69f02a-a049-4411-8c57-ff4ebd8bb933 / ext3 relatime,errors=remount-ro 0 1
        # /boot was on /dev/sda5 during installation
        UUID=038498fe-1267-44c4-8788-e1354d71faf5 /boot ext2 relatime 0 2
        # swap was on /dev/mapper/minicc-swap_1 during installation
        UUID=0bb583aa-84a8-43ef-98c4-c6cb25d20715 none swap sw 0 0
        /dev/scd0 /media/cdrom0 udf,iso9660 user,noauto,exec,utf8 0 0
        /dev/scd0 /media/floppy0 auto rw,user,noauto,exec,utf8 0 0
        /dev/sdb1 /mnt/sdcard auto auto,user,rw,exec 0 0

    My thumb drive partition shows up as /dev/sdb1. I'm pretty sure my fstab is set up OK, but everyone on the other posts seems to fail to mention how they actually call the 'mount' command once this entry is in the fstab file. I think this is where my problem may be. The command I use to mount the drive is:

        $ mount /dev/sdb1 /mnt/sdcard

    /bin/mount is owned by root and is in the root group and has 4755 permissions. /bin/umount is owned by root and is in the root group and has 4755 permissions. /mnt/sdcard is owned by me and is in one of my groups and has 0755 permissions. My mount command works fine if I use sudo, but I need to be able to do this without sudo (need to be able to do it from a PHP script using shell_exec). Any suggestions? Sorry for making you read so much...just trying to get as much info in the initial post as possible to preemptively answer questions about configuration stuff. If I missed anything tho, ask away. Thanks! -Travis

    Read the article

  • ffmpeg add two audio streams to video

    - by Tossin Hausen
    I tried this:

        ffmpeg -i /sdcard/video/transcode/video.avi -map 0:0,0 -i /sdcard/video/transcode/first.mp3 -map 1:0,1 -i /sdcard/video/transcode/second.mp3 -map 2:0,2 -acodec copy -vcodec py /sdcard/video/transcode/Output.avi

    to add two audio streams to one video file. But ffmpeg says the number of mappings should match the number of output streams. What is wrong here? I'm trying to work with an Android build of FFmpeg, "ffmpeg for android beta". "Does not work" means that this uncommunicative Android build of FFmpeg just stops without giving any error message. The -codec copy option does not work with this build. Now I tried the same set of files with the FFmpeg command line tool that comes with Ubuntu 10-something (can't say exactly where it is from). The -codec copy option does not work with this FFmpeg either. Here is the complete output:

        m30x:~/movie/Film$ ffmpeg -i input.avi -i first.mp3 -i second.mp3 -map 0 -map 1 -map 2 -acodec copy -vcodec copy output.avi
        FFmpeg version SVN-r0.5.9-4:0.5.9-0ubuntu0.10.04.1, Copyright (c) 2000-2009 Fabrice Bellard, et al.
          configuration: --extra-version=4:0.5.9-0ubuntu0.10.04.1 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --disable-stripping --disable-vhook --enable-runtime-cpudetect --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static
          libavutil     49.15. 0 / 49.15. 0
          libavcodec    52.20. 1 / 52.20. 1
          libavformat   52.31. 0 / 52.31. 0
          libavdevice   52. 1. 0 / 52. 1. 0
          libavfilter    0. 4. 0 /  0. 4. 0
          libswscale     0. 7. 1 /  0. 7. 1
          libpostproc   51. 2. 0 / 51. 2. 0
          built on Jun 12 2012 16:27:34, gcc: 4.4.3
        [NULL @ 0x93cfd10]looks like this file was encoded with (divx4/(old)xvid/opendivx) -> forcing low_delay flag
        Seems stream 0 codec frame rate differs from container frame rate: 30000.00 (30000/1) -> 25.00 (25/1)
        Input #0, avi, from 'input.avi':
          Duration: 01:30:33.00, start: 0.000000, bitrate: 901 kb/s
            Stream #0.0: Video: mpeg4, yuv420p, 576x432, 25 tbr, 25 tbn, 30k tbc
        Input #1, mp3, from 'first.mp3':
          Duration: 01:30:32.84, start: 0.000000, bitrate: 63 kb/s
            Stream #1.0: Audio: mp3, 22050 Hz, stereo, s16, 64 kb/s
        Input #2, mp3, from 'second.mp3':
          Duration: 01:30:32.84, start: 0.000000, bitrate: 63 kb/s
            Stream #2.0: Audio: mp3, 22050 Hz, stereo, s16, 64 kb/s
        Number of stream maps must match number of output streams

    Merging only one audio stream with the video stream works with both the Ubuntu and the Android versions of FFmpeg. Here is the complete output:

        ffmpeg -i input.avi -i first.mp3 -map 0 -map 1 -acodec copy -vcodec copy output.avi
        FFmpeg version SVN-r0.5.9-4:0.5.9-0ubuntu0.10.04.1, Copyright (c) 2000-2009 Fabrice Bellard, et al.
          configuration: --extra-version=4:0.5.9-0ubuntu0.10.04.1 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --disable-stripping --disable-vhook --enable-runtime-cpudetect --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static
          libavutil     49.15. 0 / 49.15. 0
          libavcodec    52.20. 1 / 52.20. 1
          libavformat   52.31. 0 / 52.31. 0
          libavdevice   52. 1. 0 / 52. 1. 0
          libavfilter    0. 4. 0 /  0. 4. 0
          libswscale     0. 7. 1 /  0. 7. 1
          libpostproc   51. 2. 0 / 51. 2. 0
          built on Jun 12 2012 16:27:34, gcc: 4.4.3
        [NULL @ 0x9bfad10]looks like this file was encoded with (divx4/(old)xvid/opendivx) -> forcing low_delay flag
        Seems stream 0 codec frame rate differs from container frame rate: 30000.00 (30000/1) -> 25.00 (25/1)
        Input #0, avi, from 'input.avi':
          Duration: 01:30:33.00, start: 0.000000, bitrate: 901 kb/s
            Stream #0.0: Video: mpeg4, yuv420p, 576x432, 25 tbr, 25 tbn, 30k tbc
        Input #1, mp3, from 'first.mp3':
          Duration: 01:30:32.84, start: 0.000000, bitrate: 63 kb/s
            Stream #1.0: Audio: mp3, 22050 Hz, stereo, s16, 64 kb/s
        Output #0, avi, to 'output.avi':
            Stream #0.0: Video: mpeg4, yuv420p, 576x432, q=2-31, 90k tbn, 25 tbc
            Stream #0.1: Audio: libmp3lame, 22050 Hz, stereo, s16, 64 kb/s
        Stream mapping:
          Stream #0.0 -> #0.0
          Stream #1.0 -> #0.1
        Press [q] to stop encoding
        frame= 6157 fps=6156 q=-1.0 size= 31667kB time=246.28 bitrate=1053.3kbits/s

    Do you have an idea why it does not work with two audio streams? By the way,

        ffmpeg -i input_with_first_audio_stream.avi -i second.mp3 -acodec copy -vcodec copy output_two_audio_streams.avi -newaudio

    works with both versions of ffmpeg that I use, but the first audio stream is played too fast (x10 or more), while the second audio stream is played correctly. Many thanks in advance, and sorry for my unconventional question and outdated versions of ffmpeg. But I am a lamer and it is not so easy for me to compile from source (especially for the Android version). I will try to compile an up-to-date version of ffmpeg on Ubuntu, but I don't have much free time.

    Read the article

  • Virtual Sd-card won't transfer [migrated]

    - by Hyztname
    I have a Verizon Galaxy Nexus 32GB. I've installed AOKP on it and it has been running pretty well with no bugs, etc. But after I connected it to my Nintendo Wii, I think it mounted the sdcard as FAT32 on its own :\ I'm receiving the following error when trying to download something from the 4Shared sync app (but actually nothing can download or transfer anything to the virtual sdcard): XXXXX.apk: open failed: EACCES (Permission denied). I already restored my old backup, wiped all data, etc., and nothing seems to work. Note that I can still access my sd-card; I just can't transfer files, take pics... basically write data. PS: I mention the 4sync problem because it was the only app that reported anything specific.

    Read the article

  • Video Recording Not Working in ICS

    - by Nirav Ranpara
    I have implement code Record video in Android Phone . This code is working in 2.2 , 2.3 . not in ICS But when I checked in ICS code is not working ? here I posted code and xml file. videorecord.java import java.io.File; import java.io.IOException; import android.app.Activity; import android.app.AlertDialog; import android.content.Context; import android.content.DialogInterface; import android.content.Intent; import android.content.SharedPreferences; import android.hardware.Camera; import android.media.CamcorderProfile; import android.media.MediaRecorder; import android.os.Bundle; import android.os.CountDownTimer; import android.os.Environment; import android.util.Log; import android.view.Display; import android.view.KeyEvent; import android.view.SurfaceHolder; import android.view.SurfaceView; import android.view.View; import android.widget.EditText; import android.widget.FrameLayout; import android.widget.ImageView; import android.widget.LinearLayout; import android.widget.TextView; import android.widget.Toast; public class videorecord extends Activity{ SharedPreferences.Editor pre; String filename; CountDownTimer t; private Camera myCamera; private MyCameraSurfaceView myCameraSurfaceView; private MediaRecorder mediaRecorder; Integer cnt=0; LinearLayout myButton; TextView myButton1; SurfaceHolder surfaceHolder; boolean recording; private TextView txtcount; private ImageView btnplay; @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); recording = false; setContentView(R.layout.videorecord); init(); myCamera = getCameraInstance(); if(myCamera == null){ } myCameraSurfaceView = new MyCameraSurfaceView(this, myCamera); FrameLayout myCameraPreview = (FrameLayout)findViewById(R.id.videoview); Display display = getWindowManager().getDefaultDisplay(); int width = display.getWidth(); int height = display.getHeight(); myCameraSurfaceView.setLayoutParams(new LinearLayout.LayoutParams(width, height-60)); myCameraPreview.addView(myCameraSurfaceView); myButton = (LinearLayout)findViewById(R.id.mybutton); btnplay.setOnClickListener(myButtonOnClickListener); } private void init() { txtcount = (TextView) findViewById(R.id.txtcounter); //myButton1 = (TextView) findViewById(R.id.mybutton1); btnplay = (ImageView)findViewById(R.id.btnplay); t = new CountDownTimer( Long.MAX_VALUE , 1000) { @Override public void onTick(long millisUntilFinished) { cnt++; String time = new Integer(cnt).toString(); long millis = cnt; int seconds = (int) (millis / 60); int minutes = seconds / 60; seconds = seconds % 60; txtcount.setText(String.format("%d:%02d:%02d", minutes, seconds,millis)); } @Override public void onFinish() { } }; } @Override public boolean onKeyDown(int keyCode, KeyEvent event) { if ((keyCode == KeyEvent.KEYCODE_BACK)) { if(recording) { new AlertDialog.Builder(videorecord.this).setTitle("Do you want to save Video ?") .setPositiveButton("OK", new DialogInterface.OnClickListener() { public void onClick(DialogInterface dialog, int which) { filename(); //finish(); } }).setNegativeButton("Cancle", new DialogInterface.OnClickListener() { public void onClick(DialogInterface dialog, int which) { // TODO Auto-generated method stub } }).show(); } else { if ((keyCode == KeyEvent.KEYCODE_BACK)) { //Intent homeIntent= new Intent(Intent.ACTION_MAIN); //homeIntent.addCategory(Intent.CATEGORY_HOME); //homeIntent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP); //startActivity(homeIntent); //this.finishActivity(1); finish(); } //moveTaskToBack(true); // finish(); return 
super.onKeyDown(keyCode, event); } } else { // Toast.makeText(getApplicationContext(), "asd", Toast.LENGTH_LONG).show(); android.os.Process.killProcess(android.os.Process.myPid()) ; } return super.onKeyDown(keyCode, event); } ImageView.OnClickListener myButtonOnClickListener = new ImageView.OnClickListener(){ public void onClick(View v) { if(recording){ Log.e("Record error", "error in recording ."); mediaRecorder.stop(); t.cancel(); filename(); releaseMediaRecorder(); }else{ releaseCamera(); Log.e("Record Stop error", "error in recording ."); // if(!prepareMediaRecorder()){ prepareMediaRecorder(); finish(); } mediaRecorder.start(); recording = true; // myButton1.setText("STOP Recording"); // btnplay.setImageResource(android.R.drawable.ic_media_pause); btnplay.setImageResource(R.drawable.stoprec); t.start(); } }}; private Camera getCameraInstance(){ Camera c = null; try { c = Camera.open(); } catch (Exception e){ } return c; } private void filename() { AlertDialog.Builder alert = new AlertDialog.Builder(this); alert.setTitle("Save Video"); alert.setMessage("Enter File Name"); final EditText input = new EditText(this); alert.setView(input); alert.setPositiveButton("Ok", new DialogInterface.OnClickListener() { public void onClick(DialogInterface dialog, int whichButton) { if(input.getText().length()>=1) { filename = input.getText().toString(); File sdcard = new File(Environment.getExternalStorageDirectory() + "/VideoRecord"); File from = new File(sdcard,"null.mp4"); File to = new File(sdcard,filename+".mp4"); from.renameTo(to); SharedPreferences sp = videorecord.this.getSharedPreferences("data", MODE_WORLD_WRITEABLE); pre = sp.edit(); pre.clear(); pre.commit(); pre.putString("lastvideo", filename+".mp4"); pre.commit(); //btnplay.setImageResource(android.R.drawable.ic_media_play); btnplay.setImageResource(R.drawable.startrec); // Intent intent = new Intent(videorecord.this,StopVidoWatch_Activity.class); // startActivity(intent); Intent myIntent = new Intent(getApplicationContext(), StopVidoWatch_Activity.class).setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP); startActivity(myIntent); } else { filename(); } } }); alert.setNegativeButton("Cancel", new DialogInterface.OnClickListener() { public void onClick(DialogInterface dialog, int whichButton) { // Intent intent = new Intent(videorecord.this,StopVidoWatch_Activity.class); // startActivity(intent); File file = new File(Environment.getExternalStorageDirectory() + "/VideoRecord/null.mp4"); //boolean deleted = file.delete(); file.delete(); finish(); } }); alert.show(); } private boolean prepareMediaRecorder(){ myCamera = getCameraInstance(); mediaRecorder = new MediaRecorder(); myCamera.unlock(); mediaRecorder.setCamera(myCamera); mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER); mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA); mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH)); File folder = new File(Environment.getExternalStorageDirectory() + "/VideoRecord"); boolean success = false; if (!folder.exists()) { success = folder.mkdir(); } if (!success) { } else { } mediaRecorder.setOutputFile("/sdcard/VideoRecord/"+filename+".mp4"); mediaRecorder.setMaxDuration(60000); mediaRecorder.setMaxFileSize(5000000); Display display = getWindowManager().getDefaultDisplay(); int width = display.getHeight(); int height = display.getWidth(); String s = new String(); s= s.valueOf(width); String s1 = new String(); s1= s1.valueOf(height); // Toast.makeText(videorecord.this, "Width : " + s , 
Toast.LENGTH_LONG).show(); // Toast.makeText(videorecord.this, "Height : " + s1 , Toast.LENGTH_LONG).show(); mediaRecorder.setVideoSize(height, width); mediaRecorder.setPreviewDisplay(myCameraSurfaceView.getHolder().getSurface()); try { mediaRecorder.prepare(); } catch (IllegalStateException e) { releaseMediaRecorder(); return false; } catch (IOException e) { releaseMediaRecorder(); return false; } return true; } @Override protected void onPause() { super.onPause(); releaseMediaRecorder(); releaseCamera(); } private void releaseMediaRecorder() { if (mediaRecorder != null) { mediaRecorder.reset(); mediaRecorder.release(); mediaRecorder = null; myCamera.lock(); } } private void releaseCamera(){ if (myCamera != null){ myCamera.release(); myCamera = null; } } public class MyCameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback{ private SurfaceHolder mHolder; private Camera mCamera; public MyCameraSurfaceView(Context context, Camera camera) { super(context); mCamera = camera; mHolder = getHolder(); mHolder.addCallback(this); mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); } public void surfaceChanged(SurfaceHolder holder, int format, int weight, int height) { if (mHolder.getSurface() == null){ return; } try { mCamera.stopPreview(); } catch (Exception e){ } try { mCamera.setPreviewDisplay(mHolder); mCamera.startPreview(); } catch (Exception e){ } } public void surfaceCreated(SurfaceHolder holder) { try { mCamera.setPreviewDisplay(holder); mCamera.startPreview(); } catch (IOException e) { } } public void surfaceDestroyed(SurfaceHolder holder) { } } } videorecord.xml <?xml version="1.0" encoding="utf-8"?> <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android" android:orientation="vertical" android:layout_width="fill_parent" android:layout_height="fill_parent" > <FrameLayout android:layout_width="fill_parent" android:layout_height="fill_parent" > <FrameLayout android:id="@+id/videoview" android:layout_width="fill_parent" android:layout_height="fill_parent"></FrameLayout> <LinearLayout android:id="@+id/mybutton" android:layout_width="fill_parent" android:layout_marginBottom="0dip" android:layout_height="wrap_content" android:orientation="horizontal" android:layout_weight="0" > <!-- <TextView android:text="START Recording" android:id="@+id/mybutton1" android:layout_height="wrap_content" android:layout_width="wrap_content" style="@style/savestyle" android:layout_weight="1" android:gravity="left" > </TextView> --> <ImageView android:layout_height="wrap_content" android:id="@+id/btnplay" android:padding="5dip" android:background="#A0000000" android:textColor="#ffffffff" android:layout_width="wrap_content" android:src="@drawable/startrec" /> </LinearLayout> <TextView android:text="00:00:00" android:id="@+id/txtcounter" android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_gravity="right|bottom" android:padding="5dip" android:background="#A0000000" android:textColor="#ffffffff" /> </FrameLayout> <RelativeLayout android:layout_width="fill_parent" android:layout_height="fill_parent" android:background="@color/bgcolor" > <LinearLayout android:layout_above="@+id/mybutton" android:orientation="horizontal" android:layout_width="fill_parent" android:layout_height="fill_parent" > </LinearLayout> </RelativeLayout> </LinearLayout>
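    As a point of comparison, not taken from the post: the documented MediaRecorder setup order is strict, and ICS rejects video sizes the camera does not actually support, so overriding the CamcorderProfile size with the display dimensions (as the code above does) is a likely failure point. A minimal sketch of the canonical sequence, exception handling omitted:

        // Canonical recording setup: unlock the camera, hand it to the recorder,
        // set sources, then a CamcorderProfile (which already carries a supported
        // width/height), then the output file and preview surface, then prepare().
        Camera camera = Camera.open();
        camera.unlock();
        MediaRecorder recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
        recorder.setOutputFile("/sdcard/VideoRecord/clip.mp4");      // placeholder file name
        recorder.setPreviewDisplay(surfaceHolder.getSurface());      // surfaceHolder: an existing SurfaceHolder
        recorder.prepare();
        recorder.start();
        // ... later, in reverse order:
        recorder.stop();
        recorder.reset();
        recorder.release();
        camera.lock();
        camera.release();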

    Read the article

  • How to create named pipe (mkfifo) in Android?

    - by Ignas Limanauskas
    I am having trouble creating a named pipe in Android, and the example below illustrates my dilemma:

        res = mkfifo("/sdcard/fifo9000", S_IRWXO);
        if (res != 0) {
            LOG("Error while creating a pipe (return:%d, errno:%d)", res, errno);
        }

    The code always prints: Error while creating a pipe (return:-1, errno:1). I can't figure out exactly why this fails. The application has the android.permission.WRITE_EXTERNAL_STORAGE permission. I can create normal files with exactly the same name in the same location, but pipe creation fails. The pipe in question should be accessible from multiple applications. I suspect that no one can create pipes on /sdcard. Where would be the best location to do so? What mode mask should I set (the 2nd parameter)? Does the application need any extra permissions?

    Read the article

  • Image rescale and write rescaled image file in blackberry

    - by Karthick
    I am using the following code to resize and save the file in to the blackberry device. After image scale I try to write image file into device. But it gives the same data. (Height and width of the image are same).I have to make rescaled image file.Can anyone help me ??? class ResizeImage extends MainScreen implements FieldChangeListener { private String path="file:///SDCard/BlackBerry/pictures/test.jpg"; private ButtonField btn; ResizeImage() { btn=new ButtonField("Write File"); btn.setChangeListener(this); add(btn); } public void fieldChanged(Field field, int context) { if (field == btn) { try { InputStream inputStream = null; //Get File Connection FileConnection fileConnection = (FileConnection) Connector.open(path); if (fileConnection.exists()) { inputStream = fileConnection.openInputStream(); //byte data[]=inputStream.toString().getBytes(); ByteArrayOutputStream baos = new ByteArrayOutputStream(); int j = 0; while((j=inputStream.read()) != -1) { baos.write(j); } byte data[] = baos.toByteArray(); inputStream.close(); fileConnection.close(); WriteFile("file:///SDCard/BlackBerry/pictures/org_Image.jpg",data); EncodedImage eImage = EncodedImage.createEncodedImage(data,0,data.length); int scaleFactorX = Fixed32.div(Fixed32.toFP(eImage.getWidth()), Fixed32.toFP(80)); int scaleFactorY = Fixed32.div(Fixed32.toFP(eImage.getHeight()), Fixed32.toFP(80)); eImage=eImage.scaleImage32(scaleFactorX, scaleFactorY); WriteFile("file:///SDCard/BlackBerry/pictures/resize.jpg",eImage.getData()); BitmapField bit=new BitmapField(eImage.getBitmap()); add(bit); } } catch(Exception e) { System.out.println("Exception is ==> "+e.getMessage()); } } } void WriteFile(String fileName,byte[] data) { FileConnection fconn = null; try { fconn = (FileConnection) Connector.open(fileName,Connector.READ_WRITE); } catch (IOException e) { System.out.print("Error opening file"); } if (fconn.exists()) try { fconn.delete(); } catch (IOException e) { System.out.print("Error deleting file"); } try { fconn.create(); } catch (IOException e) { System.out.print("Error creating file"); } OutputStream out = null; try { out = fconn.openOutputStream(); } catch (IOException e) { System.out.print("Error opening output stream"); } try { out.write(data); } catch (IOException e) { System.out.print("Error writing to output stream"); } try { fconn.close(); } catch (IOException e) { System.out.print("Error closing file"); } } }
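    A possible explanation, offered as an assumption rather than a verified fix: scaleImage32 only changes the scale applied when the image is decoded, so getData() still returns the original JPEG bytes, which would explain why the written file is unchanged. Re-encoding the scaled bitmap (here with JPEGEncodedImage from net.rim.device.api.system, assuming it is available on the target OS level) produces genuinely resized data:

        // Decode at the new scale, then re-encode the resulting bitmap so that
        // getData() reflects the smaller image rather than the original file.
        Bitmap scaled = eImage.getBitmap();                          // decoded at the scaled size
        EncodedImage resized = JPEGEncodedImage.encode(scaled, 75);  // 75 = JPEG quality
        WriteFile("file:///SDCard/BlackBerry/pictures/resize.jpg", resized.getData());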

    Read the article

  • Best way to store application images taken via camera

    - by Dave
    Hi all, I'm just looking for some insight into what would be the best way for me to store images as part of my app. I have an activity that represents a 'Job' which has a couple of edittext's and underneath was planning on using the Gallery component to show images relevant to this job. The job data is stored in a database (on the sdcard) so was also thinking of creating a table to store 'JobImages' and having each image stored as a byte array. But I'm not sure if it would be better to store the images directly on sdcard under a folder structure specific to my application and the job. E.g. using the job ID number as a folder name. Depending on which method I use will greatly determine the code that goes into an 'adapter' that allows me to bind to the gallery component so before I begin I was wondering if anyone has had the same design problem and what option they chose. Thanks, Dave
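    If the file-system route is chosen, a rough sketch of the idea (folder layout and names are illustrative, not from the original post) is to write each captured JPEG under a per-job folder and store only the path in the Job database:

        // Save a captured bitmap under <sdcard>/MyApp/jobs/<jobId>/ and return the
        // path, which is what actually goes into the Job's database row.
        public String saveJobImage(long jobId, Bitmap photo) throws IOException {
            File dir = new File(Environment.getExternalStorageDirectory(),
                    "MyApp/jobs/" + jobId);
            if (!dir.exists() && !dir.mkdirs()) {
                throw new IOException("Could not create " + dir);
            }
            File out = new File(dir, System.currentTimeMillis() + ".jpg");
            FileOutputStream fos = new FileOutputStream(out);
            photo.compress(Bitmap.CompressFormat.JPEG, 85, fos);
            fos.close();
            return out.getAbsolutePath();
        }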

    Read the article

  • Copying to /system - android

    - by user1675783
    I have been trying to copy an apk from the assets of another apk to /system. Here is what I have done; it was working in my previous app but not in this one. I have added the permission for writing to external storage. It successfully copies to internal storage, but not to /system. Is there any way to copy directly to /system?

        copyStream("y.apk", "/sdcard/x.apk");
        Process mSuProcess;
        mSuProcess = Runtime.getRuntime().exec("su");
        new DataOutputStream(mSuProcess.getOutputStream()).writeBytes("mount -o remount rw /system");
        DataOutputStream mSuDataOutputStream = new DataOutputStream(mSuProcess.getOutputStream());
        mSuDataOutputStream.writeBytes("cp /sdcard/x.apk /system/app/x.apk");
        mSuDataOutputStream.writeBytes("exit\n");
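    One detail worth noting, as an observation rather than a confirmed fix: the mount and cp commands above are not newline-terminated, so the su shell never sees complete lines. A hedged sketch using a single stream, '\n'-terminated commands, and a wait for completion (the exact remount syntax varies by device; exception handling omitted):

        // Run the remount and copy as root; every command must end with '\n'
        // or the shell will not execute it. waitFor() blocks until 'exit' runs.
        Process su = Runtime.getRuntime().exec("su");
        DataOutputStream os = new DataOutputStream(su.getOutputStream());
        os.writeBytes("mount -o remount,rw /system\n");
        os.writeBytes("cp /sdcard/x.apk /system/app/x.apk\n");
        os.writeBytes("mount -o remount,ro /system\n");
        os.writeBytes("exit\n");
        os.flush();
        su.waitFor();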

    Read the article

  • Getting the number of frames from a video in Android

    - by Jay Patel
    I want to get the number of frames from a video. I'm using the following code: package com.vidualtest; import java.io.File; import java.io.FileDescriptor; import android.app.Activity; import android.graphics.Bitmap; import android.media.MediaMetadataRetriever; import android.os.Bundle; import android.os.Environment; import android.widget.ImageView; public class VidualTestActivity extends Activity { /** Called when the activity is first created. */ File file; ImageView img; @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main); img = (ImageView) findViewById(R.id.img); File sdcard = Environment.getExternalStorageDirectory(); file = new File(sdcard, "myvid.mp4"); MediaMetadataRetriever retriever = new MediaMetadataRetriever(); try { retriever.setDataSource(file.getAbsolutePath()); img.setImageBitmap(retriever.getFrameAtTime(10000,MediaMetadataRetriever.OPTION_CLOSEST)); } catch (IllegalArgumentException ex) { ex.printStackTrace(); } catch (RuntimeException ex) { ex.printStackTrace(); } finally { try { retriever.release(); } catch (RuntimeException ex) { } } } } In getFrameAtTime() I'm passing different static time values like 10000, 20000, etc. in milliseconds, but still I'm getting the same frame from the video. My goal is to get different frames with a different time interval.
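    One thing worth checking, not stated in the original post: MediaMetadataRetriever.getFrameAtTime takes the position in microseconds, not milliseconds, so 10000 and 20000 both fall inside the first fraction of a second and can resolve to the same frame. A small sketch converting from milliseconds:

        // getFrameAtTime expects microseconds; multiply a millisecond position by 1000.
        long positionMs = 10000;                        // 10 seconds into the clip
        Bitmap frame = retriever.getFrameAtTime(
                positionMs * 1000,                      // microseconds
                MediaMetadataRetriever.OPTION_CLOSEST);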

    Read the article

  • Android: canvas.drawBitmap with Orientation(MagnetField)Sensor

    - by user368374
    hallo, i am developing Android app with orientation sensor. now i got problem. What i want to do is, change scale and position of bitmap which read from sd card. The scale and position depend on value from orientation sensor. i use canvas.drawBitmap(), then it cause problem. the app is just shut down. other drawXXX()methods have no problem..any suggestion? public class AnMagImgtestActivity extends Activity implements SensorEventListener { private SensorManager sensorManager; private MySurfaceView view; @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); sensorManager = (SensorManager)getSystemService(SENSOR_SERVICE); view = new MySurfaceView(this); ///make it fullscreen getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN); requestWindowFeature(Window.FEATURE_NO_TITLE); ///give screen a hiropon getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON); setContentView(view); } @Override protected void onResume() { super.onResume(); List sensors = sensorManager.getSensorList(Sensor.TYPE_ORIENTATION); if (sensors.size() 0) { Sensor sensor = sensors.get(0); sensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_GAME); } } @Override protected void onPause() { super.onPause(); sensorManager.unregisterListener(this); } @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { } @Override public void onSensorChanged(SensorEvent event) { view.onValueChanged(event.values); } class MySurfaceView extends SurfaceView implements SurfaceHolder.Callback{ public MySurfaceView(Context context) { super(context); getHolder().addCallback(this); } @Override public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { onValueChanged(new float[3]); } @Override public void surfaceCreated(SurfaceHolder holder) { } @Override public void surfaceDestroyed(SurfaceHolder holder) { } void onValueChanged(float[] values) { Canvas canvas = getHolder().lockCanvas(); //String imgfn = "/sdcard/seekcamera"+"/"+regcode+"/to"+"/"+tempnum+".jpg"; String imgfn = "/sdcard/seekcamera"+"/de"+"/to"+"1.jpg"; //String imgfn = "/sdcard/seekcamera"+"/00"+"/00"+"/"+"42.jpg"; File mf = new File(imgfn); Bitmap bitmap0 = BitmapFactory.decodeFile(mf.getPath()); if (canvas != null) { Paint paint = new Paint(); paint.setAntiAlias(true); paint.setColor(Color.GREEN); paint.setTextSize(12); canvas.drawColor(Color.BLACK); paint.setAlpha(200); for (int i = 0; i < values.length; i++) { canvas.drawText("values[" + i + "]: " + values[i], 0, paint.getTextSize() * (i + 1), paint); } paint.setColor(Color.WHITE); paint.setTextSize(100+values[2]); paint.setAlpha( (int) (255-values[2])); canvas.drawText("Germany", values[0]*1+20, paint.getTextSize() * 1+values[2]-80, paint); ///here is the problem.. canvas.drawBitmap(bitmap0, values[0],values[1], paint); getHolder().unlockCanvasAndPost(canvas); } } } }
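    A likely contributor, offered as a guess rather than a confirmed diagnosis: the code decodes the full-size JPEG from the sdcard on every sensor event inside onValueChanged, which is expensive and crashes if decoding fails or memory runs out. A hedged sketch that decodes once, downsampled, and reuses the bitmap:

        // Decode the image once (e.g. in the SurfaceView constructor), downsampled,
        // and guard against decode failure before drawing it on every event.
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inSampleSize = 4;   // decode at 1/4 width and height to save memory
        Bitmap bitmap0 = BitmapFactory.decodeFile("/sdcard/seekcamera/de/to1.jpg", opts);

        // Later, inside onValueChanged(...):
        if (bitmap0 != null && canvas != null) {
            canvas.drawBitmap(bitmap0, values[0], values[1], paint);
        }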

    Read the article

  • how can convert bitmap to byte array

    - by narasimha
    hi sir i am implementing image upload in sdcard image converting bitmap in bitmap convert in bytearray i am implementing this code import java.io.ByteArrayOutputStream; import java.io.DataInputStream; import java.io.EOFException; import java.io.File; import java.io.FileDescriptor; import java.io.FileInputStream; import java.io.FileNotFoundException; import java.io.IOException; import android.R.array; import android.app.Activity; import android.graphics.Bitmap; import android.graphics.BitmapFactory; import android.os.Bundle; import android.util.Log; import android.widget.ImageView; public class Photo extends Activity { @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main); File f = new File("/sdcard/DCIM/1.jpg"); FileInputStream is = null; try { is = new FileInputStream(f); Bitmap bm; bm = BitmapFactory.decodeStream(is,null,null); ByteArrayOutputStream baos = new ByteArrayOutputStream(1000); bm.compress(Bitmap.CompressFormat.JPEG,75, baos); System.out.println("3........................"+bm); ImageView pic=(ImageView)this.findViewById(R.id.picview); pic.setImageBitmap(bm); } catch (Exception e) { // TODO: handle exception e.printStackTrace(); } } }this code is i am implementing how can convert bitmap in byte array INFO/System.out(12658): 3........................android.graphics.Bitmap@4358e3d0 in debug this will be displayed how can retrieve bitmap to byte array
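    For the conversion itself, the usual pattern (a sketch, not taken from the question) is to let Bitmap.compress write into a ByteArrayOutputStream and take toByteArray():

        // Bitmap -> byte[]: compress into an in-memory stream.
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        bm.compress(Bitmap.CompressFormat.JPEG, 75, baos);
        byte[] imageBytes = baos.toByteArray();

        // byte[] -> Bitmap, for the reverse direction.
        Bitmap roundTrip = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);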

    Read the article

  • Android- Using DexClassLoader to load apk file.

    - by Craig O Connor
    Hi guys, I've hit a bit of a wall. Any help would be appreciated. I have an app that I want to use DexClassLoader to load another apk file. Here is my code: DexClassLoader dLoader = new DexClassLoader("/sdcard/download/test.apk","/sdcard/download",null,ClassLoader.getSystemClassLoader().getParent()); Class calledClass = dLoader.loadClass("com.test.classname"); Intent it=new Intent(this, calledClass); it.setClassName("com.test", "com.test.classname"); startActivity(it); Now I had already installed test.apk so when I ran the above code it worked fine and launched the application. However I want to be able to run this without test.apk being installed already (as that would defeat the entire point of the application) . So I uninstalled it and when I ran the my app again I get this error: android.content.ActivityNotFoundException: Unable to find explicit activity class {com.test/com.test.classname}; have you declared this activity in your AndroidManifest.xml. So I'm a bit stumped here. This activity is declared in the Manifest of the apk I am trying to run. I can't declare it in my applications Manifest. Any ideas? Thanks, Craig

    Read the article

  • OutofMemoryError: bitmap size exceeds VM budget (Android)

    - by Chrispix
    Getting an Exception in the BitmapFactory. Not sure what is the issue. (Well I can guess the issue, but not sure why its happening) ERROR/AndroidRuntime(7906): java.lang.OutOfMemoryError: bitmap size exceeds VM budget ERROR/AndroidRuntime(7906): at android.graphics.BitmapFactory.decodeFile(BitmapFactory.java:295) My code is pretty straight forward. I defined an XML layout w/ a default image. I try to load a bm on the SDCard (if present - it is). If not it shows the default image. Anyway.. Here is code : public class showpicture extends Activity { public void onCreate(Bundle savedInstanceState) { /** Remove menu/status bar **/ requestWindowFeature(Window.FEATURE_NO_TITLE); final Window win = getWindow(); win.setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,WindowManager.LayoutParams.FLAG_FULLSCREEN); Bitmap bm; super.onCreate(savedInstanceState); setContentView(R.layout.showpicture); try { ImageView mImageButton = (ImageView)findViewById(R.id.displayPicture); bm = Bitmap.createScaledBitmap(BitmapFactory.decodeFile("/sdcard/dcim/Camera/20091018203339743.jpg"),100, 100, true); parkImageButton.setImageBitmap(bm); } catch (IllegalArgumentException ex) { Log.d("MYAPP",ex.getMessage()); } catch (IllegalStateException ex) { It fails on the bm=Bitmap.createScaledBitmap any thoughts? I did some research on the forums, and it pointed to this post I just don't know why it is not working. Any help would be great! Thanks, Chris.
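    A common workaround (a sketch, not the poster's code) is to read only the image bounds first and then decode with an inSampleSize suited to the 100x100 target, instead of decoding the full camera image and scaling it afterwards:

        // First pass: read only the dimensions.
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inJustDecodeBounds = true;
        String path = "/sdcard/dcim/Camera/20091018203339743.jpg";
        BitmapFactory.decodeFile(path, opts);

        // Pick a power-of-two sample size that stays at or above 100 px.
        int sample = 1;
        while (opts.outWidth / (sample * 2) >= 100 && opts.outHeight / (sample * 2) >= 100) {
            sample *= 2;
        }

        // Second pass: decode the downsampled bitmap, then do the final exact scale.
        BitmapFactory.Options decodeOpts = new BitmapFactory.Options();
        decodeOpts.inSampleSize = sample;
        Bitmap small = BitmapFactory.decodeFile(path, decodeOpts);
        Bitmap bm = Bitmap.createScaledBitmap(small, 100, 100, true);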

    Read the article

  • how to implement bitmap to byte array in android

    - by satyamurthy
    hi sir i am implementing image upload in sdcard image converting bitmap in bitmap convert in bytearray i am implementing this code import java.io.ByteArrayOutputStream; import java.io.DataInputStream; import java.io.EOFException; import java.io.File; import java.io.FileDescriptor; import java.io.FileInputStream; import java.io.FileNotFoundException; import java.io.IOException; import android.R.array; import android.app.Activity; import android.graphics.Bitmap; import android.graphics.BitmapFactory; import android.os.Bundle; import android.util.Log; import android.widget.ImageView; public class Photo extends Activity { @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main); File f = new File("/sdcard/DCIM/1.jpg"); FileInputStream is = null; try { is = new FileInputStream(f); Bitmap bm; bm = BitmapFactory.decodeStream(is,null,null); ByteArrayOutputStream baos = new ByteArrayOutputStream(1000); bm.compress(Bitmap.CompressFormat.JPEG,75, baos); System.out.println("3........................"+bm); ImageView pic=(ImageView)this.findViewById(R.id.picview); pic.setImageBitmap(bm); } catch (Exception e) { // TODO: handle exception e.printStackTrace(); } } }this code is i am implementing how can convert bitmap in byte array INFO/System.out(12658): 3........................android.graphics.Bitmap@4358e3d0 in debug this will be displayed how can retrieve bitmap to byte array

    Read the article

  • ListField with image In Blackberry JDE

    - by Karthick
    I use the following code to retrieve image from the phone or SDCard and I use that image in to my ListField. It gives the output but it takes very Long time to produce the screen. How to solve this problem ?? Can any one help me?? Thanks in advance!!! String text = fileholder.getFileName(); try{ String path="file:///"+fileholder.getPath()+text; //path=”file:///SDCard/BlackBerry/pictures/image.bmp” InputStream inputStream = null; //Get File Connection FileConnection fileConnection = (FileConnection) Connector.open(path); inputStream = fileConnection.openInputStream(); ByteArrayOutputStream baos = new ByteArrayOutputStream(); int j = 0; while((j=inputStream.read()) != -1) { baos.write(j); } byte data[] = baos.toByteArray(); inputStream.close(); fileConnection.close(); //Encode and Resize image EncodedImage eImage = EncodedImage.createEncodedImage(data,0,data.length); int scaleFactorX = Fixed32.div(Fixed32.toFP(eImage.getWidth()), Fixed32.toFP(180)); int scaleFactorY = Fixed32.div(Fixed32.toFP(eImage.getHeight()), Fixed32.toFP(180)); eImage=eImage.scaleImage32(scaleFactorX, scaleFactorY); Bitmap bitmapImage = eImage.getBitmap(); graphics.drawBitmap(0, y+1, 40, 40,bitmapImage, 0, 0); graphics.drawText(text, 25, y,0,width); } catch(Exception e){}

    Read the article

  • Media query from a specific folder

    - by sensei
    I would like to understand how I can use a cursor to jpg files in a folder specified in the sdcard. I'm trying to select with a cursor the jpg files in a specific folder, and I tried this: This is the code: public static Uri getRandomImage(ContentResolver resolver) { String[] projection = new String[] { BaseColumns._ID, }; String folder = "/sdcard/DCIM/Wallpaper/"; folder = folder + "%"; Uri uri = Media.EXTERNAL_CONTENT_URI; String[] whereArgs = new String[]{folder}; Cursor cursor = resolver.query(uri, projection, null, whereArgs, MediaColumns._ID); if (cursor == null || cursor.getCount() <= 0) { return null; } cursor.moveToPosition(new Random().nextInt(cursor.getCount())); return Uri.withAppendedPath(uri, cursor.getString(0)); } but this code gives me error here is the logcat: E/AndroidRuntime(11986): FATAL EXCEPTION: main E/AndroidRuntime(11986): android.database.sqlite.SQLiteException: bind or column index out of range: handle 0x26a490 E/AndroidRuntime(11986): at android.database.DatabaseUtils.readExceptionFromParcel(DatabaseUtils.java:158) E/AndroidRuntime(11986): at android.database.DatabaseUtils.readExceptionFromParcel(DatabaseUtils.java:114) E/AndroidRuntime(11986): at android.content.ContentProviderProxy.bulkQueryInternal(ContentProviderNative.java:330) E/AndroidRuntime(11986): at android.content.ContentProviderProxy.query(ContentProviderNative.java:366) E/AndroidRuntime(11986): at android.content.ContentResolver.query(ContentResolver.java:245) E/AndroidRuntime(11986): at it.bisemanuDEV.slidepuzzle.SelectImagePreference.getRandomImage(SelectImagePreference.java:126) E/AndroidRuntime(11986): at it.bisemanuDEV.slidepuzzle.TileView.newGame(TileView.java:156) E/AndroidRuntime(11986): at it.bisemanuDEV.slidepuzzle.SlidePuzzleActivity.onOptionsItemSelected(SlidePuzzleActivity.java:377) E/AndroidRuntime(11986): at android.app.Activity.onMenuItemSelected(Activity.java:2762) E/AndroidRuntime(11986): at com.android.internal.policy.impl.PhoneWindow.onMenuItemSelected(PhoneWindow.java:730) E/AndroidRuntime(11986): at com.android.internal.view.menu.MenuItemImpl.invoke(MenuItemImpl.java:143) E/AndroidRuntime(11986): at com.android.internal.view.menu.MenuBuilder.performItemAction(MenuBuilder.java:855) E/AndroidRuntime(11986): at com.android.internal.view.menu.IconMenuView.invokeItem(IconMenuView.java:532) E/AndroidRuntime(11986): at com.android.internal.view.menu.IconMenuItemView.performClick(IconMenuItemView.java:122) E/AndroidRuntime(11986): at android.view.View$PerformClick.run(View.java:8819) E/AndroidRuntime(11986): at android.os.Handler.handleCallback(Handler.java:603) E/AndroidRuntime(11986): at android.os.Handler.dispatchMessage(Handler.java:92) E/AndroidRuntime(11986): at android.os.Looper.loop(Looper.java:123) E/AndroidRuntime(11986): at android.app.ActivityThread.main(ActivityThread.java:4627) E/AndroidRuntime(11986): at java.lang.reflect.Method.invokeNative(Native Method) E/AndroidRuntime(11986): at java.lang.reflect.Method.invoke(Method.java:521) E/AndroidRuntime(11986): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:868) E/AndroidRuntime(11986): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:626) E/AndroidRuntime(11986): at dalvik.system.NativeStart.main(Native Method)
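    One detail stands out, as an observation rather than a verified fix: the query passes whereArgs but a null selection, so the resolver receives a bind argument with no placeholder, which matches the "bind or column index out of range" error. A sketch with the selection spelled out:

        // Select only images whose file path lies under the Wallpaper folder.
        String folder = "/sdcard/DCIM/Wallpaper/%";
        Cursor cursor = resolver.query(
                MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
                new String[] { MediaStore.Images.Media._ID },
                MediaStore.Images.Media.DATA + " like ?",   // placeholder consumed by whereArgs
                new String[] { folder },
                MediaStore.Images.Media._ID);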

    Read the article

  • how to convert bitmap into byte array in android

    - by satyamurthy
    hi all i am new in android i am implementing image retrieve in sdcard in image convert into bitmap and in bitmap convert in to byte array please forward some solution of this code public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main); ImageView image = (ImageView) findViewById(R.id.picview); EditText value=(EditText)findViewById(R.id.EditText01); FileInputStream in; BufferedInputStream buf; try { in = new FileInputStream("/sdcard/pictures/1.jpg"); buf = new BufferedInputStream(in,1070); System.out.println("1.................."+buf); byte[] bMapArray= new byte[buf.available()]; buf.read(bMapArray); Bitmap bMap = BitmapFactory.decodeByteArray(bMapArray, 0, bMapArray.length); for (int i = 0; i < bMapArray.length; i++) { System.out.print("bytearray"+bMapArray[i]); } image.setImageBitmap(bMap); value.setText(bMapArray.toString()); if (in != null) { in.close(); } if (buf != null) { buf.close(); } } catch (Exception e) { Log.e("Error reading file", e.toString()); } } } solution is 04-12 16:41:16.168: INFO/System.out(728): 4......................[B@435a2908 this is the result for byte array not display total byte array this array size is 1034 please forward some solution

    Read the article

  • Android, how to use DexClassLoader to dynamically replace an Activity or Service

    - by RickNotFred
    I am trying to do something similar to this stackoverflow posting. What I want to do is to read the definition of an activity or service from the SD card. To avoid manifest permission issues, I create a shell version of this activity in the .apk, but try to replace it with an activity of the same name residing on the SD card at run time. Unfortunately, I am able to load the activity class definition from the SD card using DexClassLoader, but the original class definition is the one that is executed. Is there a way to specify that the new class definition replaces the old one, or any suggestions on avoiding the manifest permission issues without actually providing the needed activity in the package? The code sample: ClassLoader cl = new DexClassLoader("/sdcard/mypath/My.apk", getFilesDir().getAbsolutePath(), null, MainActivity.class.getClassLoader()); try { Class<?> c = cl.loadClass("com.android.my.path.to.a.loaded.activity"); Intent i = new Intent(getBaseContext(), c); startActivity(i); } catch (Exception e) { Intead of launching the com.android.my.path.to.a.loaded.activity specified in /sdcard/mypath/My.apk, it launches the activity statically loaded into the project.
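    For what it's worth, startActivity can only launch activities declared in an installed package's manifest, so a class loaded from an uninstalled apk cannot be started that way. A hedged sketch of what DexClassLoader alone can do, calling into the loaded code via reflection (class and method names are placeholders; exception handling omitted):

        // Load a class from the external apk and invoke one of its methods directly;
        // no manifest entry is involved because no Activity is started.
        DexClassLoader loader = new DexClassLoader(
                "/sdcard/download/test.apk",
                getDir("dex", MODE_PRIVATE).getAbsolutePath(),   // app-private optimized dex dir
                null,
                getClassLoader());
        Class<?> loaded = loader.loadClass("com.test.classname");
        Object instance = loaded.newInstance();
        Method run = loaded.getMethod("run");                    // placeholder method name
        run.invoke(instance);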

    Read the article

  • how to fix error in bitmap size exceeds VM budget

    - by narasimha
    hi folks i am working one application image uploading to sdcard i am scaling that sdcard saved into database some times one error is occurs bitmap size exceeds vm budget ouput : 01-11 15:39:51.809: ERROR/AndroidRuntime(6214): Uncaught handler: thread main exiting due to uncaught exception 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): java.lang.OutOfMemoryError: bitmap size exceeds VM budget 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.graphics.BitmapFactory.nativeDecodeByteArray(Native Method) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.graphics.BitmapFactory.decodeByteArray(BitmapFactory.java:384) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.graphics.BitmapFactory.decodeByteArray(BitmapFactory.java:397) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at com.fitzgeraldsoftware.shout.presentationLayer.Shout.onActivityResult(Shout.java:1653) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.app.Activity.dispatchActivityResult(Activity.java:3624) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.app.ActivityThread.deliverResults(ActivityThread.java:3220) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.app.ActivityThread.handleSendResult(ActivityThread.java:3266) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.app.ActivityThread.access$2600(ActivityThread.java:116) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1823) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.os.Handler.dispatchMessage(Handler.java:99) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.os.Looper.loop(Looper.java:123) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at android.app.ActivityThread.main(ActivityThread.java:4203) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at java.lang.reflect.Method.invokeNative(Native Method) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at java.lang.reflect.Method.invoke(Method.java:521) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:791) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:549) 01-11 15:39:51.979: ERROR/AndroidRuntime(6214): at dalvik.system.NativeStart.main(Native Method) how can fix the error please forward some solution thanks in advance

    Read the article

  • Weird camera Intent behavior

    - by David Erosa
    Hi all. I'm invoking the MediaStore.ACTION_IMAGE_CAPTURE intent with the MediaStore.EXTRA_OUTPUT extra so that it does save the image to that file. On the onActivityResult I can check that the image is being saved in the intended file, which is correct. The weird thing is that anyhow, the image is also saved in a file named something like "/sdcard/Pictures/Camera/1298041488657.jpg" (epoch time in which the image was taken). I've checked the Camera app source (froyo-release branch) and I'm almost sure that the code path is correct and wouldn't have to save the image, but I'm a noob and I'm not completly sure. AFAIK, the image saving process starts with this callback (comments are mine): private final class JpegPictureCallback implements PictureCallback { ... public void onPictureTaken(...){ ... // This is where the image is passed back to the invoking activity. mImageCapture.storeImage(jpegData, camera, mLocation); ... public void storeImage(final byte[] data, android.hardware.Camera camera, Location loc) { if (!mIsImageCaptureIntent) { // Am i an intent? int degree = storeImage(data, loc); // THIS SHOULD NOT BE CALLED WITHIN THE CAPTURE INTENT!! ....... // An finally: private int storeImage(byte[] data, Location loc) { try { long dateTaken = System.currentTimeMillis(); String title = createName(dateTaken); String filename = title + ".jpg"; // Eureka, timestamp filename! ... So, I'm receiving the correct data, but it's also being saved in the "storeImage(data, loc);" method call, which should not be called... It'd not be a problem if I could get the newly created filename from the intent result data, but I can't. When I found this out, I found about 20 image files from my tests that I didn't know were on my sdcard :) I'm getting this behavior both with my Nexus One with Froyo and my Huawei U8110 with Eclair. Could please anyone enlight me? Thanks a lot.

    Read the article

  • Play Video From Raw Folder

    - by SterAllures
    Evening, I've just started programming with android and made a few programs and everything so I'm still kind of a novice but im trying to understand it all. So here's my problem, I'm trying to play a video, the thing is, I got it working when I Stream it from an URL with VideoView over the internet or when i place in on my sdcard. What I want to do now is play a video I've got in my res/raw folder, but it only plays audio and I don't understand why, it doesn't give any error in my logcat as far as I can see, also couldn't really find a solution with google since most of the answers are about VideoView and just put the video on your SDCard. Now someone told me I had to use setDisplay (SurfaceHolder) and I've also tried that but I still only get the audio. I hope somebody can help me to find a solution to this problem. VideoDemo.java package nl.melvin.videodemo; import android.app.Activity; import android.os.Bundle; import android.media.MediaPlayer; import android.view.SurfaceHolder; import android.view.SurfaceView; public class videodemo extends Activity { public SurfaceHolder holder; public SurfaceView surfaceView; /** Called when the activity is first created. */ @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main); MediaPlayer mp = MediaPlayer.create(this, R.raw.mac); mp.setDisplay(holder); mp.start(); } } XML <?xml version="1.0" encoding="utf-8"?> <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android" android:id="@+id/LinearLayout01" android:layout_width="fill_parent" android:layout_height="fill_parent" > <SurfaceView android:id="@+id/surfaceview" android:layout_width="fill_parent" android:layout_height="fill_parent"> </SurfaceView>> </LinearLayout> I've also tried Uri.parse but it says it can't play the video (.mp4 format).
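    One observation, not a confirmed fix: the holder passed to setDisplay in the snippet above is never initialized, so the player has no surface to draw on, which would explain audio with no picture. A hedged sketch that waits for the SurfaceHolder callback before attaching the player (using the surfaceview id from the layout above):

        // Give the MediaPlayer a real surface before start(): wait for the
        // SurfaceHolder callback instead of passing an uninitialized field.
        SurfaceView surface = (SurfaceView) findViewById(R.id.surfaceview);
        final SurfaceHolder holder = surface.getHolder();
        holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);   // needed on older devices
        holder.addCallback(new SurfaceHolder.Callback() {
            public void surfaceCreated(SurfaceHolder h) {
                MediaPlayer mp = MediaPlayer.create(videodemo.this, R.raw.mac);
                mp.setDisplay(h);
                mp.start();
            }
            public void surfaceChanged(SurfaceHolder h, int format, int w, int ht) { }
            public void surfaceDestroyed(SurfaceHolder h) { }
        });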

    Read the article
