Search Results

Search found 11467 results on 459 pages for 'android contentprovider'.


  • I want to stream video using the VideoView class; can anyone tell me what formats it supports?

    - by eddyxd
    Hi, I am new to Android, but I have worked through the tutorials and implemented a few simple applications. The problem I have run into is that when I stream video from my server to Android, the VideoView class plays only the audio, with no picture. Here is my setup and Android code: 1. Android code: mVideoView01.setVideoURI(Uri.parse("rtsp://192.168.16.1:8080/test.sdp")); mVideoView01.start(); 2. My streaming server is VLC and the command is: vlc -vvv d:\nobody.mp4 --sout=#transcode{vcodec=h264,width=320,hegiht=240}:rtp{dst=192.168.16.1,port=4444,sdp=rtsp://192.168.16.1:8080/test.sdp} PS: My IP is assigned by DHCP, but I have verified it is reachable (Android does play the audio, after all). PS2: I have tried streaming video from "http://www.americafree.tv/" and it plays fine, so I suspect the problem is the streaming video format, but I have tried almost every relevant option VLC offers and it still does not work. Has anyone run the same kind of test and can offer some advice? Thanks a lot! - by eddy
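    For reference, a minimal sketch of the playback side; the view ID is a placeholder, the RTSP URL is the one from the question, and playback is started only once the stream is prepared, since RTSP sources can take a moment to buffer:

        // inside an Activity's onCreate(), after setContentView(...)
        final VideoView videoView = (VideoView) findViewById(R.id.VideoView01);
        videoView.setMediaController(new MediaController(this));
        videoView.setVideoURI(Uri.parse("rtsp://192.168.16.1:8080/test.sdp"));
        videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            public void onPrepared(MediaPlayer mp) {
                videoView.start(); // start only once the stream is prepared
            }
        });

    On the encoding side, older Android releases generally decode only H.264 Baseline profile over RTSP, so forcing a baseline transcode in VLC is worth trying; that is a hedge, not a confirmed fix for this particular setup.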

    Read the article

  • Galaxy Nexus (specifically): camera startPreview failed

    - by Roman
    I'm having a strange issue on a galaxy nexus when trying to have a live camera view in my app. I get this error in the log cat: 06-29 16:31:26.681 I/CameraClient(133): Opening camera 0 06-29 16:31:26.681 I/CameraHAL(133): camera_device open 06-29 16:31:26.970 D/DOMX (133): ERROR: failed check:(eError == OMX_ErrorNone) || (eError == OMX_ErrorNoMore) - returning error: 0x80001005 - Error returned from OMX API in ducati 06-29 16:31:26.970 E/CameraHAL(133): Error while configuring rotation 0x80001005 06-29 16:31:27.088 I/am_on_resume_called(21274): [0,digifynotes.Activity_Camera] 06-29 16:31:27.111 V/PhoneStatusBar(693): setLightsOn(true) 06-29 16:31:27.205 E/CameraHAL(133): OMX component is not in loaded state 06-29 16:31:27.205 E/CameraHAL(133): setNSF() failed -22 06-29 16:31:27.205 E/CameraHAL(133): Error: CAMERA_QUERY_RESOLUTION_PREVIEW -22 06-29 16:31:27.252 I/MonoDroid(21274): UNHANDLED EXCEPTION: Java.Lang.Exception: Exception of type 'Java.Lang.Exception' was thrown. 06-29 16:31:27.252 I/MonoDroid(21274): at Android.Runtime.JNIEnv.CallVoidMethod (intptr,intptr) <0x00068> 06-29 16:31:27.252 I/MonoDroid(21274): at Android.Hardware.Camera.StartPreview () <0x0007f> 06-29 16:31:27.252 I/MonoDroid(21274): at DigifyNotes.CameraPreviewView.SurfaceChanged (Android.Views.ISurfaceHolder,Android.Graphics.Format,int,int) <0x000d7> 06-29 16:31:27.252 I/MonoDroid(21274): at Android.Views.ISurfaceHolderCallbackInvoker.n_SurfaceChanged_Landroid_view_SurfaceHolder_III (intptr,intptr,intptr,int,int,int) <0x0008b> 06-29 16:31:27.252 I/MonoDroid(21274): at (wrapper dynamic-method) object.4c65d912-497c-4a67-9046-4b33a55403df (intptr,intptr,intptr,int,int,int) <0x0006b> That very same source code works flawlessly on a Samsung Galaxy Ace 2X (4.0.4) and an LG G2X (2.3.7). I will later test on a galaxy s4 if my friend lends it to me. Galaxy Nexus runs Android 4.2.2 I believe. Any one have any ideas? EDIT: Here are my camera classes: [Please note I am using mono] [The formatting is more readable if you view it as raw] Camera Activity: http://pastebin.com/YPcGXJRB Camera Preview View: http://pastebin.com/zNf8AWDf
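    A defensive sketch of the preview setup, written in plain Java rather than Mono for brevity (the mCamera and holder fields are assumed names); one frequent cause of startPreview failures is requesting a preview size the camera driver does not support, so this picks one the device actually reports:

        // inside surfaceChanged(); mCamera and holder are assumed fields
        Camera.Parameters params = mCamera.getParameters();
        Camera.Size preview = params.getSupportedPreviewSizes().get(0); // a size the device reports as supported
        params.setPreviewSize(preview.width, preview.height);
        mCamera.setParameters(params);
        try {
            mCamera.setPreviewDisplay(holder);
            mCamera.startPreview();
        } catch (IOException e) {
            Log.e("CameraPreview", "Could not start preview", e);
        }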

    Read the article

  • Can't seem to load a WebView from XML, why?

    - by Bilthon
    Well, I'm trying to follow the tutorial at http://rapidandroid.org/wiki/Graphing, but I hit a problem in its very first part. I'll describe it here; I just cannot understand how something this simple could go wrong. What am I doing wrong? First I created an XML file called statistics.xml and, among other things, put this code in it: <WebView android:id="@+id/webview" android:layout_height="fill_parent" android:layout_width="fill_parent"/> Now, in the activity that is supposed to display this WebView, I'm doing this: public class DisplayStatistics extends Activity { public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main); WebView wv = (WebView) findViewById(R.id.webview); The problem arises here: wv is null whenever I test it, which of course means that findViewById(R.id.webview) couldn't find the view. But again, what am I doing wrong? I know I could also instantiate the WebView directly from code without declaring it in XML, but I'm wondering what is wrong with doing it this way. Just in case, I also added the following line to my Android manifest file: <uses-permission android:name="android.permission.INTERNET" /> I'm not sure it's needed just to display the WebView (I think not), but I added it to see whether that was the problem, and it turned out it isn't. Thanks, Nelson
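    One thing worth noting from the snippet itself: the activity inflates R.layout.main, while the WebView is declared in statistics.xml, and findViewById() only searches the layout passed to setContentView(). A minimal sketch of the likely fix, assuming the layout resource is the statistics one:

        // inside DisplayStatistics.onCreate()
        setContentView(R.layout.statistics);               // inflate the layout that actually contains the WebView
        WebView wv = (WebView) findViewById(R.id.webview); // now resolves to the declared view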

    Read the article

  • How to create a context menu using an XML file?

    - by Kaillash
    I am using an XML file to create a context menu for my ListView (please see below). I also want to set a header for this context menu. I read (at http://www.mail-archive.com/[email protected]/msg43062.html) that I can call menu.setHeaderTitle(myContextMenuTitle) in the onCreateContextMenu method, but I need to set the header in the XML file instead. How can I accomplish this? Below is my code; correct me if I am doing anything wrong. This is my context_menu.xml file: <?xml version="1.0" encoding="utf-8"?> <menu xmlns:android="http://schemas.android.com/apk/res/android"> <item android:id="@+id/open" android:title="Open"/> </menu> This is my onCreateContextMenu method: @Override public void onCreateContextMenu(ContextMenu menu, View v, ContextMenuInfo menuInfo) { MenuInflater inflater = getMenuInflater(); inflater.inflate(R.menu.context_menu, menu); super.onCreateContextMenu(menu, v, menuInfo); } This is my onCreate method: @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main); // extras = getIntent().getExtras(); registerForContextMenu(getListView()); ... }
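    As far as I know the menu resource format has no attribute for a context-menu header, so the closest compromise is to keep the title text in XML (strings.xml) and apply it with one call; a sketch, assuming a string resource named context_menu_title:

        @Override
        public void onCreateContextMenu(ContextMenu menu, View v, ContextMenuInfo menuInfo) {
            super.onCreateContextMenu(menu, v, menuInfo);
            menu.setHeaderTitle(R.string.context_menu_title); // title text still lives in XML
            getMenuInflater().inflate(R.menu.context_menu, menu);
        }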

    Read the article

  • How to draw some lines in a view element defined in the XML layout

    - by Nils
    Hello, I am having trouble drawing some simple lines in a view object (Android programming). First I created the layout (an XML file) with the view element in it, as a kind of painting area: [...] <View android:id="@+id/viewmap" android:layout_width="572px" android:layout_height="359px" android:layout_x="26px" android:layout_y="27px" [...] ... and then tried to access it to draw some lines. Unfortunately, although the program runs and other UI elements like buttons are displayed, I can't see my drawings. What's wrong? [...] viewmap = (View) findViewById(R.id.viewmap); Canvas canvas = new Canvas(); viewmap.draw(canvas); Paint p = new Paint(); p.setColor(Color.BLUE); p.setStyle(Paint.Style.STROKE); canvas.drawColor(Color.WHITE); p.setColor(Color.BLUE); canvas.drawLine(4, 4, 29, 5, p); p.setColor(Color.RED); viewmap.draw(canvas); [...] Thanks for the help!
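    The usual pattern is not to draw onto a detached Canvas, but to subclass View and do the drawing in onDraw(), where the framework hands you the canvas that is actually displayed; a minimal sketch (the class name is a placeholder, and the layout would reference it by its fully qualified class name instead of <View>):

        public class MapView extends View {
            private final Paint p = new Paint();

            public MapView(Context context, AttributeSet attrs) {
                super(context, attrs);
                p.setStyle(Paint.Style.STROKE);
            }

            @Override
            protected void onDraw(Canvas canvas) {
                canvas.drawColor(Color.WHITE);
                p.setColor(Color.BLUE);
                canvas.drawLine(4, 4, 29, 5, p);
            }
        }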

    Read the article

  • Developing cross platform mobile application

    - by sohilv
    More and more mobile platforms are being launched, and SDKs are available to developers. Many platforms are out there: Android, iOS, Moblin, Windows Phone 7, RIM, Symbian, bada, Maemo, etc., and building cross-platform applications is a headache for developers. I am collecting the things the platforms have in common that would help developers who want to port an application to all of them, such as screen resolutions, input methods, OpenGL support, and so on. Please share the details you know for any platform. Alternatively, is it possible to write the code in HTML (a widget-type approach) and load it into a native application? I know that on Android you can add a web view to an application by calling setContentView(view). Please share the class details for embedding an HTML view in a native application on the other platforms you know. The purpose of this thread is to share common details across developers; marking as community wiki. Cross-platform tools and libraries: XMLVM and iSpectrum (cross-compile Java code from an Android app, or create one from scratch) PhoneGap (cross-platform mobile apps) Titanium (build native mobile and desktop apps with web technologies) MonoTouch (C# for iPhone) Rhomobile - http://rhomobile.com/ samples are here: http://github.com/rhomobile/rhodes-system-api-samples Sencha Touch - an HTML5 mobile app framework for building web apps that look and feel native on Apple iOS and Google Android touchscreen devices: http://www.sencha.com/products/touch/ Corona - iPhone/iPad/Android cross-platform library; quite impressive: http://anscamobile.com/corona/ A guide to porting an existing Android app to Windows Phone 7: http://windowsphone.interoperabilitybridges.com/articles/windows-phone-7-guide-for-iphone-application-developers
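    Since the question mentions setContentView(view) on Android, here is a minimal sketch of the Android side of the HTML-widget approach: a WebView loading a page bundled in the APK's assets (the file name is a placeholder):

        // inside an Activity's onCreate()
        WebView webView = new WebView(this);
        webView.getSettings().setJavaScriptEnabled(true);      // needed if the widget uses JavaScript
        webView.loadUrl("file:///android_asset/widget.html");  // loads assets/widget.html from the APK
        setContentView(webView);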

    Read the article

  • Error in pom file in Maven project after importing into Eclipse

    - by dipti
    I am new to the Maven framework. I already have a Maven project. I installed the Maven plugin into my Eclipse IDE from http://m2eclipse.sonatype.org/sites/m2e, then imported my project and enabled dependencies, but the project shows far too many errors. The pom.xml itself has errors such as "Project Build Error: unknown packaging: apk" and "Project Build Error: unresolvable build extension: plugin". The problem area is: <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> <modelVersion>4.0.0</modelVersion> <groupId>com.nbc.cnbc.android</groupId> <artifactId>android.domain.tests</artifactId> <version>1.0.0</version> <packaging>apk</packaging> <parent> <groupId>com.nbc.cnbc.android</groupId> <artifactId>android.domain.parent</artifactId> <version>1.0.0</version> <relativePath>../android.domain.parent/pom.xml</relativePath> </parent> <name>android.domain.tests</name> <url>http://maven.apache.org</url> Could it be because the URL in the last line is wrong? Any ideas why this could be happening? Any reply is highly appreciated. Thanks a lot in advance! Regards, Dipti
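    "unknown packaging: apk" usually means Maven cannot resolve the build extension that registers the apk packaging type. A hedged sketch of what the parent (or this) pom typically declares; the coordinates are those of the android-maven-plugin project, and the version shown is only an example:

        <build>
          <plugins>
            <plugin>
              <groupId>com.jayway.maven.plugins.android.generation2</groupId>
              <artifactId>android-maven-plugin</artifactId>
              <version>3.6.0</version>
              <extensions>true</extensions> <!-- registers the apk packaging type -->
            </plugin>
          </plugins>
        </build>

    In Eclipse, an m2eclipse import of an apk project also generally needs the Android connector for m2e installed; without it, these same two errors tend to show up in the POM editor.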

    Read the article

  • getSharedPreferences not working for me with respect to ListPreference and integer values

    - by ideagent
    I'm stuck at a point where I'm trying to read a preference value (from a ListPreference) and then use that value in a simple subtraction. The problem is that the "seek" preference is never seen by my Java code; only the default value is (I've tried a default of 3000 and now 0). Am I missing something, or is there a bug here, known or unknown? The Java code chunk where the issue shows up: public static final String PREF_FILE_NAME = "preferences"; seekback.setOnClickListener(new Button.OnClickListener() { @Override public void onClick(View view) { try { SharedPreferences preferences = getSharedPreferences(PREF_FILE_NAME, MODE_PRIVATE); Integer storedPreference = preferences.getInt("seek", 0); (mediaPlayer.getCurrentPosition()-storedPreference); } catch (Exception e) { e.printStackTrace(); } } }); Here are some other relevant bits of the project. From the preferences file: <ListPreference android:entries="@array/seconds" android:entryValues="@array/seconds_values" android:summary="sets the seek interval for the seekback and seekforward buttons" android:title="Seek Interval" android:defaultValue="5000" android:key="@string/seek" From the strings file: seek From an array file: Five seconds Fifteen seconds Thirty seconds Sixty seconds 5000 15000 30000 60000 Let me know if you need to see more code to figure this one out. Thanks in advance for any help; I've worked on this for a few hours now and I'm burnt out, so a second pair of eyes would be very much appreciated. (Sorry, I couldn't get the code and plain text to format nicely here.) AndroidCoder
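    Two details in the snippets above tend to explain this behaviour: a preference screen writes to the default shared preferences file rather than a custom file name, and a ListPreference stores its entry value as a String, so getInt() never finds it. A sketch of the read side under those assumptions:

        // read the ListPreference value from the default preferences file
        SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(this);
        String raw = prefs.getString("seek", "5000");   // ListPreference values are stored as Strings
        int seekInterval = Integer.parseInt(raw);
        mediaPlayer.seekTo(mediaPlayer.getCurrentPosition() - seekInterval);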

    Read the article

  • How to change the tab background on ViewPagerIndicator tabs?

    - by user1736277
    I am using the ViewPagerIndicator library by Jake Wharton, with its tabs code in conjunction with the ActionBarSherlock library. Everything works fine, but I'm trying to style the background of the tabs and can't figure out how. I would like a dark action bar with dark tabs and light fragments (tab content). The base theme I am using is Theme.Sherlock.Light.DarkActionBar. I extend it by making it the parent of a style that sets attributes for the tabs (text color, indicator color, etc.). This gives me a dark action bar, light tabs, and light fragments. I can't find anything that changes the background of the tabs themselves; the only way I can change it is by switching the whole app to a dark theme (Theme.Sherlock). Here's my code so far: <style name="vpiTheme" parent="Theme.Sherlock.Light.DarkActionBar"> <item name="vpiTabPageIndicatorStyle">@style/CustomTabPageIndicator</item> </style> <style name="CustomTabPageIndicator" parent="Widget.TabPageIndicator"> <item name="android:textColor">#FF000000</item> <item name="android:paddingTop">6dp</item> <item name="android:paddingBottom">6dp</item> <item name="android:paddingLeft">16dip</item> <item name="android:paddingRight">16dip</item> <item name="android:maxLines">2</item> </style>
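    For what it's worth, the tab views take their background from the style referenced by vpiTabPageIndicatorStyle, so adding an android:background item to the custom style is the usual lever; a sketch, where the drawable name is a placeholder for a dark selector of your own:

        <style name="CustomTabPageIndicator" parent="Widget.TabPageIndicator">
            <!-- dark background for the tabs; keep the existing padding/text items as well -->
            <item name="android:background">@drawable/dark_tab_indicator</item>
            <item name="android:textColor">#FFFFFFFF</item>
        </style>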

    Read the article

  • How to use onSensorChanged sensor data in combination with OpenGL

    - by Sponge
    I have written a TestSuite to find out how to calculate the rotation angles from the data you get in SensorEventListener.onSensorChanged(). I really hope you can complete my solution to help people who will have the same problems like me. Here is the code, i think you will understand it after reading it. Feel free to change it, the main idea was to implement several methods to send the orientation angles to the opengl view or any other target which would need it. method 1 to 4 are working, they are directly sending the rotationMatrix to the OpenGl view. all other methods are not working or buggy and i hope someone knows to get them working. i think the best method would be method 5 if it would work, because it would be the easiest to understand but i'm not sure how efficient it is. the complete code isn't optimized so i recommend to not use it as it is in your project. here it is: import java.nio.ByteBuffer; import java.nio.ByteOrder; import java.nio.FloatBuffer; import javax.microedition.khronos.egl.EGL10; import javax.microedition.khronos.egl.EGLConfig; import javax.microedition.khronos.opengles.GL10; import static javax.microedition.khronos.opengles.GL10.*; import android.app.Activity; import android.content.Context; import android.content.pm.ActivityInfo; import android.hardware.Sensor; import android.hardware.SensorEvent; import android.hardware.SensorEventListener; import android.hardware.SensorManager; import android.opengl.GLSurfaceView; import android.opengl.GLSurfaceView.Renderer; import android.os.Bundle; import android.util.Log; import android.view.WindowManager; /** * This class provides a basic demonstration of how to use the * {@link android.hardware.SensorManager SensorManager} API to draw a 3D * compass. */ public class SensorToOpenGlTests extends Activity implements Renderer, SensorEventListener { private static final boolean TRY_TRANSPOSED_VERSION = false; /* * MODUS overview: * * 1 - unbufferd data directly transfaired from the rotation matrix to the * modelview matrix * * 2 - buffered version of 1 where both acceleration and magnetometer are * buffered * * 3 - buffered version of 1 where only magnetometer is buffered * * 4 - buffered version of 1 where only acceleration is buffered * * 5 - uses the orientation sensor and sets the angles how to rotate the * camera with glrotate() * * 6 - uses the rotation matrix to calculate the angles * * 7 to 12 - every possibility how the rotationMatrix could be constructed * in SensorManager.getRotationMatrix (see * http://www.songho.ca/opengl/gl_anglestoaxes.html#anglestoaxes for all * possibilities) */ private static int MODUS = 2; private GLSurfaceView openglView; private FloatBuffer vertexBuffer; private ByteBuffer indexBuffer; private FloatBuffer colorBuffer; private SensorManager mSensorManager; private float[] rotationMatrix = new float[16]; private float[] accelGData = new float[3]; private float[] bufferedAccelGData = new float[3]; private float[] magnetData = new float[3]; private float[] bufferedMagnetData = new float[3]; private float[] orientationData = new float[3]; // private float[] mI = new float[16]; private float[] resultingAngles = new float[3]; private int mCount; final static float rad2deg = (float) (180.0f / Math.PI); private boolean mirrorOnBlueAxis = false; private boolean landscape; public SensorToOpenGlTests() { } /** Called with the activity is first created. 
*/ @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE); openglView = new GLSurfaceView(this); openglView.setRenderer(this); setContentView(openglView); } @Override protected void onResume() { // Ideally a game should implement onResume() and onPause() // to take appropriate action when the activity looses focus super.onResume(); openglView.onResume(); if (((WindowManager) getSystemService(WINDOW_SERVICE)) .getDefaultDisplay().getOrientation() == 1) { landscape = true; } else { landscape = false; } mSensorManager.registerListener(this, mSensorManager .getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_GAME); mSensorManager.registerListener(this, mSensorManager .getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), SensorManager.SENSOR_DELAY_GAME); mSensorManager.registerListener(this, mSensorManager .getDefaultSensor(Sensor.TYPE_ORIENTATION), SensorManager.SENSOR_DELAY_GAME); } @Override protected void onPause() { // Ideally a game should implement onResume() and onPause() // to take appropriate action when the activity looses focus super.onPause(); openglView.onPause(); mSensorManager.unregisterListener(this); } public int[] getConfigSpec() { // We want a depth buffer, don't care about the // details of the color buffer. int[] configSpec = { EGL10.EGL_DEPTH_SIZE, 16, EGL10.EGL_NONE }; return configSpec; } public void onDrawFrame(GL10 gl) { // clear screen and color buffer: gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT); // set target matrix to modelview matrix: gl.glMatrixMode(GL10.GL_MODELVIEW); // init modelview matrix: gl.glLoadIdentity(); // move camera away a little bit: if ((MODUS == 1) || (MODUS == 2) || (MODUS == 3) || (MODUS == 4)) { if (landscape) { // in landscape mode first remap the rotationMatrix before using // it with glMultMatrixf: float[] result = new float[16]; SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, result); gl.glMultMatrixf(result, 0); } else { gl.glMultMatrixf(rotationMatrix, 0); } } else { //in all other modes do the rotation by hand: gl.glRotatef(resultingAngles[1], 1, 0, 0); gl.glRotatef(resultingAngles[2], 0, 1, 0); gl.glRotatef(resultingAngles[0], 0, 0, 1); if (mirrorOnBlueAxis) { //this is needed for mode 6 to work gl.glScalef(1, 1, -1); } } //move the axis to simulate augmented behaviour: gl.glTranslatef(0, 2, 0); // draw the 3 axis on the screen: gl.glVertexPointer(3, GL_FLOAT, 0, vertexBuffer); gl.glColorPointer(4, GL_FLOAT, 0, colorBuffer); gl.glDrawElements(GL_LINES, 6, GL_UNSIGNED_BYTE, indexBuffer); } public void onSurfaceChanged(GL10 gl, int width, int height) { gl.glViewport(0, 0, width, height); float r = (float) width / height; gl.glMatrixMode(GL10.GL_PROJECTION); gl.glLoadIdentity(); gl.glFrustumf(-r, r, -1, 1, 1, 10); } public void onSurfaceCreated(GL10 gl, EGLConfig config) { gl.glDisable(GL10.GL_DITHER); gl.glClearColor(1, 1, 1, 1); gl.glEnable(GL10.GL_CULL_FACE); gl.glShadeModel(GL10.GL_SMOOTH); gl.glEnable(GL10.GL_DEPTH_TEST); gl.glEnableClientState(GL10.GL_VERTEX_ARRAY); gl.glEnableClientState(GL10.GL_COLOR_ARRAY); // load the 3 axis and there colors: float vertices[] = { 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1 }; float colors[] = { 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 1, 1 }; byte indices[] = { 0, 1, 0, 2, 0, 3 }; ByteBuffer vbb; vbb = ByteBuffer.allocateDirect(vertices.length * 4); vbb.order(ByteOrder.nativeOrder()); vertexBuffer = 
vbb.asFloatBuffer(); vertexBuffer.put(vertices); vertexBuffer.position(0); vbb = ByteBuffer.allocateDirect(colors.length * 4); vbb.order(ByteOrder.nativeOrder()); colorBuffer = vbb.asFloatBuffer(); colorBuffer.put(colors); colorBuffer.position(0); indexBuffer = ByteBuffer.allocateDirect(indices.length); indexBuffer.put(indices); indexBuffer.position(0); } public void onAccuracyChanged(Sensor sensor, int accuracy) { } public void onSensorChanged(SensorEvent event) { // load the new values: loadNewSensorData(event); if (MODUS == 1) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); } if (MODUS == 2) { rootMeanSquareBuffer(bufferedAccelGData, accelGData); rootMeanSquareBuffer(bufferedMagnetData, magnetData); SensorManager.getRotationMatrix(rotationMatrix, null, bufferedAccelGData, bufferedMagnetData); } if (MODUS == 3) { rootMeanSquareBuffer(bufferedMagnetData, magnetData); SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, bufferedMagnetData); } if (MODUS == 4) { rootMeanSquareBuffer(bufferedAccelGData, accelGData); SensorManager.getRotationMatrix(rotationMatrix, null, bufferedAccelGData, magnetData); } if (MODUS == 5) { // this mode uses the sensor data recieved from the orientation // sensor resultingAngles = orientationData.clone(); if ((-90 > resultingAngles[1]) || (resultingAngles[1] > 90)) { resultingAngles[1] = orientationData[0]; resultingAngles[2] = orientationData[1]; resultingAngles[0] = orientationData[2]; } } if (MODUS == 6) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); final float[] anglesInRadians = new float[3]; SensorManager.getOrientation(rotationMatrix, anglesInRadians); if ((-90 < anglesInRadians[2] * rad2deg) && (anglesInRadians[2] * rad2deg < 90)) { // device camera is looking on the floor // this hemisphere is working fine mirrorOnBlueAxis = false; resultingAngles[0] = anglesInRadians[0] * rad2deg; resultingAngles[1] = anglesInRadians[1] * rad2deg; resultingAngles[2] = anglesInRadians[2] * -rad2deg; } else { mirrorOnBlueAxis = true; // device camera is looking in the sky // this hemisphere is mirrored at the blue axis resultingAngles[0] = (anglesInRadians[0] * rad2deg); resultingAngles[1] = (anglesInRadians[1] * rad2deg); resultingAngles[2] = (anglesInRadians[2] * rad2deg); } } if (MODUS == 7) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in x y z * order Rx*Ry*Rz */ resultingAngles[2] = (float) (Math.asin(rotationMatrix[2])); final float cosB = (float) Math.cos(resultingAngles[2]); resultingAngles[2] = resultingAngles[2] * rad2deg; resultingAngles[0] = -(float) (Math.acos(rotationMatrix[0] / cosB)) * rad2deg; resultingAngles[1] = (float) (Math.acos(rotationMatrix[10] / cosB)) * rad2deg; } if (MODUS == 8) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in z y x */ resultingAngles[2] = (float) (Math.asin(-rotationMatrix[8])); final float cosB = (float) Math.cos(resultingAngles[2]); resultingAngles[2] = resultingAngles[2] * rad2deg; resultingAngles[1] = (float) (Math.acos(rotationMatrix[9] / cosB)) * rad2deg; resultingAngles[0] = (float) (Math.asin(rotationMatrix[4] / cosB)) * rad2deg; } if (MODUS == 9) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = 
transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in z x y * * note z axis looks good at this one */ resultingAngles[1] = (float) (Math.asin(rotationMatrix[9])); final float minusCosA = -(float) Math.cos(resultingAngles[1]); resultingAngles[1] = resultingAngles[1] * rad2deg; resultingAngles[2] = (float) (Math.asin(rotationMatrix[8] / minusCosA)) * rad2deg; resultingAngles[0] = (float) (Math.asin(rotationMatrix[1] / minusCosA)) * rad2deg; } if (MODUS == 10) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in y x z */ resultingAngles[1] = (float) (Math.asin(-rotationMatrix[6])); final float cosA = (float) Math.cos(resultingAngles[1]); resultingAngles[1] = resultingAngles[1] * rad2deg; resultingAngles[2] = (float) (Math.asin(rotationMatrix[2] / cosA)) * rad2deg; resultingAngles[0] = (float) (Math.acos(rotationMatrix[5] / cosA)) * rad2deg; } if (MODUS == 11) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in y z x */ resultingAngles[0] = (float) (Math.asin(rotationMatrix[4])); final float cosC = (float) Math.cos(resultingAngles[0]); resultingAngles[0] = resultingAngles[0] * rad2deg; resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC)) * rad2deg; resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC)) * rad2deg; } if (MODUS == 12) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in x z y */ resultingAngles[0] = (float) (Math.asin(-rotationMatrix[1])); final float cosC = (float) Math.cos(resultingAngles[0]); resultingAngles[0] = resultingAngles[0] * rad2deg; resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC)) * rad2deg; resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC)) * rad2deg; } logOutput(); } /** * transposes the matrix because it was transposted (inverted, but here its * the same, because its a rotation matrix) to be used for opengl * * @param source * @return */ private float[] transpose(float[] source) { final float[] result = source.clone(); if (TRY_TRANSPOSED_VERSION) { result[1] = source[4]; result[2] = source[8]; result[4] = source[1]; result[6] = source[9]; result[8] = source[2]; result[9] = source[6]; } // the other values in the matrix are not relevant for rotations return result; } private void rootMeanSquareBuffer(float[] target, float[] values) { final float amplification = 200.0f; float buffer = 20.0f; target[0] += amplification; target[1] += amplification; target[2] += amplification; values[0] += amplification; values[1] += amplification; values[2] += amplification; target[0] = (float) (Math .sqrt((target[0] * target[0] * buffer + values[0] * values[0]) / (1 + buffer))); target[1] = (float) (Math .sqrt((target[1] * target[1] * buffer + values[1] * values[1]) / (1 + buffer))); target[2] = (float) (Math .sqrt((target[2] * target[2] * buffer + values[2] * values[2]) / (1 + buffer))); target[0] -= amplification; target[1] -= amplification; target[2] -= amplification; values[0] -= amplification; values[1] -= amplification; values[2] -= amplification; } private void loadNewSensorData(SensorEvent event) { final int type = event.sensor.getType(); if (type == Sensor.TYPE_ACCELEROMETER) { 
accelGData = event.values.clone(); } if (type == Sensor.TYPE_MAGNETIC_FIELD) { magnetData = event.values.clone(); } if (type == Sensor.TYPE_ORIENTATION) { orientationData = event.values.clone(); } } private void logOutput() { if (mCount++ > 30) { mCount = 0; Log.d("Compass", "yaw0: " + (int) (resultingAngles[0]) + " pitch1: " + (int) (resultingAngles[1]) + " roll2: " + (int) (resultingAngles[2])); } } }

    Read the article

  • Android: How to bind spinner to custom object list?

    - by niko
    Hi, the user interface needs a spinner that contains some names (the names are what is visible), and each name has its own ID (the IDs do not match the display order). When the user selects a name from the list, the variable currentID has to change. The application holds an ArrayList<User>, where User is an object with an ID and a name: public class User{ public int ID; public String name; } What I don't know is how to create a spinner that displays the list of user names and binds the spinner items to the IDs, so that when the selected item changes, currentID is set to the appropriate value. I would appreciate it if anyone could show a solution to this problem or provide a useful link. Thanks!
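    A common sketch of this: give User a toString() that returns the name, back the spinner with an ArrayAdapter<User>, and pull the ID back out in the selection callback (the spinner, userList, and currentID names are assumed from the question):

        ArrayAdapter<User> adapter = new ArrayAdapter<User>(this,
                android.R.layout.simple_spinner_item, userList);   // userList is the ArrayList<User>
        adapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
        spinner.setAdapter(adapter);
        spinner.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
            public void onItemSelected(AdapterView<?> parent, View view, int pos, long id) {
                currentID = ((User) parent.getItemAtPosition(pos)).ID;
            }
            public void onNothingSelected(AdapterView<?> parent) { }
        });
        // in User: @Override public String toString() { return name; }  // what the spinner displays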

    Read the article

  • Android: How to stretch an image to the screen width while maintaining aspect ratio?

    - by fredley
    I want to download an image (of unknown size, but which is always roughly square) and display it so that it fills the screen horizontally, and stretches vertically to maintain the aspect ratio of the image, on any screen size. Here is my (non-working) code. It stretches the image horizontally, but not vertically, so it is squashed... ImageView mainImageView = new ImageView(context); mainImageView.setImageBitmap(mainImage); //downloaded from server mainImageView.setScaleType(ScaleType.FIT_XY); //mainImageView.setAdjustViewBounds(true); //with this line enabled, just scales image down addView(mainImageView,new LinearLayout.LayoutParams( LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT));
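    One widely used sketch for this: keep FIT_XY off and size the view yourself by overriding onMeasure(), so the height always follows the drawable's aspect ratio at full width (the class name is a placeholder):

        public class FullWidthImageView extends ImageView {
            public FullWidthImageView(Context context) { super(context); }

            @Override
            protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
                Drawable d = getDrawable();
                if (d == null || d.getIntrinsicWidth() == 0) {
                    super.onMeasure(widthMeasureSpec, heightMeasureSpec);
                    return;
                }
                int width = MeasureSpec.getSize(widthMeasureSpec);
                int height = width * d.getIntrinsicHeight() / d.getIntrinsicWidth();
                setMeasuredDimension(width, height); // full width, proportional height
            }
        }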

    Read the article

  • Using a SimpleAdapter, how can I map images to a ListAdapter row? (Android SDK)

    - by nathanjosiah
    Here is the code that I have for the adapter: ArrayList<HashMap<String, Object>> mylist = new ArrayList<HashMap<String, Object>>(); for (CustomClass imageObject : myArray) { HashMap<String, Object> map = new HashMap<String, Object>(); BmpFromURL myBmp = new BmpFromURL(image.getURL()); map.put("image", myBmp.getMyBitmap()); mylist.add(map); } SimpleAdapter adapter = new SimpleAdapter(this, mylist, R.layout.series_list_row, new String[]{"image"}, new int[]{R.id.ImageView01}); this.setListAdapter(adapter); getMyBitmap() returns a Bitmap.
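    SimpleAdapter's default binding only knows how to put resource IDs or URI strings into an ImageView, so a Bitmap value generally needs a ViewBinder; a sketch that would slot in right after the adapter is created:

        adapter.setViewBinder(new SimpleAdapter.ViewBinder() {
            public boolean setViewValue(View view, Object data, String textRepresentation) {
                if (view instanceof ImageView && data instanceof Bitmap) {
                    ((ImageView) view).setImageBitmap((Bitmap) data);
                    return true;  // handled here, skip the default binding
                }
                return false;     // let SimpleAdapter handle everything else
            }
        });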

    Read the article

  • Class issue (The type XXX is already defined)

    - by Android Stack
    I have listview app exploring cities each row point to diffrent city , in each city activity include button when clicked open new activity which is infinite gallery contains pics of that city , i add infinite gallery to first city and work fine , when i want to add it to the second city , it gave me red mark error in the class as follow : 1- The type InfiniteGalleryAdapter is already defined. 2-The type InfiniteGallery is already defined. i tried to change class name with the same result ,i delete R.jave and eclipse rebuild it with same result also i uncheck the java builder from project properties ,get same red mark error. please any help and advice will be appreciated thanks My Code : public class SecondCity extends Activity { /** Called when the activity is first created. */ @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); Boolean customTitleSupported = requestWindowFeature(Window.FEATURE_CUSTOM_TITLE); // Set the layout to use setContentView(R.layout.main); if (customTitleSupported) { getWindow().setFeatureInt(Window.FEATURE_CUSTOM_TITLE,R.layout.custom_title); TextView tv = (TextView) findViewById(R.id.tv); Typeface face=Typeface.createFromAsset(getAssets(),"BFantezy.ttf"); tv.setTypeface(face); tv.setText("MY PICTURES"); } InfiniteGallery galleryOne = (InfiniteGallery) findViewById(R.id.galleryOne); galleryOne.setAdapter(new InfiniteGalleryAdapter(this)); } } class InfiniteGalleryAdapter extends BaseAdapter { **//red mark here (InfiniteGalleryAdapter)** private Context mContext; public InfiniteGalleryAdapter(Context c, int[] imageIds) { this.mContext = c; } public int getCount() { return Integer.MAX_VALUE; } public Object getItem(int position) { return position; } public long getItemId(int position) { return position; } private LayoutInflater inflater=null; public InfiniteGalleryAdapter(Context a) { this.mContext = a; inflater = (LayoutInflater)mContext.getSystemService ( Context.LAYOUT_INFLATER_SERVICE); } public class ViewHolder{ public TextView text; public ImageView image; } private int[] images = { R.drawable.pic_1, R.drawable.pic_2, R.drawable.pic_3, R.drawable.pic_4, R.drawable.pic_5 }; private String[] name = { "This is first picture (1) " , "This is second picture (2)", "This is third picture (3)", "This is fourth picture (4)", " This is fifth picture (5)", }; public View getView(int position, View convertView, ViewGroup parent) { ImageView i = getImageView(); int itemPos = (position % images.length); try { i.setImageResource(images[itemPos]); ((BitmapDrawable) i.getDrawable()). setAntiAlias (true); } catch (OutOfMemoryError e) { Log.e("InfiniteGalleryAdapter", "Out of memory creating imageview. 
Using empty view.", e); } View vi=convertView; ViewHolder holder; if(convertView==null){ vi = inflater.inflate(R.layout.gallery_items, null); holder=new ViewHolder(); holder.text=(TextView)vi.findViewById(R.id.textView1); holder.image=(ImageView)vi.findViewById(R.id.image); vi.setTag(holder); } else holder=(ViewHolder)vi.getTag(); holder.text.setText(name[itemPos]); final int stub_id=images[itemPos]; holder.image.setImageResource(stub_id); return vi; } private ImageView getImageView() { ImageView i = new ImageView(mContext); return i; } } class InfiniteGallery extends Gallery { **//red mark here (InfiniteGallery)** public InfiniteGallery(Context context) { super(context); init(); } public InfiniteGallery(Context context, AttributeSet attrs) { super(context, attrs); init(); } public InfiniteGallery(Context context, AttributeSet attrs, int defStyle) { super(context, attrs, defStyle); init(); } private void init(){ // These are just to make it look pretty setSpacing(25); setHorizontalFadingEdgeEnabled(false); } public void setResourceImages(int[] name){ setAdapter(new InfiniteGalleryAdapter(getContext(), name)); setSelection((getCount() / 2)); } }

    Read the article

  • How do I send DTMF tones and pauses using Android ACTION_CALL Intent with commas in the number?

    - by Rob Kent
    I have an application that calls a number stored by the user. Everything works okay unless the number contains commas or hash signs, in which case the Uri gets truncated after the digits. I have read that you need to encode the hash sign but even doing that, or without a hash sign, the commas never get passed through. However, they do get passed through if you just pick the number from your contacts. I must be doing something wrong. For example: String number = "1234,,,,4#1"; Uri uri = Uri.parse(String.format("tel:%s", number)); try { startActivity(new Intent(callType, uri)); } catch (ActivityNotFoundException e) { ... Only the number '1234' would end up in the dialer.
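    For reference, one commonly suggested workaround is to let the Uri class do the escaping instead of building the string by hand, so the '#' no longer cuts the scheme-specific part short; a sketch using the same number as the question:

        String number = "1234,,,,4#1";
        Uri uri = Uri.fromParts("tel", number, null);   // lets Uri handle escaping of the '#'
        try {
            startActivity(new Intent(Intent.ACTION_CALL, uri));
        } catch (ActivityNotFoundException e) {
            // no dialer available on this device
        }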

    Read the article

  • Android: OutOfMemoryError: bitmap size exceeds VM budget, with no reason I can see

    - by Meymann
    Hi. I am having an OutOfMemory exception with a gallery over 600x800 pixels JPEG's. The environment I've been using Gallery with JPG images around 600x800 pixels. Since my content may be a bit more complex than just images, I have set each view to be a RelativeLayout that wraps ImageView with the JPG. In order to "speed up" the user experience I have a simple cache of 4 slots that prefetches (in a looper) about 1 image left and 1 image right to the displayed image and keeps them in a 4 slot HashMap. The platform I am using AVD of 256 RAM and 128 Heap Size, with a 600x800 screen. It also happens on an Entourage Edge target, except that with the device it's harder to debug. The problem I have been getting an exception: OutofMemoryError: bitmap size exceeds VM budget And it happens when fetching the fifth image. I have tried to change the size of my image cache, and it is still the same. The strange thing: There should not be a memory problem In order to make sure the heap limit is very far away from what I need, I have defined a dummy 8MB array in the beginning, and left it unreferenced so it's immediately dispatched. It is a member of the activity thread and is defined as following static { @SuppressWarnings("unused") byte dummy[] = new byte[ 8*1024*1024 ]; } The result is that the heap size is nearly 11MB and it's all free. Note I have added that trick after it began to crash. It makes OutOfMemory less frequent. Now, I am using DDMS. Just before the crash (does not change much after the crash), DDMS shows: ID Heap Size Allocated Free %Used #Objects 1 11.195 MB 2.428 MB 8.767 MB 21.69% 47,156 And in the detail table it shows: Type Count Total Size Smallest Largest Median Average free 1,536 8.739MB 16B 7.750MB 24B 5.825KB The largest block is 7.7MB. And yet the LogCat says: ERROR/dalvikvm-heap(1923): 925200-byte external allocation too large for this process. If you mind the relation of the median and the average, it is plausible to assume that most of the available blocks are very small. However, there is a block large enough for the bitmap, it's 7.7M. How come it is still not enough? Note: I recorded a heap trace. When looking at the amount of data allocated, it does not feel like more than 2M is allocated. It does match the free memory report by DDMS. Could it be that I experience some problem like heap-fragmentation? How do I solve/workaround the problem? Is the heap shared to all threads? Could it be that I interpret the DDMS readout in a wrong way, and there is really no 900K block to allocate? If so, can anybody please tell me where I can see that? Thanks a lot Meymann
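    The "external allocation too large for this process" line is the key hint: on pre-3.0 Android, bitmap pixel data lives in a separately budgeted native (external) heap, so the Dalvik free-block figures from DDMS do not tell the whole story. A hedged sketch of the usual mitigations for a prefetching gallery cache like this one (the stream, cache, and key names are assumed): downsample at decode time and recycle bitmaps as they leave the cache:

        // decode with subsampling to shrink the pixel buffer
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inSampleSize = 2;                       // roughly quarters the decoded bitmap's memory
        Bitmap bmp = BitmapFactory.decodeStream(inputStream, null, opts);

        // when an entry is evicted from the 4-slot cache:
        Bitmap evicted = cache.remove(key);
        if (evicted != null) {
            evicted.recycle();                       // releases the external pixel allocation promptly
        }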

    Read the article

  • IllegalArgumentException: Service not registered:

    - by Android Dev
    I have multiple activities in my app; every activity calls bindService() and unbindService() to fetch data. In the first activity unbindService() works fine, but in the next activity, which reuses the same service, bindService() works fine while unbindService() throws IllegalArgumentException: Service not registered. Please help with this.
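    This exception typically appears when unbindService() is called with a ServiceConnection other than the one that activity actually bound with, or when the bind failed or was already undone. A sketch of the per-activity pattern; the DataService class and field names are assumed:

        private final ServiceConnection connection = new ServiceConnection() {
            public void onServiceConnected(ComponentName name, IBinder service) { /* grab the binder */ }
            public void onServiceDisconnected(ComponentName name) { }
        };
        private boolean bound;

        @Override
        protected void onStart() {
            super.onStart();
            bound = bindService(new Intent(this, DataService.class), connection, Context.BIND_AUTO_CREATE);
        }

        @Override
        protected void onStop() {
            super.onStop();
            if (bound) {
                unbindService(connection); // must be the same connection object passed to bindService()
                bound = false;
            }
        }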

    Read the article

  • How to pass parameters to a URLConnection in Java/Android?

    - by androidbase
    Hi all, I can establish a connection using HttpURLConnection; my code is below. client = new DefaultHttpClient(); URL action_url = new URL(actionUrl); conn = (HttpURLConnection) action_url.openConnection(); conn.setDoOutput(true); conn.setDoInput(true); conn.setRequestProperty("domain", "bschool.hbs.edu"); conn.setRequestProperty("userType", "2"); conn.setRequestProperty("referer", "http://www.alumni.hbs.edu/"); conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded"); conn.setRequestMethod(HttpPost.METHOD_NAME); DataOutputStream ds = new DataOutputStream(conn.getOutputStream()); String content = "username=username1&password=password11"; Log.v(TAG, "content: " + content); ds.writeBytes(content); ds.flush(); ds.close(); InputStream in = conn.getInputStream(); // getting FileNotFoundException here BufferedReader reader = new BufferedReader( new InputStreamReader(in)); StringBuilder str1 = new StringBuilder(); String line = null; while ((line = reader.readLine()) != null) { str1.append(line); Log.v(TAG, "line:" + line); } in.close(); s = str1.toString(); I am getting a FileNotFoundException and don't know why. Otherwise, please give me some suggestions for passing the username and password parameters to the URL in code.
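    With HttpURLConnection, getInputStream() throws FileNotFoundException whenever the server answers with an HTTP error status (4xx/5xx), so the first step is to check the status code and read the error body; it is also safer to URL-encode the form values. A sketch of the fragments that would change, meant to slot into the existing try block:

        String content = "username=" + URLEncoder.encode("username1", "UTF-8")
                       + "&password=" + URLEncoder.encode("password11", "UTF-8");
        // ... write content as before ...
        int status = conn.getResponseCode();
        Log.v(TAG, "HTTP status: " + status);
        InputStream in = (status >= 400) ? conn.getErrorStream() : conn.getInputStream();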

    Read the article

  • Including Android Activities (and their layouts) in JAR files.

    - by fiXedd
    I'm trying to write a library that can be shared as a JAR file. This library will include an Activity, and I'd like to include its layout in the JAR. Since it doesn't seem possible to include resource files in a JAR, and I don't want end users to have to add these files themselves, I was thinking it would be a nice hack to include the XML as a String and then manually inflate it. Does anyone know if this is possible? Any other ideas?
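    Note that LayoutInflater only accepts layouts that were compiled into binary XML at build time, so inflating a raw XML string at runtime is not an option; the usual fallback for a plain JAR is to build the Activity's view tree in code. A minimal sketch (the widget choices are purely illustrative):

        // inside the library Activity's onCreate()
        LinearLayout root = new LinearLayout(this);
        root.setOrientation(LinearLayout.VERTICAL);

        TextView title = new TextView(this);
        title.setText("Library screen"); // hard-coded, since string resources can't ship in the JAR either
        root.addView(title, new LinearLayout.LayoutParams(
                LinearLayout.LayoutParams.FILL_PARENT,
                LinearLayout.LayoutParams.WRAP_CONTENT));

        setContentView(root);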

    Read the article
