Search Results

Search found 19831 results on 794 pages for 'pc to android'.


  • Does a modern PC require a graphics card to run?

    - by ArtM
    As far as I can remember, on old systems (Pentium II or III, back when AGP cards were common) it was not possible to boot and run the PC if the graphics card was missing. For many years since then I've been using motherboards with integrated graphics, so I have no recent experience with this: a "graphics card" has always been present. I now intend to build a home/private "server" for my own purposes, and most of the motherboards I want to buy (AMD 870 or 970 chipsets) have no integrated graphics. I can borrow a normal graphics card from my friends for a few hours or days and use it while installing the necessary software. The question is: can I boot and run the PC without problems once everything I need is installed and the graphics card is removed? If a general answer cannot be given, some examples of manufacturers, motherboard series, or motherboard models would be helpful. It's probably obvious, but for completeness: I mean cheap desktop components, not real servers.

    Read the article

  • Need USB drivers for Nexus One and Mac Snow Leopard?

    - by melling
    I got an AT&T-compatible Nexus One that I'm trying to connect to my MacBook Pro (Snow Leopard) for development. When I run adb devices, the phone doesn't appear, and I can't run adb install either. Until today I've been using a G1 and haven't had any issues. I haven't put a SIM in the phone yet, but I don't think that should matter.

    Read the article

  • Send chrome tab from w7 laptop to w7 HTPC? (Like with iPad to AppleTV)

    - by Justin
    For the last couple of days I've been trying, to no avail, to figure out how to get open-tab syncing working between Chrome installs on different computers (if it's even supposed to work the way I think it should). I have a laptop that I do all my web browsing on. Once in a while I'll come across some video that's worthy of the big screen and the surround sound, and I want to open that tab (or media) on the HTPC. It'd be nice if I could just right-click, "Send to HTPC", and have it open there with no further hassle. But even opening Chrome on the HTPC and finding all my current tabs waiting would be fine. Alas, open-tab syncing doesn't seem to actually open tabs on other devices for me. Has anyone come up with a way to accomplish anything similar? Thanks all!

    Read the article

  • Virtual SD card won't transfer [migrated]

    - by Hyztname
    I have a Verizon Galaxy Nexus 32GB. I've installed AOKP on it and it has been running fine with no bugs. But after I connected it to my Nintendo Wii, I think the phone re-mounted the virtual SD card as FAT32 on its own. I'm now getting the following error when trying to download something with the 4Shared sync app (in fact, no app can download or transfer anything to the virtual SD card): XXXXX.apk: open failed: EACCES (Permission denied). I've already restored an old backup and wiped all data; nothing seems to work. Note that I can still read the SD card; I just can't transfer files, take pictures, or write any data to it. PS: I mention the 4Shared sync problem because it was the only app that reported anything specific.
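
    For reference, a small probe (a sketch of one thing worth ruling out, not taken from the original post) that checks whether Android itself reports the storage as writable; a blanket EACCES from every app often means the volume is mounted read-only or still claimed over USB rather than an app-specific bug. It needs the WRITE_EXTERNAL_STORAGE permission:

      // Returns a short diagnostic string describing whether external storage accepts writes.
      public static String probeExternalStorage() {
        String state = Environment.getExternalStorageState();
        if (!Environment.MEDIA_MOUNTED.equals(state)) {
          return "External storage not writable, state = " + state;
        }
        File probe = new File(Environment.getExternalStorageDirectory(), "write_probe.tmp");
        try {
          FileOutputStream out = new FileOutputStream(probe);
          out.write(42);
          out.close();
          probe.delete();
          return "Write to " + probe.getParent() + " succeeded";
        } catch (IOException e) {
          return "Write failed: " + e;  // EACCES shows up here
        }
      }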

    Read the article

  • How would I enable FTP on a home PC (Win7)?

    - by jp2code
    In my home, all PCs connect through a small router, some wired and some wireless. Our media PC (HTPC) is controlled with a Media Center-style TV remote, so managing files on it is tedious. I can access all of the files on the HTPC from my desktop PC, but moving a 6 GB file from \\HTPC\Folder1 to \\HTPC\Folder2 involves copying the data to the desktop and then transferring it back. If I were sitting at the HTPC, this would likely be handled by a simple address change for the file (i.e. it would be done almost instantly). I'm thinking that if I could get an FTP program to connect to the HTPC, I could simplify things, but how do I enable that ability on the HTPC? And how would I then go about connecting to it? Would I simply enter \\HTPC as the FTP address?

    Read the article

  • What security changes are necessary when connecting DSL modem directly to PC instead of router?

    - by Mike B
    Windows XP. I have a user with a single PC that was connected to the internet via a standard home router. The router is now having hardware issues, and to save money they're considering connecting the PC directly to the DSL modem, since they don't need to share the internet connection and don't need wireless. If they do that, I'm concerned it will introduce additional security risks. Are the Windows Firewall and Microsoft Security Essentials sufficient to protect a computer connected directly to a DSL modem, or is other security software needed here? Ideally, I'd like to avoid third-party firewall software that constantly brings up alerts and asks them to approve everything. Just to clarify, their use cases are only web browsing and email.

    Read the article

  • How do I use MediaRecorder to record video without causing a segmentation fault?

    - by rabidsnail
    I'm trying to use android.media.MediaRecorder to record video, and no matter what I do the android runtime segmentation faults when I call prepare(). Here's an example:

      public void onCreate(Bundle savedInstanceState) {
        Log.i("video test", "making recorder");
        MediaRecorder recorder = new MediaRecorder();
        contentResolver = getContentResolver();
        try {
          super.onCreate(savedInstanceState);
          Log.i("video test", "--------------START----------------");
          SurfaceView target_view = new SurfaceView(this);
          Log.i("video test", "making surface");
          Surface target = target_view.getHolder().getSurface();
          Log.i("video test", target.toString());
          Log.i("video test", "new recorder");
          recorder = new MediaRecorder();
          Log.i("video test", "set display");
          recorder.setPreviewDisplay(target);
          Log.i("video test", "pushing surface");
          setContentView(target_view);
          Log.i("video test", "set audio source");
          recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
          Log.i("video test", "set video source");
          recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
          Log.i("video test", "set output format");
          recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
          Log.i("video test", "set audio encoder");
          recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
          Log.i("video test", "set video encoder");
          recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
          Log.i("video test", "set max duration");
          recorder.setMaxDuration(3600);
          Log.i("video test", "set on info listener");
          recorder.setOnInfoListener(new listener());
          Log.i("video test", "set video size");
          recorder.setVideoSize(320, 240);
          Log.i("video test", "set video frame rate");
          recorder.setVideoFrameRate(15);
          Log.i("video test", "set output file");
          recorder.setOutputFile(get_path(this, "foo.3gp"));
          Log.i("video test", "prepare");
          recorder.prepare();
          Log.i("video test", "start");
          recorder.start();
          Log.i("video test", "sleep");
          Thread.sleep(3600);
          Log.i("video test", "stop");
          recorder.stop();
          Log.i("video test", "release");
          recorder.release();
          Log.i("video test", "-----------------SUCCESS------------------");
          finish();
        } catch (Exception e) {
          Log.i("video test", e.toString());
          recorder.reset();
          recorder.release();
          Log.i("video tets", "-------------------FAIL-------------------");
          finish();
        }
      }

      public static String get_path(Context context, String fname) {
        String path = context.getFileStreamPath("foo").getParentFile().getAbsolutePath();
        String res = path + "/" + fname;
        Log.i("video test", "path: " + res);
        return res;
      }

      class listener implements MediaRecorder.OnInfoListener {
        public void onInfo(MediaRecorder recorder, int what, int extra) {
          Log.i("video test", "Video Info: " + what + ", " + extra);
        }
      }
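
    A likely culprit (an assumption from the code above, not a confirmed diagnosis) is that the preview Surface is not yet valid when prepare() runs: the SurfaceView is created and handed to setPreviewDisplay() before it has ever been attached, measured, or had its surface created. A minimal sketch that defers the recorder setup until SurfaceHolder.Callback.surfaceCreated() fires:

      SurfaceView preview = new SurfaceView(this);
      setContentView(preview);
      preview.getHolder().addCallback(new SurfaceHolder.Callback() {
        public void surfaceCreated(SurfaceHolder holder) {
          // The surface is guaranteed to exist here, so the recorder can use it.
          MediaRecorder recorder = new MediaRecorder();
          recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
          recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
          recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
          recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
          recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
          recorder.setOutputFile(getFileStreamPath("foo.3gp").getAbsolutePath());
          recorder.setPreviewDisplay(holder.getSurface());
          try {
            recorder.prepare();
            recorder.start();
          } catch (Exception e) {
            Log.e("video test", "prepare/start failed", e);
            recorder.release();
          }
        }
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}
        public void surfaceDestroyed(SurfaceHolder holder) {}
      });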

    Read the article

  • Launching Intent.ACTION_VIEW intent not working on saved image file

    - by Savvas Dalkitsis
    First of all, let me say that this question is slightly connected to another question of mine; actually, it was created because of that. I have the following code to write a bitmap downloaded from the net to a file on the SD card:

      // Get image from url
      URL u = new URL(url);
      HttpGet httpRequest = new HttpGet(u.toURI());
      HttpClient httpclient = new DefaultHttpClient();
      HttpResponse response = (HttpResponse) httpclient.execute(httpRequest);
      HttpEntity entity = response.getEntity();
      BufferedHttpEntity bufHttpEntity = new BufferedHttpEntity(entity);
      InputStream instream = bufHttpEntity.getContent();
      Bitmap bmImg = BitmapFactory.decodeStream(instream);
      instream.close();

      // Write image to a file in sd card
      File posterFile = new File(Environment.getExternalStorageDirectory().getAbsolutePath()
          + "/Android/data/com.myapp/files/image.jpg");
      posterFile.createNewFile();
      BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(posterFile));
      Bitmap mutable = Bitmap.createScaledBitmap(bmImg, bmImg.getWidth(), bmImg.getHeight(), true);
      mutable.compress(Bitmap.CompressFormat.JPEG, 100, out);
      out.flush();
      out.close();

      // Launch default viewer for the file
      Intent intent = new Intent();
      intent.setAction(android.content.Intent.ACTION_VIEW);
      intent.setDataAndType(Uri.parse(posterFile.getAbsolutePath()), "image/*");
      ((Activity) getContext()).startActivity(intent);

    A few notes: I create the "mutable" bitmap after seeing someone else use it, and it seems to work better than without it. I use the parse() method on the Uri class rather than fromFile() because in my real code these calls happen in different places, and when I create the intent I have a string path rather than a File. Now for my problem. The file gets created, and the intent launches a dialog asking me to select a viewer. I have three viewers installed: the Astro image viewer, the default Media Gallery (I have a Milestone on 2.1, but the Milestone's 2.1 update did not include the 3D gallery, so it's the old one), and the 3D gallery from the Nexus One (I found the APK in the wild). When I launch the three viewers, the following happens:

    - Astro image viewer: the activity launches, but I see nothing but a black screen.
    - Media Gallery: I get an exception dialog, "The application Media Gallery (process com.motorola.gallery) has stopped unexpectedly. Please try again", with a force-close option.
    - 3D gallery: everything works as it should.

    When I instead open the file from the Astro file manager (browse to it and simply tap it), I get the same chooser dialog, but this time things are different:

    - Astro image viewer: everything works as it should.
    - Media Gallery: everything works as it should.
    - 3D gallery: the activity launches, but I see nothing but a black screen.

    As you can see, it's a complete mess. I have no idea why this happens, but it happens this way every single time; it's not a random bug. Am I missing something when I create the intent, or when I create the image file? Any ideas?
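
    For reference, the line that looks suspicious is Uri.parse(posterFile.getAbsolutePath()): a bare path has no scheme, and some viewers only handle file:// (or content://) URIs. A small sketch of that guess (an assumption, not a verified fix), still starting from a string path but adding an explicit file scheme:

      // Build the viewing intent from a string path, but hand the viewer a real file:// URI.
      String path = posterFile.getAbsolutePath();   // or any stored string path
      Uri data = Uri.fromFile(new File(path));      // yields file:///mnt/sdcard/...
      Intent view = new Intent(Intent.ACTION_VIEW);
      view.setDataAndType(data, "image/jpeg");      // a concrete type instead of image/*
      ((Activity) getContext()).startActivity(view);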

    Read the article

  • Peer did not return a certificate

    - by pfista
    I am trying to get two-way SSL authentication working between a Python server and an Android client application. I have access to both the server and the client, and would like to implement client authentication using my own certificate. So far I have been able to verify the server certificate and connect without client authentication. What sort of certificate does the client need, and how do I get it to automatically send it to the server during the handshake? Here is the client- and server-side code that I have so far. Is my approach wrong?

    Server code:

      while True:
          # Keep listening for clients
          c, fromaddr = sock.accept()
          ssl_sock = ssl.wrap_socket(c,
                                     keyfile = "serverPrivateKey.pem",
                                     certfile = "servercert.pem",
                                     server_side = True,
                                     # Require the client to provide a certificate
                                     cert_reqs = ssl.CERT_REQUIRED,
                                     ssl_version = ssl.PROTOCOL_TLSv1,
                                     ca_certs = "clientcert.pem",  # TODO must point to a file of CA certificates??
                                     do_handshake_on_connect = True,
                                     ciphers = "!NULL:!EXPORT:AES256-SHA")
          print ssl_sock.cipher()
          thrd = sock_thread(ssl_sock)
          thrd.daemon = True
          thrd.start()

    I suspect I may be using the wrong file for ca_certs...?

    Client code:

      private boolean connect() {
        try {
          KeyStore keystore = KeyStore.getInstance("BKS");   // Stores the client certificate, to be sent to server
          KeyStore truststore = KeyStore.getInstance("BKS"); // Stores the server certificate we want to trust

          // TODO: change hard coded password... THIS IS REAL BAD MKAY
          truststore.load(mSocketService.getResources().openRawResource(R.raw.truststore), "test".toCharArray());
          keystore.load(mSocketService.getResources().openRawResource(R.raw.keystore), "test".toCharArray());

          // Use the key manager for client authentication. Keys in the key manager will be sent to the host
          KeyManagerFactory keyFManager = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
          keyFManager.init(keystore, "test".toCharArray());

          // Use the trust manager to determine if the host I am connecting to is a trusted host
          TrustManagerFactory trustMFactory = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
          trustMFactory.init(truststore);

          // Create the socket factory and add both the trust manager and key manager
          SSLCertificateSocketFactory socketFactory = (SSLCertificateSocketFactory) SSLCertificateSocketFactory
              .getDefault(5000, new SSLSessionCache(mSocketService));
          socketFactory.setTrustManagers(trustMFactory.getTrustManagers());
          socketFactory.setKeyManagers(keyFManager.getKeyManagers());

          // Open SSL socket directly to host, host name verification is NOT performed here due to
          // SSLCertificateFactory implementation
          mSSLSocket = (SSLSocket) socketFactory.createSocket(mHostname, mPort);
          mSSLSocket.setSoTimeout(TIMEOUT);

          // Most SSLSocketFactory implementations do not verify the server's identity, allowing man-in-the-middle
          // attacks. This implementation (SSLCertificateSocketFactory) does check the server's certificate hostname,
          // but only for createSocket variants that specify a hostname. When using methods that use InetAddress or
          // which return an unconnected socket, you MUST verify the server's identity yourself to ensure a secure
          // connection.
          verifyHostname();

          // Safe to proceed with socket now
          ...

    I have generated a client private key, a client certificate, a server private key, and a server certificate using openssl. I then added the client certificate to keystore.bks (which I store in /res/raw/keystore.bks), and the server certificate to truststore.bks. Now when the client tries to connect, I get this error on the server side:

      ssl.SSLError: [Errno 1] _ssl.c:504: error:140890C7:SSL routines:SSL3_GET_CLIENT_CERTIFICATE:peer did not return a certificate

    And when I try the following in the Android client:

      SSLSession s = mSSLSocket.getSession();
      s.getPeerCertificates();

    I get this error:

      javax.net.ssl.SSLPeerUnverifiedException: No peer certificate

    So the keystore I am using apparently doesn't have a correct certificate in it and thus isn't sending one to the server. What should I put in the keystore to prevent this exception? Furthermore, is this method of two-way SSL authentication safe and effective?
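
    For what it's worth, a likely cause (an assumption drawn from the symptoms, not a verified fix) is that keystore.bks only contains the client certificate: a KeyManager can only present a certificate for which it also holds the private key, so with no key entry it silently sends nothing. A sketch of building a keystore with a proper key entry, where R.raw.client_p12 is a hypothetical PKCS#12 file exported with openssl that bundles the client key and certificate:

      // Load the client key + certificate chain from a PKCS#12 bundle...
      KeyStore p12 = KeyStore.getInstance("PKCS12");
      p12.load(mSocketService.getResources().openRawResource(R.raw.client_p12), "test".toCharArray());
      String alias = p12.aliases().nextElement();
      PrivateKey clientKey = (PrivateKey) p12.getKey(alias, "test".toCharArray());
      Certificate[] chain = p12.getCertificateChain(alias);

      // ...and store them in the BKS keystore as a single key entry.
      KeyStore keystore = KeyStore.getInstance("BKS");
      keystore.load(null, null);  // start empty
      keystore.setKeyEntry("client", clientKey, "test".toCharArray(), chain);

      KeyManagerFactory keyFManager = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
      keyFManager.init(keystore, "test".toCharArray());  // now has a certificate it can actually send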

    Read the article

  • MediaPlayer failure exception

    - by Rahulkapil
    I am working on an Android application in which I have to play random sounds from my assets folder. There are also some images; when I tap one of those images, the sound corresponding to that image must play from the assets folder. I have it mostly working, but sometimes my MediaPlayer fails unexpectedly. I am attaching my code:

      private Handler threadHandler = new Handler() {
        public void handleMessage(android.os.Message msg) {
          /*first*/
          try {
            InputStream ims1 = getAssets().open("images/" + dataAll_pic_name1);
            d1 = Drawable.createFromStream(ims1, null);
            rl1.setVisibility(View.VISIBLE);
            img1.setImageDrawable(d1);
            AssetFileDescriptor afd = getAssets().openFd("sounds/" + str_snd1);
            mp2 = new MediaPlayer();
            mp2.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            mp2.prepare();
            mp2.start();
            mp2.setOnCompletionListener(new OnCompletionListener() {
              @Override
              public void onCompletion(MediaPlayer mp) {
                /*second*/
                try {
                  InputStream ims2 = getAssets().open("images/" + dataAll_pic_name2);
                  d2 = Drawable.createFromStream(ims2, null);
                  rl2.setVisibility(View.VISIBLE);
                  img2.setImageDrawable(d2);
                  AssetFileDescriptor afd = getAssets().openFd("sounds/" + str_snd2);
                  mp2 = new MediaPlayer();
                  mp2.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
                  mp2.prepare();
                  mp2.start();
                  mp2.setOnCompletionListener(new OnCompletionListener() {
                    @Override
                    public void onCompletion(MediaPlayer mp) {
                      /*third*/
                      try {
                        InputStream ims3 = getAssets().open("images/" + dataAll_pic_name3);
                        d3 = Drawable.createFromStream(ims3, null);
                        rl3.setVisibility(View.VISIBLE);
                        img3.setImageDrawable(d3);
                        AssetFileDescriptor afd = getAssets().openFd("sounds/" + str_snd3);
                        mp2 = new MediaPlayer();
                        mp2.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
                        mp2.prepare();
                        mp2.start();
                        mp2.setOnCompletionListener(new OnCompletionListener() {
                          @Override
                          public void onCompletion(MediaPlayer mp) {
                            /*four*/
                            try {
                              InputStream ims4 = getAssets().open("images/" + dataAll_pic_name4);
                              d4 = Drawable.createFromStream(ims4, null);
                              rl4.setVisibility(View.VISIBLE);
                              img4.setImageDrawable(d4);
                              AssetFileDescriptor afd = getAssets().openFd("sounds/" + str_snd4);
                              mp2 = new MediaPlayer();
                              mp2.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
                              mp2.prepare();
                              mp2.start();
                              mp2.setOnCompletionListener(new OnCompletionListener() {
                                @Override
                                public void onCompletion(MediaPlayer mp) {
                                  startAnimation();
                                  // randomSoundPlay();
                                  timer.schedule(new TimerTask() {
                                    public void run() {
                                      System.out.println("Wait, what........................:");
                                      try {
                                        AssetFileDescriptor afd = getAssets().openFd("sounds/" + dataAll_sound_name);
                                        mp2 = new MediaPlayer();
                                        mp2.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
                                        mp2.prepare();
                                        mp2.start();
                                        mp2.setOnCompletionListener(new OnCompletionListener() {
                                          @Override
                                          public void onCompletion(MediaPlayer mp) {
                                            vg1.setClickable(true);
                                            vg2.setClickable(true);
                                            vg3.setClickable(true);
                                            vg4.setClickable(true);
                                            btn_spkr.setVisibility(View.VISIBLE);
                                            txtImage();
                                          }
                                        });
                                      } catch (Exception e) {
                                        e.printStackTrace();
                                      }
                                    }
                                  }, delay_que);
                                }
                              });
                            } catch (Exception e) {
                              e.printStackTrace();
                            }
                          }
                        });
                      } catch (Exception e) {
                        e.printStackTrace();
                      }
                    }
                  });
                } catch (Exception e) {
                  e.printStackTrace();
                }
              }
            });
          } catch (Exception e) {
            e.printStackTrace();
          }
        }
      };

    In the code above, random images and sounds are set in my activity. When I tap an image, its sound should play, but sometimes it fails. I have tried but have been unable to resolve this issue. Help me out. Thanks in advance.
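
    One refactoring worth considering (a sketch; playAsset and the Runnable callback are illustrative names, not code from the post) is to route every playback through a single helper that releases the previous MediaPlayer before creating the next one, instead of re-assigning mp2 inside each nested listener. Accumulating un-released players is a common reason MediaPlayer suddenly starts failing:

      private MediaPlayer player;

      // Plays one asset sound, releasing whatever was playing before,
      // and runs onDone when playback completes.
      private void playAsset(String assetPath, final Runnable onDone) {
        try {
          if (player != null) {
            player.release();  // never leak the previous instance
            player = null;
          }
          AssetFileDescriptor afd = getAssets().openFd(assetPath);
          player = new MediaPlayer();
          player.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
          afd.close();
          player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            @Override
            public void onCompletion(MediaPlayer mp) {
              if (onDone != null) onDone.run();
            }
          });
          player.prepare();
          player.start();
        } catch (Exception e) {
          e.printStackTrace();
        }
      }

    Chaining the four image/sound pairs then becomes four small Runnables (or a queue) rather than four copies of the full setup block.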

    Read the article

  • Can't get texture to work

    - by user583713
    It's been a while since I used Android OpenGL, but for whatever reason I get a white square box instead of the texture I want on the screen. I don't think this matters, but just in case: I put a LinearLayout view first and then the SurfaceView. Anyway, here is my code:

      public class GameEngine {
        private float vertices[];
        private float textureUV[];
        private int[] textureId = new int[1];
        private FloatBuffer vertextBuffer;
        private FloatBuffer textureBuffer;
        private short indices[] = {0, 1, 2, 2, 1, 3};
        private ShortBuffer indexBuffer;
        private float x, y, z;
        private float rot, rotX, rotY, rotZ;

        public GameEngine() {
        }

        public void setEngine(float x, float y, float vertices[]) {
          this.x = x;
          this.y = y;
          this.vertices = vertices;
          ByteBuffer vbb = ByteBuffer.allocateDirect(this.vertices.length * 4);
          vbb.order(ByteOrder.nativeOrder());
          vertextBuffer = vbb.asFloatBuffer();
          vertextBuffer.put(this.vertices);
          vertextBuffer.position(0);
          vertextBuffer.clear();
          ByteBuffer ibb = ByteBuffer.allocateDirect(indices.length * 2);
          ibb.order(ByteOrder.nativeOrder());
          indexBuffer = ibb.asShortBuffer();
          indexBuffer.put(indices);
          indexBuffer.position(0);
          indexBuffer.clear();
        }

        public void draw(GL10 gl) {
          gl.glLoadIdentity();
          gl.glTranslatef(x, y, z);
          gl.glRotatef(rot, rotX, rotY, rotZ);
          gl.glBindTexture(GL10.GL_TEXTURE_2D, textureId[0]);
          gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
          gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
          gl.glVertexPointer(2, GL10.GL_FLOAT, 0, vertextBuffer);
          gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
          gl.glDrawElements(GL10.GL_TRIANGLES, indices.length, GL10.GL_UNSIGNED_SHORT, indexBuffer);
          gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
          gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
        }

        public void LoadTexture(float textureUV[], GL10 gl, InputStream is) throws IOException {
          this.textureUV = textureUV;
          ByteBuffer tbb = ByteBuffer.allocateDirect(this.textureUV.length * 4);
          tbb.order(ByteOrder.nativeOrder());
          textureBuffer = tbb.asFloatBuffer();
          textureBuffer.put(this.textureUV);
          textureBuffer.position(0);
          textureBuffer.clear();
          Bitmap bitmap = null;
          try {
            bitmap = BitmapFactory.decodeStream(is);
          } finally {
            try {
              is.close();
              is = null;
              gl.glGenTextures(1, textureId, 0);
              gl.glBindTexture(GL10.GL_TEXTURE_2D, textureId[0]);
              gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
              gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
              // Different possible texture parameters, e.g. GL10.GL_CLAMP_TO_EDGE
              gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_REPEAT);
              gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_REPEAT);
              // Use the Android GLUtils to specify a two-dimensional texture image from our bitmap
              GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
              gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
              // Clean up
              gl.glBindTexture(GL10.GL_TEXTURE_2D, textureId[0]);
              bitmap.recycle();
            } catch (IOException e) {
            }
          }
        }

        public void setVector(float x, float y, float z) {
          this.x = x;
          this.y = y;
          this.z = z;
        }

        public void setRot(float rot, float x, float y, float z) {
          this.rot = rot;
          this.rotX = x;
          this.rotY = y;
          this.rotZ = z;
        }

        public float getZ() {
          return z;
        }
      }
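
    A white quad with otherwise working geometry is the classic symptom of texturing never being enabled, or of the texture being uploaded outside the GL thread. A minimal sketch of the renderer-side setup this class seems to assume (engine, context, textureUV, and "texture.png" are illustrative names; onSurfaceCreated/onDrawFrame are the standard GLSurfaceView.Renderer callbacks):

      public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        gl.glEnable(GL10.GL_TEXTURE_2D);  // without this the quad is drawn untextured (white)
        gl.glEnable(GL10.GL_BLEND);       // only needed if the texture has alpha
        try {
          // Upload on the GL thread; the bitmap should be power-of-two sized on older GPUs.
          engine.LoadTexture(textureUV, gl, context.getAssets().open("texture.png"));
        } catch (IOException e) {
          Log.e("GameEngine", "texture load failed", e);
        }
      }

      public void onDrawFrame(GL10 gl) {
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
        engine.draw(gl);
      }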

    Read the article

  • Events do not propagate from a child element?

    - by Legend
    I was playing around with the Swipe jQuery plugin on my iPod Touch and Android devices. The following works:

      <html>
        <head>
          <script type="text/javascript" src="lib/jquery/jquery-1.3.2.js"></script>
          <script type="text/javascript" src="lib/plugins/jquery.swipe.js"></script>
          <script type="text/javascript">
            $(function() {
              $('body').swipe({
                swipeLeft: function() { $('#container1').append("Swiped Left!"); },
                swipeRight: function() { $('#container2').append("Swiped Right!"); }
              });
            });
          </script>
          <style type="text/javascript">
            body {width: 300px; height: 300px; background: #000;}
          </style>
        </head>
        <body>
          <div id="container1"> This is container one </div>
          <div id="container2"> This is container two </div>
        </body>
      </html>

    But if I have something like this:

      <html>
        <head>
          <script type="text/javascript" src="lib/jquery/jquery-1.3.2.js"></script>
          <script type="text/javascript" src="lib/plugins/jquery.swipe.js"></script>
          <script type="text/javascript">
            $(function() {
              $('#wrapper').swipe({
                swipeLeft: function() { $('#container1').append("Swiped Left!"); },
                swipeRight: function() { $('#container2').append("Swiped Right!"); }
              });
            });
          </script>
          <style type="text/javascript">
            body {width: 300px; height: 300px; background: #000;}
          </style>
        </head>
        <body>
          <div id="wrapper">
            <div id="container1"> This is container one </div>
            <div id="container2"> This is container two </div>
          </div>
        </body>
      </html>

    Notice the "wrapper" div around the containers. When I swipe on that div element, I expected it to trigger the event. This works on the iPod Touch as expected, but it does not work on my Android device unless I randomly swipe everywhere until I happen to hit that small wrapper div itself. I'm not sure how to explain this, but think of it as having to send the events to the wrapper div itself. Both devices use the WebKit engine. Can someone tell me if I am doing something wrong?

    Read the article

  • Eclipse w/ ADT, Save Failed Illegal Value -1

    - by bgenchel
    I'm trying to save and run my Android project in Eclipse, but every time I do, I get a pop-up with the following message: Save Failed, Illegal Value -1. I'm not sure why I'm getting this error; I did just make some changes, but none of them were major, and it was working just a bit ago. The worst part is that I'm not getting any kind of pointer, in the console or in LogCat, toward what is causing it. Has anyone experienced this before? If so, what can I do? Thanks in advance!

    Read the article

  • SoapFault: Couldn't create SOAP message

    - by polarw
    My Android application uses a ksoap2 SOAP web service client to call a remote method. Sometimes the call returns the exception below. When I call the same service with SoapUI, it never occurs.

      11-23 16:19:30.085: SoapFault - faultcode: 'S:Client' faultstring: 'Couldn't create SOAP message due to exception: Unable to create StAX reader or writer' faultactor: 'null' detail: null
      11-23 16:19:30.085:   at org.ksoap2.serialization.SoapSerializationEnvelope.parseBody(SoapSerializationEnvelope.java:121)
      11-23 16:19:30.085:   at org.ksoap2.SoapEnvelope.parse(SoapEnvelope.java:137)
      11-23 16:19:30.085:   at org.ksoap2.transport.Transport.parseResponse(Transport.java:63)
      11-23 16:19:30.085:   at org.ksoap2.transport.HttpTransportSE.call(HttpTransportSE.java:104)
      11-23 16:19:30.085:   at com.mobilebox.webservice.CommonWSClient.callWS(CommonWSClient.java:247)
      11-23 16:19:30.085:   at com.mobilebox.webservice.CommonWSClient.access$1(CommonWSClient.java:217)
      11-23 16:19:30.085:   at com.mobilebox.webservice.CommonWSClient$WSHandle.run(CommonWSClient.java:201)
      11-23 16:19:30.085:   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1088)
      11-23 16:19:30.085:   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:581)
      11-23 16:19:30.085:   at java.lang.Thread.run(Thread.java:1019)

    Read the article

  • How to implement a custom AlertDialog View

    - by stormin986
    The Android docs on AlertDialog give the following instruction and example for setting a custom view in an AlertDialog:

      If you want to display a more complex view, look up the FrameLayout called "body" and add your view to it:

        FrameLayout fl = (FrameLayout) findViewById(R.id.body);
        fl.add(myView, new LayoutParams(FILL_PARENT, WRAP_CONTENT));

    First off, it's pretty obvious that add() is a typo and is meant to be addView(). What confuses me is the first line's use of R.id.body. It seems to refer to the body element of the AlertDialog, but I can't just use that in my code because it gives a compile error. Where does R.id.body get defined or assigned?
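
    R.id.body appears to refer to an id inside the framework's own dialog layout, which is why it doesn't resolve from application code. A more usual route for a custom dialog body is AlertDialog.Builder.setView() with a view you inflate yourself; a brief sketch, where R.layout.my_dialog_body is a hypothetical layout in your own project:

      LayoutInflater inflater = LayoutInflater.from(this);
      View body = inflater.inflate(R.layout.my_dialog_body, null);  // your own layout resource

      AlertDialog dialog = new AlertDialog.Builder(this)
          .setTitle("Custom view")
          .setView(body)                                            // becomes the dialog's content area
          .setPositiveButton(android.R.string.ok, null)
          .create();
      dialog.show();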

    Read the article

  • How can I place a SurfaceView inside one or more ScrollViews?

    - by Mervin
    Hello, I'm making a 2D game for Android and I've run into the following problem: I have made a class that extends SurfaceView and draws my graphics (the canvas I use has to be 1664x864). This is of course too big for the screen, so I need scrolling. I chose to use Google's ScrollView and HorizontalScrollView (nested) for this, but whenever I add my SurfaceView to a ScrollView (whether one or both), only the ScrollView is drawn; surfaceCreated() isn't even called on the SurfaceView. (Drawing the SurfaceView in a layout without adding it to a ScrollView does work.) I realize there are other options for scrolling, like a screen-sized canvas and bitmap offsets, but this would really be my preferred way to go. I would really appreciate some help to make this work.
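
    In case the ScrollView route stays stubborn, the fallback mentioned above (a screen-sized SurfaceView that scrolls its own world) can be quite small. A sketch, assuming scrollX/scrollY are updated from onTouchEvent() and clamped to the 1664x864 world, and drawWorld() is a hypothetical method that draws the full scene:

      private float scrollX, scrollY;  // updated from onTouchEvent(), clamped to the world bounds

      private void drawFrame(SurfaceHolder holder) {
        Canvas canvas = holder.lockCanvas();
        if (canvas == null) return;
        try {
          canvas.drawColor(Color.BLACK);
          canvas.save();
          canvas.translate(-scrollX, -scrollY);  // shift the 1664x864 world under the viewport
          drawWorld(canvas);
          canvas.restore();
        } finally {
          holder.unlockCanvasAndPost(canvas);
        }
      }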

    Read the article

  • Ndk-build: CreateProcess: make (e=87): The parameter is incorrect

    - by user1514958
    I get an error when building a static library with the NDK on Windows:

      process_begin: CreateProcess( "PATH"\android-ndk-r8b\toolchains\arm-linux-androideabi-4.6\prebuilt\windows\bin\arm-linux-androideabi-ar.exe, "some other commands" ) failed.
      make (e=87): The parameter is incorrect.
      make: *** [obj/local/armeabi-v7a/staticlib.a] Error 87
      make: *** Waiting for unfinished jobs....

    All source files build successfully; the error occurs when the object files are combined into the archive. I don't get this error when building the project on Ubuntu; it happens only on Windows. I think I've found the cause: the second parameter of the CreateProcess Win32 API function, lpCommandLine, has a maximum length of 32,768 characters, and in my case the command line is longer than that. How can I solve this issue?

    Read the article

  • How to mount userdata.img or userdata-qemu.img in OS X

    - by misbell
    Disk Utility in OS X easily mounts an SD card image as a device, but not the other .img files. I want to get the database I just created in the Android emulator off that drive and into my OS X file system. I installed qemu with MacPorts, but no combination I try succeeds. Has anyone figured out how to do this? Obviously, one way would be to run the app on my phone and then mount the phone as a USB drive, but I'd rather not; I want to get it off the drive the emulator uses. :-) Thanks in advance, folks. Michael

    Read the article

  • Handle existing instance of root activity when launching root activity again from intent filter

    - by Robert
    Hi, I'm having difficulties handling multiple instances of my root (main) activity. My app has an intent filter in place to launch the application when an email attachment is opened from the "Email" app. My problem is that if I launch my application first from the Android launcher and then launch it again by opening an email attachment, two instances of my root activity are created. Steps: launch root activity A and press Home; open the email attachment, and the intent filter triggers a second launch of root activity A. When the email attachment is opened and the OS launches my application, is it possible to detect that an instance is already running and either reuse it or remove/clear the old instance?
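
    One common arrangement (an illustration, assuming android:launchMode="singleTask" is set on the root activity in the manifest, which is the part that actually prevents the second instance) is to let the system route the attachment intent to the existing instance and handle it in onNewIntent():

      // With android:launchMode="singleTask", a second launch re-uses the existing
      // instance and delivers its intent here instead of creating another copy.
      @Override
      protected void onNewIntent(Intent intent) {
        super.onNewIntent(intent);
        setIntent(intent);  // keep getIntent() pointing at the latest launch
        if (Intent.ACTION_VIEW.equals(intent.getAction()) && intent.getData() != null) {
          openAttachment(intent.getData());  // hypothetical handler for the email attachment
        }
      }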

    Read the article

  • How do I parse JSON from a Java HTTPResponse?

    - by Joe Ludwig
    I have an HttpResponse object for a web request I just made. The response is in the JSON format, so I need to parse it. I can do it in an absurdly complex way, but it seems like there must be a better way. Is this really the best I can do?

      HttpResponse response; // some response object
      Reader in = new BufferedReader(
          new InputStreamReader(response.getEntity().getContent(), "UTF-8"));
      StringBuilder builder = new StringBuilder();
      char[] buf = new char[1000];
      int l = 0;
      while (l >= 0) {
        builder.append(buf, 0, l);
        l = in.read(buf);
      }
      JSONTokener tokener = new JSONTokener(builder.toString());
      JSONArray finalResult = new JSONArray(tokener);

    I'm on Android if that makes any difference.
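
    For comparison, a shorter route (a sketch using org.apache.http.util.EntityUtils, which belongs to the same HttpClient stack the code above already uses) collapses the read loop into a single call:

      // Read the whole entity as a string, then hand it straight to org.json.
      String body = EntityUtils.toString(response.getEntity(), "UTF-8");
      JSONArray finalResult = new JSONArray(body);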

    Read the article
