Search Results

Search found 12035 results on 482 pages for 'android emulator'.

Page 337/482 | < Previous Page | 333 334 335 336 337 338 339 340 341 342 343 344  | Next Page >

  • Java Socket Connection is flooding network OR resulting in high ping

    - by user1461100
    I have a little problem with my Java socket code. I'm writing an Android client application which sends data to a Java multithreaded socket server on my PC through a direct(!) wireless connection. It works fine, but I want to improve it for mobile use because it is currently very power-hungry. When I remove two particular lines in my code, the CPU usage of my mobile device (HTC One X) is fine, but then my connection seems to suffer from high ping times or something like that... Here is a server code snippet where I receive the client's data: while(true) { try { .... Object obj = in.readObject(); if(obj != null) { Class clazz = obj.getClass(); String className = clazz.getName(); if(className.equals("java.lang.String")) { String cmd = (String)obj; if(cmd.equals("dc")) { System.out.println("Client "+id+" disconnected!"); Server.connectedClients[id-1] = false; break; } if(cmd.substring(0,1).equals("!")) { robot.keyRelease(PlayerEnum.getKey(cmd,id)); } else { robot.keyPress(PlayerEnum.getKey(cmd,id)); } } } } catch .... Here's the client part, where I send my data in a while loop: private void networking() { try { if(client != null) { .... out.writeObject(sendQueue.poll()); .... } } catch .... When I write it this way, data is sent every time the while loop executes; when sendQueue is empty, a null "Object" is sent. This results in "high" network traffic and "high" CPU usage. BUT: all sent commands are received nearly immediately. When I change the code to the following: while(true) ... if(sendQueue.peek() != null) { out.writeObject(sendQueue.poll()); } ... the CPU usage is fine, but I get some lag: the commands do not arrive fast enough. As I said, it works fine (apart from CPU usage) if I send data (with those null objects) on every loop iteration, but I'm sure this is very rough coding style because I'm essentially flooding the network. Any hints? What am I doing wrong? Thanks for your help! Sincerely yours, maaft
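
    One way to get both low CPU usage and immediate delivery is to block on the queue instead of polling it: BlockingQueue.take() suspends the sender thread until a command is available, so nothing is sent while idle and real commands still go out at once. A minimal sketch, assuming sendQueue can be declared as a LinkedBlockingQueue and out is the existing ObjectOutputStream:

        // Hypothetical sender loop: blocks until a command is queued,
        // so the thread neither spins nor writes null objects.
        private final BlockingQueue<String> sendQueue = new LinkedBlockingQueue<String>();

        private void networking() {
            try {
                while (true) {
                    String cmd = sendQueue.take(); // waits here; no CPU burned while idle
                    out.writeObject(cmd);
                    out.flush();                   // push the command out immediately
                    out.reset();                   // stop ObjectOutputStream from caching references
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // exit when the thread is interrupted
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

    The UI thread then calls sendQueue.put(command) (or offer()) and never touches the socket directly.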

    Read the article

  • Specifying a Single Request To Use Credentials with HttpClient

    - by jiduvah
    I am using OAuth2 on my android project. The idea is to use a singleton HttpClient used with a ThreadSafeClientConnManager. For a normal request to the server we construct an Authorization header and send that. The header is constructed from values received from the server. This works fine. However every 15 minutes we must get new values from the server to construct the header. To Received these values I must set the credentials like so. client.getCredentialsProvider().setCredentials( new AuthScope(AuthScope.ANY_HOST, AuthScope.ANY_PORT), new UsernamePasswordCredentials(creds.clientId, creds.clientSecret)); In order for this to work I must set up and new DefaultHttpClient. If I use the original singleton httpclient I receive some errors. My question is.. is it possible to set the credentials to be used only on this one request? I noticed that there is an AuthScope. The host and port would not be suitable for this but maybe the realm would? I can't find anything that tells me what a realm is or how to use it. 06-05 10:12:55.969: W/System.err(23843): org.apache.http.NoHttpResponseException: The target server failed to respond 06-05 10:12:55.969: W/System.err(23843): at org.apache.http.impl.conn.DefaultResponseParser.parseHead(DefaultResponseParser.java:85) 06-05 10:12:55.969: W/System.err(23843): at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:174) 06-05 10:12:55.969: W/System.err(23843): at org.apache.http.impl.AbstractHttpClientConnection.receiveResponseHeader(AbstractHttpClientConnection.java:179) 06-05 10:12:55.969: W/System.err(23843): at org.apache.http.impl.conn.DefaultClientConnection.receiveResponseHeader(DefaultClientConnection.java:235) 06-05 10:12:55.969: W/System.err(23843): at org.apache.http.impl.conn.AbstractClientConnAdapter.receiveResponseHeader(AbstractClientConnAdapter.java:259) 06-05 10:12:55.975: W/System.err(23843): at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:279) 06-05 10:12:55.975: W/System.err(23843): at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:121) 06-05 10:12:55.975: W/System.err(23843): at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:504) 06-05 10:12:55.975: W/System.err(23843): at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:555) 06-05 10:12:55.975: W/System.err(23843): at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:487) 06-05 10:12:55.975: W/System.err(23843): at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:465) So After more testing I have found where the problem lies. I want to configure a pooled connection manager like so SchemeRegistry schemeRegistry = new SchemeRegistry(); schemeRegistry.register( new Scheme("http", PlainSocketFactory.getSocketFactory(), 80)); schemeRegistry.register( new Scheme("https", PlainSocketFactory.getSocketFactory(), 443)); ClientConnectionManager conManager = new ThreadSafeClientConnManager(new BasicHttpParams(), schemeRegistry); DefaultHttpClient httpClient = new DefaultHttpClient(); But when configure like this, I get the error above. If I use the normal default httpclient like so DefaultHttpClient httpClient = new DefaultHttpClient(); Then it works fine. Any ideas?
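
    Credentials can be scoped to a single request by executing it with a request-local HttpContext that carries its own CredentialsProvider, so the shared singleton client is never mutated. A hedged sketch (the token URL is a placeholder; creds comes from the question's code):

        // Per-request credentials: attached to this context only, not to the singleton client.
        CredentialsProvider credsProvider = new BasicCredentialsProvider();
        credsProvider.setCredentials(
                new AuthScope(AuthScope.ANY_HOST, AuthScope.ANY_PORT),
                new UsernamePasswordCredentials(creds.clientId, creds.clientSecret));

        HttpContext localContext = new BasicHttpContext();
        localContext.setAttribute(ClientContext.CREDS_PROVIDER, credsProvider);

        HttpGet tokenRequest = new HttpGet("https://example.com/oauth/token");  // placeholder URL
        HttpResponse tokenResponse = client.execute(tokenRequest, localContext); // credentials apply here only

    Requests executed without that context keep using whatever the client itself is configured with.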

    Read the article

  • Joda Time cannot subtract one hour

    - by Leoa
    In my android program, I have a spinner that allows the user to select different times. Each selection is processed with Joda time to subtract the minutes. It works fine for minutes 0 to 59 and 61 and greater. However, when 60 minutes is subtracted, the time is not updated, and the original time is shown. How do I get Joda time to subtract 60 minutes? Spinner: public class MyOnItemSelectedListener implements OnItemSelectedListener { public void onItemSelected(AdapterView<?> parent, View view, int pos, long id1) { String mins = parent.getItemAtPosition(pos).toString(); int intmins=0; // process user's selection of alert time if(mins.equals("5 minutes")){intmins = 5;} if(mins.equals("10 minutes")){intmins = 10;} if(mins.equals("20 minutes")){intmins = 20;} if(mins.equals("30 minutes")){intmins = 30;} if(mins.equals("40 minutes")){intmins = 40;} if(mins.equals("50 minutes")){intmins = 50;} if(mins.equals("60 minutes")){intmins = 60;} if(mins.equals("120 minutes")){intmins = 120;} String stringMinutes=""+intmins; setAlarm(intmins, stringMinutes); } else { } public void onNothingSelected(AdapterView parent) { mLocationDisplay.setText(" " + location); } } public void setAlarm(int intmins, String mins) { // based alarm time on start time of event. TODO get info from database. String currentDate; SimpleDateFormat myFormat = new SimpleDateFormat("MM/dd/yyyy HH:mm:ss"); Date date1 = null; DateTime dt; currentDate = eventdate + " " + startTimeMilitary;// startTimeMilitary; try { date1 = myFormat.parse(currentDate); } catch (ParseException e) { // TODO Auto-generated catch block e.printStackTrace(); } dt = new DateTime(date1); long dateInMillis = dt.getMillis(); String sDateInMillis = Long.toString(dateInMillis); // subtract the selected time from the event's start time String newAlertTime = subtractTime(dt, intmins); newAlertTime = subtractTime(dt, intmins); //......}
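
    The subtractTime() helper isn't shown, but manual millisecond arithmetic is a frequent source of exactly this kind of off-by-one-hour behaviour; Joda-Time can parse and subtract directly. A minimal sketch, reusing the question's pattern and field names and otherwise assumed:

        // Parse the event start with Joda-Time and subtract whole minutes;
        // minusMinutes() rolls over hours and days correctly, including 60 and 120.
        DateTimeFormatter fmt = DateTimeFormat.forPattern("MM/dd/yyyy HH:mm:ss");
        DateTime start = fmt.parseDateTime(eventdate + " " + startTimeMilitary);
        DateTime alert = start.minusMinutes(intmins);
        String newAlertTime = fmt.print(alert);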

    Read the article

  • Blending pixels from Two Bitmaps

    - by MarkPowell
    I'm beating my head against a wall here, and I'm fairly certain I'm doing something stupid, so time to make my stupidity public. I'm trying to take two images, blend them together into a third image using standard blending algorithms (Hardlight, softlight, overlay, multiply, etc). Because Android does not have such blend properties build in, I've gone down the path of taking each pixel and combine them using an algorithm. However, the results are garbage. Any help would be appreciated. Below is the code, which I've tried to strip out all the "junk", but some may have made it through. I'll clean it up if something isn't clear. Bitmap src = BitmapFactory.decodeResource(getResources(), R.drawable.base, options); Bitmap mutableBitmap = src.copy(Bitmap.Config.RGB_565, true); int imageId = getResources().getIdentifier("drawable/" + filter, null, getPackageName()); Bitmap filterBitmap = BitmapFactory.decodeResource(getResources(), imageId, options); float scaleWidth = ((float) mutableBitmap.getWidth()) / filterBitmap.getWidth(); float scaleHeight = ((float) mutableBitmap.getHeight()) / filterBitmap.getHeight(); IntBuffer buffSrc = IntBuffer.allocate(src.getWidth() * src.getHeight()); mutableBitmap.copyPixelsToBuffer(buffSrc); buffSrc.rewind(); IntBuffer buffFilter = IntBuffer.allocate(resizedFilterBitmap.getWidth() * resizedFilterBitmap.getHeight()); resizedFilterBitmap.copyPixelsToBuffer(buffFilter); buffFilter.rewind(); IntBuffer buffOut = IntBuffer.allocate(src.getWidth() * src.getHeight()); buffOut.rewind(); while (buffOut.position() < buffOut.limit()) { int filterInt = buffFilter.get(); int srcInt = buffSrc.get(); int alphaValueFilter = Color.alpha(filterInt); int redValueFilter = Color.red(filterInt); int greenValueFilter = Color.green(filterInt); int blueValueFilter = Color.blue(filterInt); int alphaValueSrc = Color.alpha(srcInt); int redValueSrc = Color.red(srcInt); int greenValueSrc = Color.green(srcInt); int blueValueSrc = Color.blue(srcInt); int alphaValueFinal = convert(alphaValueFilter, alphaValueSrc); int redValueFinal = convert(redValueFilter, redValueSrc); int greenValueFinal = convert(greenValueFilter, greenValueSrc); int blueValueFinal = convert(blueValueFilter, blueValueSrc); int pixel = Color.argb(alphaValueFinal, redValueFinal, greenValueFinal, blueValueFinal); buffOut.put(pixel); } buffOut.rewind(); mutableBitmap.copyPixelsFromBuffer(buffOut); BitmapDrawable drawable = new BitmapDrawable(getResources(), mutableBitmap); imageView.setImageDrawable(drawable); } int convert (int in1, int in2) { //simple multiply for example return in1 * in2 / 255; }
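
    One likely culprit: copyPixelsToBuffer() copies pixels in the bitmap's internal format, so with an RGB_565 copy the ints read back are not ARGB values and Color.red()/green()/blue() return garbage. getPixels()/setPixels() always work in packed ARGB ints regardless of the bitmap's config. A hedged sketch of the per-pixel loop rewritten that way (convert() as defined in the question, and the filter bitmap assumed already scaled to the source size):

        int w = src.getWidth(), h = src.getHeight();
        int[] srcPixels = new int[w * h];
        int[] filterPixels = new int[w * h];
        src.getPixels(srcPixels, 0, w, 0, 0, w, h);
        resizedFilterBitmap.getPixels(filterPixels, 0, w, 0, 0, w, h);

        int[] outPixels = new int[w * h];
        for (int i = 0; i < outPixels.length; i++) {
            int s = srcPixels[i];                    // packed ARGB, independent of bitmap config
            int f = filterPixels[i];
            outPixels[i] = Color.argb(
                    convert(Color.alpha(f), Color.alpha(s)),
                    convert(Color.red(f),   Color.red(s)),
                    convert(Color.green(f), Color.green(s)),
                    convert(Color.blue(f),  Color.blue(s)));
        }

        Bitmap result = Bitmap.createBitmap(outPixels, w, h, Bitmap.Config.ARGB_8888);
        imageView.setImageBitmap(result);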

    Read the article

  • OpenGL, how to set a monochrome texture to a colored shape?

    - by Santiago
    I'm developing on Android with OpenGL ES. I draw some cubes and change their colors with glColor4f. Now I want to give the cubes a more realistic look, so I created a monochrome, 8-bit, 64x64 pixel PNG file and loaded it into a texture. Here is my problem: which is the right way to combine the color and the texture to get colorized and textured cubes on the screen? I'm not an expert on OpenGL; I tried this: On create: public void asignBitmap(GL10 gl, Bitmap bitmap) { int[] textures = new int[1]; gl.glGenTextures(1, textures, 0); mTexture = textures[0]; gl.glBindTexture(GL10.GL_TEXTURE_2D, mTexture); gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST); gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR); gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE); gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE); gl.glTexEnvf(GL10.GL_TEXTURE_ENV, GL10.GL_TEXTURE_ENV_MODE, GL10.GL_REPLACE); GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_ALPHA, bitmap, 0); ByteBuffer tbb = ByteBuffer.allocateDirect(texCoords.length * 4); tbb.order(ByteOrder.nativeOrder()); mTexBuffer = tbb.asFloatBuffer(); for (int i = 0; i < 48; i++) mTexBuffer.put(texCoords[i]); mTexBuffer.position(0); } And on draw: public void draw(GL10 gl, int alphawires) { gl.glColor4f(1.0f, 0.0f, 0.0f, 0.5f); //RED gl.glBindTexture(GL10.GL_TEXTURE_2D, mTexture); gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA); gl.glEnable(GL10.GL_TEXTURE_2D); gl.glEnable(GL10.GL_BLEND); gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY); gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, mTexBuffer); //Set the face rotation gl.glFrontFace(GL10.GL_CW); //Point to our buffers gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer); //Enable the vertex and color state gl.glEnableClientState(GL10.GL_VERTEX_ARRAY); //Draw the vertices as triangles, based on the Index Buffer information gl.glDrawElements(GL10.GL_TRIANGLES, 36, GL10.GL_UNSIGNED_BYTE, indexBuffer); //Disable the client state before leaving gl.glDisableClientState(GL10.GL_VERTEX_ARRAY); gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY); gl.glDisable(GL10.GL_BLEND); gl.glDisable(GL10.GL_TEXTURE_2D); } I'm not even sure whether I need the blend option, because I don't need transparency, but it's a plus :)
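
    For glColor4f() to tint the texture, the texture environment generally needs GL_MODULATE (multiply texel by current colour) rather than GL_REPLACE, and the greyscale image is usually uploaded as a luminance texture so the grey value scales the colour channels instead of only the alpha. A one-line sketch of the change in asignBitmap(), offered as an assumption to try rather than a guaranteed fix:

        // GL_MODULATE: fragment colour (red from glColor4f) * texel value (grey shading).
        gl.glTexEnvf(GL10.GL_TEXTURE_ENV, GL10.GL_TEXTURE_ENV_MODE, GL10.GL_MODULATE);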

    Read the article

  • Extending AdapterView

    - by Ander Webbs
    Hi, i'm trying to make (for learning purposes) my own implementation of a simple AdapterView where items comes from an basic Adapter (ImageAdapter from sdk samples). Actual code is like this: public class MyAdapterView extends AdapterView<ImageAdapter> implements AdapterView.OnItemClickListener{ private ImageAdapter mAdapter; public MyAdapterView(Context context, AttributeSet attrs, int defStyle) { super(context, attrs, defStyle); initThings(); } private void initThings(){ setOnItemClickListener(this); } @Override public ImageAdapter getAdapter() { // TODO Auto-generated method stub return mAdapter; } @Override public void setAdapter(ImageAdapter adapter) { // TODO Auto-generated method stub mAdapter=adapter; requestLayout(); } View obtainView(int position) { View child = mAdapter.getView(position, null, this); return child; } @Override protected void onLayout(boolean changed, int l, int t, int r, int b) { super.onLayout(changed, l, t, r, b); for(int i=0;i<mAdapter.getCount();i++){ View child = obtainView(i); child.layout(10, 70*i, 70, 70); addViewInLayout(child, i, null, true); } this.invalidate(); } @Override public void onItemClick(AdapterView<?> parent, View v, int position, long id) { Log.d("MYEXAMPLES","Clicked an item!"); } } This isn't a coding masterpiece, it just displays a pseudo-listview with pictures. I know i could've used ListView, GridView, Spinner, etc. but i'm relative new to android and i'm trying to figure out some things on it. Well, the question here is: Why is my onItemClick not firing? Using the same ImageAdapter with a GridView, everything works ok, but when i use with above class, i get nothing. Inside AdapterView.java there is code for those click, longclick, etc events... so why can't i just fire them? Maybe i'm misunderstanding basic things on how AdapterView works? Should I extend other base classes instead? And why? Hoping to find more experienced guidance on here, thanks in advance.
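
    AdapterView only defines the listener plumbing; each concrete subclass is responsible for noticing the touch on a child and calling performItemClick(), which is what ListView and GridView do inside their own touch handling. A hedged sketch of one way to wire that up in this custom class:

        // Forward each child's click to the AdapterView machinery, which then
        // invokes the OnItemClickListener registered in initThings().
        View obtainView(final int position) {
            final View child = mAdapter.getView(position, null, this);
            child.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View v) {
                    performItemClick(child, position, mAdapter.getItemId(position));
                }
            });
            return child;
        }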

    Read the article

  • How to make html image clickable inside a TextView

    - by Gonan
    I have the following text in a string in the resources file: <a href="mailto:[email protected]">&lt;img src="mail_big" /&gt;</a> It shows the image fine (I implemented ImageGetter) but it is not clickable. I have tried adding the Linkify thingy but I don't think it's meant for this case, and so it doesn't work. The setMovementMethod doesn't work either. I have tried different combinations of the above: <a href="mailto:[email protected]">&lt;img src="mail_big" /&gt;hello</a> Here, even the "hello" part is not clickable (neither blue nor underlined). <a href="mailto:[email protected]"><img src="mail_big" /></a> This doesn't even show the image. &lt;a href="mailto:[email protected]"&gt;&lt;img src="mail_big" /&gt;&lt;/a&gt; If I just write the email, without the <a> tag it works perfectly, but I would like to use the image of an envelope that the user can click on. It's not possible to use an imagebutton because this text is in the middle of a string and so I can't split it. Any ideas? Thanks! EDIT: I found a solution or rather found how to do it correctly. All I had to do was adding the setMovementMethod call before the call to setText in the TextView and ALSO, and COMPLETELY NECESSARY, remove the attribute "android:autoLink="all" from the layout. Apparently, parsing mails and urls in a string is mutually exclusive to interpreting the link tags in a string. So one or the other but not both. Finally my layout is just a TextView with nothing special, just width and height. The activity is like this: TextView tv = (TextView)findViewById(R.id.about_text); tv.setMovementMethod(LinkMovementMethod.getInstance()); tv.setText(Html.fromHtml(getString(R.string.about_content), new ImageGetter(), null)); And the string is like this: <string name="about_content"><a href="mailto:[email protected]"><img src="mail" /></a></string>

    Read the article

  • Displaying an Image in an activity using URI

    - by evkwan
    Hi, I'm writing an application that uses Intent(MediaStore.ACTION_IMAGE_CAPTURE) to capture and image. On the process of capturing the image, I noted the output of the image's URI. Right after finishing the camera activity, I wish to display the image using this specific URI. The method I used to capture images is: private void saveFullImage() { Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE); File file = new File(Environment.getExternalStorageDirectory(), "test.jpg"); outputFileUri = Uri.fromFile(file); intent.putExtra(MediaStore.EXTRA_OUTPUT, outputFileUri); startActivityForResult(intent, TAKE_PICTURE); } Which is a method taken from Reto Meier's book Professional Android 2 Application Development. The method works fine, and I assume that the URI of the picture I just took is stored in the outputFileUri variable. Then at this point of the code is where I want to display the picture: @Override protected void onActivityResult(int requestCode, int resultCode, Intent data) { if (requestCode == TAKE_PICTURE) { //I want to display the picture I just took here //using the URI } } I'm not sure how to do it. I tried creating a new layout object and a new ImageView object using the method setImageURI(outputFileUri). My main layout (xml) did not have a ImageView object. But even when I set the contentView to the new layout with the ImageView attached to it, it doesn't display anything. I tried creating a Bitmap object from the URI and set it to the ImageView, but I get an unexpected error and forced exit. I have seen examples from here here which creates a Bitmap from URI, but it's not displaying it? My question is just how to display an image in the middle of a running activity? Do I need to get the File Path (like this) in order to display it? If I make a Bitmap out of the URI, how do I display the Bitmap? I'm just probably missing something simple...so any help would be a greatly appreciated! Also additional question for thought: If I were to take multiple pictures, would you recommend me to use the SimpleCursorAdapter instead? Thanks!
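
    Because the capture intent was given EXTRA_OUTPUT, the Intent delivered to onActivityResult() may be null, and the saved outputFileUri is the thing to display. One way to show it, downsampled so a full-resolution camera JPEG doesn't exhaust the heap (a sketch; the ImageView id is hypothetical and must exist in the layout passed to setContentView):

        @Override
        protected void onActivityResult(int requestCode, int resultCode, Intent data) {
            super.onActivityResult(requestCode, resultCode, data);
            if (requestCode == TAKE_PICTURE && resultCode == RESULT_OK) {
                BitmapFactory.Options options = new BitmapFactory.Options();
                options.inSampleSize = 4; // rough value; tune to the size of the target view
                Bitmap bitmap = BitmapFactory.decodeFile(outputFileUri.getPath(), options);
                ImageView imageView = (ImageView) findViewById(R.id.photo_view); // hypothetical id
                imageView.setImageBitmap(bitmap);
            }
        }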

    Read the article

  • How to populate GridView if Internet not available but images already cached to SD Card

    - by Sophie
    Hello I am writing an Application in which i am parsing JSON Images and then caching into SD Card. What I want to do ? I want to load images into GridView from JSON (by caching images into SD Card), and wanna populate GridView (no matter Internet available or not) once images already downloaded into SD Card. What I am getting ? I am able to cache images into SD Card, also to populate GridView, but not able to show images into GridView (if Internet not available) but images cached into SD Card @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) { myGridView = inflater.inflate(R.layout.fragment_church_grid, container, false); if (isNetworkAvailable()) { new DownloadJSON().execute(); } else { Toast.makeText(getActivity(), "Internet not available !", Toast.LENGTH_LONG).show(); } return myGridView ; } private boolean isNetworkAvailable() { ConnectivityManager cm = (ConnectivityManager) getActivity().getSystemService(Context.CONNECTIVITY_SERVICE); NetworkInfo info = cm.getActiveNetworkInfo(); return (info != null); } // DownloadJSON AsyncTask private class DownloadJSON extends AsyncTask<Void, Void, Void> { @Override protected void onPreExecute() { super.onPreExecute(); // Create a progressdialog mProgressDialog = new ProgressDialog(getActivity()); // Set progressdialog title mProgressDialog.setTitle("Church Application"); // Set progressdialog message mProgressDialog.setMessage("Loading Images..."); mProgressDialog.setIndeterminate(false); // Show progressdialog mProgressDialog.show(); } @Override protected Void doInBackground(Void... params) { // Create an array arraylist = new ArrayList<HashMap<String, String>>(); // Retrieve JSON Objects from the given URL address jsonobject = JSONfunctions .getJSONfromURL("http://snapoodle.com/APIS/android/feed.php"); try { // Locate the array name in JSON jsonarray = jsonobject.getJSONArray("print"); for (int i = 0; i < jsonarray.length(); i++) { HashMap<String, String> map = new HashMap<String, String>(); jsonobject = jsonarray.getJSONObject(i); // Retrive JSON Objects map.put("saved_location", jsonobject.getString("saved_location")); // Set the JSON Objects into the array arraylist.add(map); } } catch (JSONException e) { Log.e("Error", e.getMessage()); e.printStackTrace(); } return null; } @Override protected void onPostExecute(Void args) { // Locate the listview in listview_main.xml listview = (GridView) shriRamView.findViewById(R.id.listview); // Pass the results into ListViewAdapter.java adapter = new ChurchImagesAdapter(getActivity(), arraylist); // Set the adapter to the ListView listview.setAdapter(adapter); // Close the progressdialog mProgressDialog.dismiss(); } } }
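
    Once images have been written to the SD card, the offline branch can skip the JSON step and build the adapter's list from the cached files instead of only showing a toast. A hedged sketch (the cache directory name is an assumption, and ChurchImagesAdapter is assumed to accept a local file path under the same "saved_location" key):

        // Offline path: list whatever was cached earlier and hand it to the same adapter.
        private void loadFromSdCache() {
            File cacheDir = new File(Environment.getExternalStorageDirectory(), "church_images"); // assumed dir
            arraylist = new ArrayList<HashMap<String, String>>();
            File[] cached = cacheDir.listFiles();
            if (cached != null) {
                for (File f : cached) {
                    HashMap<String, String> map = new HashMap<String, String>();
                    map.put("saved_location", f.getAbsolutePath());
                    arraylist.add(map);
                }
            }
            listview = (GridView) myGridView.findViewById(R.id.listview);
            adapter = new ChurchImagesAdapter(getActivity(), arraylist);
            listview.setAdapter(adapter);
        }

    Called from the else branch of onCreateView(), this populates the grid even with no connection.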

    Read the article

  • Drag N Drop utilizing simple cursor

    - by Cameron
    I'm using CommonsGuy's drag n drop example and I am basically trying to integrate it with the Android notepad example. Drag N Drop Out of the 2 different drag n drop examples i've seen they have all used a static string array where as i'm getting a list from a database and using simple cursor adapter. So my question is how to get the results from simple cursor adapter into a string array, but still have it return the row id when the list item is clicked so I can pass it to the new activity that edits the note. Here is my code: Cursor notesCursor = mDbHelper.fetchAllNotes(); startManagingCursor(notesCursor); // Create an array to specify the fields we want to display in the list (only NAME) String[] from = new String[]{WeightsDatabase.KEY_NAME}; // and an array of the fields we want to bind those fields to (in this case just text1) int[] to = new int[]{R.id.weightrows}; // Now create a simple cursor adapter and set it to display SimpleCursorAdapter notes = new SimpleCursorAdapter(this, R.layout.weights_row, notesCursor, from, to); setListAdapter(notes); And here is the code i'm trying to work that into. public class TouchListViewDemo extends ListActivity { private static String[] items={"lorem", "ipsum", "dolor", "sit", "amet", "consectetuer", "adipiscing", "elit", "morbi", "vel", "ligula", "vitae", "arcu", "aliquet", "mollis", "etiam", "vel", "erat", "placerat", "ante", "porttitor", "sodales", "pellentesque", "augue", "purus"}; private IconicAdapter adapter=null; private ArrayList<String> array=new ArrayList<String>(Arrays.asList(items)); @Override public void onCreate(Bundle icicle) { super.onCreate(icicle); setContentView(R.layout.main); adapter=new IconicAdapter(); setListAdapter(adapter); TouchListView tlv=(TouchListView)getListView(); tlv.setDropListener(onDrop); tlv.setRemoveListener(onRemove); } private TouchListView.DropListener onDrop=new TouchListView.DropListener() { @Override public void drop(int from, int to) { String item=adapter.getItem(from); adapter.remove(item); adapter.insert(item, to); } }; private TouchListView.RemoveListener onRemove=new TouchListView.RemoveListener() { @Override public void remove(int which) { adapter.remove(adapter.getItem(which)); } }; class IconicAdapter extends ArrayAdapter<String> { IconicAdapter() { super(TouchListViewDemo.this, R.layout.row2, array); } public View getView(int position, View convertView, ViewGroup parent) { View row=convertView; if (row==null) { LayoutInflater inflater=getLayoutInflater(); row=inflater.inflate(R.layout.row2, parent, false); } TextView label=(TextView)row.findViewById(R.id.label); label.setText(array.get(position)); return(row); } } } I know i'm asking for a lot, but a point in the right direction would help quite a bit! Thanks
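
    The cursor can be walked once up front to build the display strings and a parallel list of row ids, so the drag-and-drop list keeps using an ArrayAdapter<String> while the original _id stays available for the edit activity. A hedged sketch (column names taken from the question, the rest assumed):

        // Walk the cursor once, keeping titles and row ids in step with each other.
        ArrayList<String> titles = new ArrayList<String>();
        ArrayList<Long> rowIds = new ArrayList<Long>();

        Cursor notesCursor = mDbHelper.fetchAllNotes();
        startManagingCursor(notesCursor);
        int nameCol = notesCursor.getColumnIndexOrThrow(WeightsDatabase.KEY_NAME);
        int idCol = notesCursor.getColumnIndexOrThrow("_id");
        while (notesCursor.moveToNext()) {
            titles.add(notesCursor.getString(nameCol));
            rowIds.add(notesCursor.getLong(idCol));
        }

        // In onListItemClick(), rowIds.get(position) is the database id to pass along,
        // e.g. intent.putExtra("note_id", rowIds.get(position)); keep both lists in the
        // same order when the drop/remove listeners reorder or delete items.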

    Read the article

  • Ping remote server and wait to get data

    - by infinity
    Hi I'm building my first application for android and I've reached a point where I can't find a solution even have no idea what to search for in Google. So the problem: I am pinging a remote server with GET request through the application passing some parameters like file_id. Then the server gives back confirmation if the file exists or error otherwise, both in plain text. The error string is $$$ERROR$$$. Actually the confirmation is JSON string that holds the path to the file. If the file doesn't exists on the server it generated the error message and start downloading the file and processing it which normally takes 10-30 seconds. What would be the best way to check if the file is ready for download? I have DownloadFile class that extends AsyncTask but before I reach the point to download the file I need the URL which is dependant on the previous request which is in the main class in the UI thread. Here is some code: public class MainActivity extends Activity { private String getInfo() { // Create a new HttpClient and Post Header HttpClient httpClient = new DefaultHttpClient(); HttpGet httpPost = new HttpGet(infoUrl); StringBuilder sb = null; String data; JSONObject jObject = null; try { HttpResponse response = httpClient.execute(httpPost); // This might be equal "$$$ERROR$$$" if no file exists sb = inputStreamToString(response.getEntity().getContent()); } catch(ClientProtocolException e) { // TODO Auto-generated catch block Log.v("Error: pushItem ClientProtocolException: ", e.toString()); } catch (IOException e) { // TODO Auto-generated catch block Log.v("Error: pushItem IOException: ", e.toString()); } // Clean the data to be complaint JSON format data = sb.toString().replace("info = ", ""); try { jObject = new JSONObject(data); data = jObject.getString("h"); fileTitle = jObject.getString("title"); } catch (JSONException e) { // TODO Auto-generated catch block e.printStackTrace(); } downloadUrl = String.format(downloadUrl, fileId, data); return downloadUrl; } } So my idea was to get the content and if equal to $$$ERROR$$$ go into loop until JSON data is passed but I guess there is better solution. Note: I don't have control over the server output so have to deal with what I have.
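
    Since the whole exchange is network I/O anyway, the waiting can happen inside an AsyncTask's doInBackground() as a bounded retry loop, rather than looping on the UI thread. A minimal sketch, with fetchInfo() standing in for a hypothetical helper that performs the HTTP GET and returns the raw body:

        // Poll the server a limited number of times; the file is normally ready in 10-30 s.
        private String waitForDownloadInfo() throws InterruptedException {
            final int maxAttempts = 15;
            for (int attempt = 0; attempt < maxAttempts; attempt++) {
                String body = fetchInfo();          // hypothetical helper doing the GET request
                if (!body.contains("$$$ERROR$$$")) {
                    return body;                    // JSON with the file path is now available
                }
                Thread.sleep(3000);                 // give the server time before asking again
            }
            return null;                            // caller treats null as "file never became ready"
        }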

    Read the article

  • Unlock device, display a text, then lock again

    - by Waza_Be
    For the need of my application, I need to display a message on the screen even if the lockscreen is enabled, then wait 3 seconds, than I have to lock again the phone as I don't want it to make unwanted phone calls in your pockets. First part is easy: if (PreferenceManager.getDefaultSharedPreferences( getBaseContext()).getBoolean("wake", false)) { KeyguardManager kgm = (KeyguardManager) getSystemService(Context.KEYGUARD_SERVICE); boolean isKeyguardUp = kgm.inKeyguardRestrictedInputMode(); WakeLocker.acquire(ProtoBenService.this); Intent myIntent = new Intent(ProtoBenService.this,LockActivity.class); myIntent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK); if (isKeyguardUp) { ProtoBenService.this.startActivity(myIntent); } else Toast.makeText(ProtoBenService.this.getBaseContext(), intention, Toast.LENGTH_LONG).show(); WakeLocker.release(); } With this class: public abstract class WakeLocker { private static PowerManager.WakeLock wakeLock; public static void acquire(Context ctx) { if (wakeLock != null) wakeLock.release(); PowerManager pm = (PowerManager) ctx.getSystemService(Context.POWER_SERVICE); wakeLock = pm.newWakeLock(PowerManager.FULL_WAKE_LOCK | PowerManager.ACQUIRE_CAUSES_WAKEUP | PowerManager.ON_AFTER_RELEASE, "CobeIm"); wakeLock.acquire(); } public static void release() { if (wakeLock != null) wakeLock.release(); wakeLock = null; } } And the Activity: public class LockActivity extends Activity { @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); Window window = getWindow(); window.addFlags(WindowManager.LayoutParams.FLAG_DISMISS_KEYGUARD); window.addFlags(WindowManager.LayoutParams.FLAG_TURN_SCREEN_ON); window.addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON); TextView tv = new TextView(this); tv.setText("This is working!"); tv.setTextSize(45); setContentView(tv); Runnable mRunnable; Handler mHandler = new Handler(); mRunnable = new Runnable() { @Override public void run() { LockActivity.this.finish(); } }; mHandler.postDelayed(mRunnable, 3 * 1000); } } So, this is nice, the phone can display my text! The only problem comes when I want to lock again the phone, it seems that locking the phone is protected by the system... Programmatically turning off the screen and locking the phone how to lock the android programatically I think that my users won't understand the Device Admin and won't be able to activate it. Is there any workaround to lock the screen without the Device Admin stuff?
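
    There is no public API for re-locking the screen without device administration; the supported route is DevicePolicyManager.lockNow(), which needs the user to approve the admin component once. A hedged sketch of that route, for completeness (MyAdminReceiver is a hypothetical DeviceAdminReceiver declared in the manifest):

        DevicePolicyManager dpm =
                (DevicePolicyManager) getSystemService(Context.DEVICE_POLICY_SERVICE);
        ComponentName admin = new ComponentName(this, MyAdminReceiver.class);

        if (dpm.isAdminActive(admin)) {
            dpm.lockNow();                          // turns the screen off and locks immediately
        } else {
            // One-time approval dialog; after that, lockNow() works silently.
            Intent intent = new Intent(DevicePolicyManager.ACTION_ADD_DEVICE_ADMIN);
            intent.putExtra(DevicePolicyManager.EXTRA_DEVICE_ADMIN, admin);
            startActivity(intent);
        }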

    Read the article

  • OutOfMemoryError loading Bitmap via DefaultHttpClient

    - by Goddchen
    I have a simple problem: although I'm using sampleSize properly, my code doesn't even reach the BitmapFactory code, since DefaultHttpClient is already throwing the exception. Here is my code: DefaultHttpClient client = new DefaultHttpClient(); HttpGet request = new HttpGet(mSongInfo.imageLarge); HttpResponse response = client.execute(request); int sampleSize = 1; while (response.getEntity().getContentLength() / sampleSize / sampleSize > 100 * 1024) { sampleSize *= 2; } BitmapFactory.Options options = new BitmapFactory.Options(); options.inSampleSize = sampleSize; final Bitmap bitmap = BitmapFactory.decodeStream(response .getEntity().getContent(), null, options); And here is the exception: 0 java.lang.OutOfMemoryError: (Heap Size=11463KB, Allocated=7623KB, Bitmap Size=9382KB) 1 at org.apache.http.util.ByteArrayBuffer.<init>(ByteArrayBuffer.java:53) 2 at org.apache.http.impl.io.AbstractSessionInputBuffer.init(AbstractSessionInputBuffer.java:82) 3 at org.apache.http.impl.io.SocketInputBuffer.<init>(SocketInputBuffer.java:98) 4 at org.apache.http.impl.SocketHttpClientConnection.createSessionInputBuffer(SocketHttpClientConnection.java:83) 5 at org.apache.http.impl.conn.DefaultClientConnection.createSessionInputBuffer(DefaultClientConnection.java:170) 6 at org.apache.http.impl.SocketHttpClientConnection.bind(SocketHttpClientConnection.java:106) 7 at org.apache.http.impl.conn.DefaultClientConnection.openCompleted(DefaultClientConnection.java:129) 8 at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:173) 9 at org.apache.http.impl.conn.AbstractPoolEntry.open(AbstractPoolEntry.java:164) 10 at org.apache.http.impl.conn.AbstractPooledConnAdapter.open(AbstractPooledConnAdapter.java:119) 11 at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:359) 12 at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:555) 13 at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:487) 14 at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:465) 15 at de.goddchen.android.easysongfinder.fragments.SongFragment$1.run(SongFragment.java:79) 16 at java.lang.Thread.run(Thread.java:1027) As you can see, the code doesn't even reach the part where I check the size (Content-Length) of the image and calculate a proper sample size. I wasn't aware that simply calling DefaultHttpClient.execute(...) would already load the complete content into memory. Am I doing something wrong? What is the right way to first retrieve the content length and then start reading the content from an InputStream? EDIT: To avoid common answers that show how to load images from a URL: I already know how to do that, and I have posted the code above, so why do you keep referencing tutorials on that? I was explicitly clear about the problem: why is HttpClient.execute(...) already fetching the whole content and storing it in memory instead of providing a proper InputStream to me? Please don't post any beginner tutorials on how to load a Bitmap from a URL...
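
    One way to read the Content-Length from the headers first and then stream the body straight into BitmapFactory, without the client buffering anything up front, is plain HttpURLConnection. A minimal sketch reusing the question's sample-size heuristic:

        // Headers are read first; the body is only consumed by decodeStream().
        HttpURLConnection conn = (HttpURLConnection) new URL(mSongInfo.imageLarge).openConnection();
        try {
            int contentLength = conn.getContentLength();
            int sampleSize = 1;
            while (contentLength / sampleSize / sampleSize > 100 * 1024) {
                sampleSize *= 2;
            }
            BitmapFactory.Options options = new BitmapFactory.Options();
            options.inSampleSize = sampleSize;
            Bitmap bitmap = BitmapFactory.decodeStream(conn.getInputStream(), null, options);
        } finally {
            conn.disconnect();
        }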

    Read the article

  • Adding Information in SQLite

    - by Cam
    Hi All, I am having trouble with my Android App when adding information into SQLite. I am relatively new to Java/SQLite and though I have followed a lot of tutorials on SQLite and have been able to get the example code to run I am unable to get tables to be created and data to import when running my own app. I have included my code in two Java files Questions (Main Program) and QuestionData (helper class represents the database). Questions.java: public class Questions extends Activity { private QuestionData questions; @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.quiztest); questions = new QuestionData(this); try { Cursor cursor = getQuestions(); showQuestions(cursor); } finally { questions.close(); } } private Cursor getQuestions() { //Select Query String loadQuestions = "SELECT * FROM questionlist"; SQLiteDatabase db = questions.getReadableDatabase(); Cursor cursor = db.rawQuery(loadQuestions, null); startManagingCursor(cursor); return cursor; } private void showQuestions(Cursor cursor) { // Collect String Values from Query and Display them this part of the code is wokring fine when there is data present. QuestionData.java public class QuestionData extends SQLiteOpenHelper { private static final String DATABASE_NAME = "TriviaQuiz.db" ; private static final int DATABASE_VERSION = 2; public QuestionData(Context ctx) { super(ctx, DATABASE_NAME, null, DATABASE_VERSION); } @Override public void onCreate(SQLiteDatabase db) { db.execSQL("CREATE TABLE questionlist (_id INTEGER PRIMARY KEY AUTOINCREMENT, QID TEXT, QQuestion TEXT, QAnswer TEXT, QOption1 TEXT, QOption2 TEXT, QOption3 TEXT, QCategoryTagLvl1 TEXT, QCategoryTagLvl2 TEXT, QOptionalTag1 TEXT, QOptionalTag2 TEXT, QOptionalTag3 TEXT, QOptionalTag4 TEXT, QOptionalTag5 TEXT, QTimePeriod TEXT, QDifficultyRating TEXT, QGenderBias TEXT, QAgeBias TEXT, QRegion TEXT, QWikiLink TEXT, QValidationLink1 TEXT, QValidationLink2 TEXT, QHint TEXT, QLastValidation TEXT, QNotes TEXT, QMultimediaType TEXT, QMultimediaLink TEXT, QLastAsked TEXT);"); db.execSQL("INSERT INTO questionlist (_id, QID, QQuestion, QAnswer, QOption1, QOption2, QOption3, QCategoryTagLvl1, QCategoryTagLvl2, QOptionalTag1, QOptionalTag2, QOptionalTag3, QOptionalTag4, QOptionalTag5, QTimePeriod, QDifficultyRating, QGenderBias, QAgeBias, QRegion, QWikiLink, QValidationLink1, QValidationLink2, QHint, QLastValidation, QNotes, QMultimediaType, QMultimediaLink, QLastAsked)"+ "VALUES (null,'Q00001','Example','Ans1','Q1','Q2','Q3','Q4','','','','','','','','','','','','','','','','','','','','')"); } @Override public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) { db.execSQL("DROP TABLE IF EXISTS " + TABLE_NAME); onCreate(db); } } Any suggestions at all would be great. I have tried debugging which suggests that the database does not exist. Thanks in advance for your assistance.
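
    One thing worth checking while debugging this: onCreate() only runs when the database file does not exist yet, and onUpgrade() only runs when DATABASE_VERSION is raised, so a table added after the first run never appears until one of those happens (or the app's data is cleared). Also, onUpgrade() above refers to TABLE_NAME, which is never defined. A small hedged sketch of both fixes:

        private static final String DATABASE_NAME = "TriviaQuiz.db";
        private static final int DATABASE_VERSION = 3;            // bumped so onUpgrade() fires on existing installs
        private static final String TABLE_NAME = "questionlist";  // constant used by onUpgrade()

        @Override
        public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
            db.execSQL("DROP TABLE IF EXISTS " + TABLE_NAME);
            onCreate(db);
        }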

    Read the article

  • File sorting, Comparison method violates its general contract

    - by user2677383
    My file sorting comparator sometimes invoke java.lang.IllegalArgumentException in android app. I cannot figure out the reason. could you explain that? here is my code block: Comparator cc = new Comparator<File>() { @Override public int compare(File f1, File f2) { if (f1.isDirectory() && !f2.isDirectory()) { return 1; } if (f2.isDirectory() && !f1.isDirectory()) { return -1; } if ((f1.isFile() && f2.isFile()) || (f1.isDirectory() && f2.isDirectory())) { if (sort == 1) { return Util.compareFileName(f2.getName(), f1.getName()); } else if (sort == 2) { return (int) (f1.lastModified() - f2.lastModified()); } else if (sort == 3) { return (int) (f2.lastModified() - f1.lastModified()); } else if (sort == 4) { return (int) (f1.length() - f2.length()); } else if (sort == 5) { return (int) (f2.length() - f1.length()); } else return Util.compareFileName(f1.getName(), f2.getName()); } return f1.compareTo(f2); } }; Util.compareFileName is as followings: public static int compareFileName(String s1, String s2) { int thisMarker = 0; int thatMarker = 0; int s1Length = s1.length(); int s2Length = s2.length(); while (thisMarker < s1Length && thatMarker < s2Length) { String thisChunk = getChunk(s1, s1Length, thisMarker); thisMarker += thisChunk.length(); String thatChunk = getChunk(s2, s2Length, thatMarker); thatMarker += thatChunk.length(); // If both chunks contain numeric characters, sort them numerically int result = 0; if (isDigit(thisChunk.charAt(0)) && isDigit(thatChunk.charAt(0))) { // Simple chunk comparison by length. int thisChunkLength = thisChunk.length(); result = thisChunkLength - thatChunk.length(); // If equal, the first different number counts if (result == 0) { for (int i = 0; i < thisChunkLength; i++) { result = thisChunk.charAt(i) - thatChunk.charAt(i); if (result != 0) { return result; } } } } else { result = thisChunk.compareTo(thatChunk); } if (result != 0) return result; } return s1Length - s2Length; }
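
    A likely cause of the exception is the (int) cast: lastModified() values are milliseconds since the epoch, so two files modified more than roughly 25 days apart produce a difference that overflows int and can flip sign. That makes the comparator non-transitive, which is exactly what TimSort reports as "Comparison method violates its general contract". Comparing the longs without subtraction avoids it; a sketch for the date cases (the size cases are analogous):

        // Overflow-safe ordering of two long values.
        private static int compareLongs(long a, long b) {
            return (a < b) ? -1 : ((a == b) ? 0 : 1);
        }

        // ... inside compare(File f1, File f2):
        if (sort == 2) {
            return compareLongs(f1.lastModified(), f2.lastModified());
        } else if (sort == 3) {
            return compareLongs(f2.lastModified(), f1.lastModified());
        }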

    Read the article

  • AudioTrack lag: obtainBuffer timed out

    - by BTR
    I'm playing WAVs on my Android phone by loading the file and feeding the bytes into AudioTrack.write() via the FileInputStream BufferedInputStream DataInputStream method. The audio plays fine and when it is, I can easily adjust sample rate, volume, etc on the fly with nice performance. However, it's taking about two full seconds for a track to start playing. I know AudioTrack has an inescapable delay, but this is ridiculous. Every time I play a track, I get this: 03-13 14:55:57.100: WARN/AudioTrack(3454): obtainBuffer timed out (is the CPU pegged?) 0x2e9348 user=00000960, server=00000000 03-13 14:55:57.340: WARN/AudioFlinger(72): write blocked for 233 msecs, 9 delayed writes, thread 0xba28 I've noticed that the delayed write count increases by one every time I play a track -- even across multiple sessions -- from the time the phone has been turned on. The block time is always 230 - 240ms, which makes sense considering a minimum buffer size of 9600 on this device (9600 / 44100). I've seen this message in countless searches on the Internet, but it usually seems to be related to not playing audio at all or skipping audio. In my case, it's just a delayed start. I'm running all my code in a high priority thread. Here's a truncated-yet-functional version of what I'm doing. This is the thread callback in my playback class. Again, this works (only playing 16-bit, 44.1kHz, stereo files right now), it just takes forever to start and has that obtainBuffer/delayed write message every time. public void run() { // Load file FileInputStream mFileInputStream; try { // mFile is instance of custom file class -- this is correct, // so don't sweat this line mFileInputStream = new FileInputStream(mFile.path()); } catch (FileNotFoundException e) {} BufferedInputStream mBufferedInputStream = new BufferedInputStream(mFileInputStream, mBufferLength); DataInputStream mDataInputStream = new DataInputStream(mBufferedInputStream); // Skip header try { if (mDataInputStream.available() > 44) mDataInputStream.skipBytes(44); } catch (IOException e) {} // Initialize device mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, ConfigManager.SAMPLE_RATE, AudioFormat.CHANNEL_CONFIGURATION_STEREO, AudioFormat.ENCODING_PCM_16BIT, ConfigManager.AUDIO_BUFFER_LENGTH, AudioTrack.MODE_STREAM); mAudioTrack.play(); // Initialize buffer byte[] mByteArray = new byte[mBufferLength]; int mBytesToWrite = 0; int mBytesWritten = 0; // Loop to keep thread running while (mRun) { // This flag is turned on when the user presses "play" while (mPlaying) { try { // Check if data is available if (mDataInputStream.available() > 0) { // Read data from file and write to audio device mBytesToWrite = mDataInputStream.read(mByteArray, 0, mBufferLength); mBytesWritten += mAudioTrack.write(mByteArray, 0, mBytesToWrite); } } catch (IOException e) { } } } } If I can get past the artificially long lag, I can easily deal with the inherit latency by starting my write at a later, predictable position (ie, skip past the minimum buffer length when I start playing a file).
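
    One common mitigation for the slow start is to size the track from AudioTrack.getMinBufferSize() and write the first chunk of audio before calling play(), so the hardware is not started against an empty buffer while the file reads catch up. A hedged sketch of just the setup portion, using the question's constants:

        // Size the AudioTrack from the device's reported minimum rather than a fixed constant.
        int minBuf = AudioTrack.getMinBufferSize(
                ConfigManager.SAMPLE_RATE,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT);

        mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                ConfigManager.SAMPLE_RATE,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT,
                minBuf * 2,                         // some headroom over the minimum
                AudioTrack.MODE_STREAM);

        // Prime the buffer before starting playback so play() has data ready
        // instead of timing out in obtainBuffer().
        byte[] prime = new byte[minBuf];
        int primed = mDataInputStream.read(prime, 0, prime.length);
        if (primed > 0) {
            mAudioTrack.write(prime, 0, primed);
        }
        mAudioTrack.play();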

    Read the article

  • How much Java should I have learnt before trying Android programming?

    - by Sidney Yin
    Hi - I have been seeking beginner learning books in Android, and of course found out that I should learn Java first. So I began studying Java and now I am quite comfortable with objects, classes, inheritance, interfaces, and just moved onto Layouts in Swing as well as Swing Features. But I am starting to wonder.... do I know enough about Java now? Can I start programming Android yet? Of course I can keep going in Java, but have been itching to begin programming Android apps. Any definitive answer here about how much Java I need to know before Android? Thanks so much!

    Read the article

  • Java ME SDK 3.2 is now live

    - by SungmoonCho
    Hi everyone, It has been a while since we released the last version. We have been very busy integrating new features and making lots of usability improvements into this new version. Datasheet is available here. Please visit Java ME SDK 3.2 download page to get the latest and best version yet! Some of the new features in this version are described below. Embedded Application SupportOracle Java ME SDK 3.2 now supports the new Oracle® Java ME Embedded. This includes support for JSR 228, the Information Module Profile-Next Generation API (IMP-NG). You can test and debug applications either on the built-in device emulators or on your device. Memory MonitorThe Memory Monitor shows memory use as an application runs. It displays a dynamic detailed listing of the memory usage per object in table form, and a graphical representation of the memory use over time. Eclipse IDE supportOracle Java ME SDK 3.2 now officially supports Eclipse IDE. Once you install the Java ME SDK plugins on Eclipse, you can start developing, debugging, and profiling your mobile or embedded application. Skin CreatorWith the Custom Device Skin Creator, you can create your own skins. The appearance of the custom skins is generic, but the functionality can be tailored to your own specifications.  Here are the release highlights. Implementation and support for the new Oracle® Java Wireless Client 3.2 runtime and the Oracle® Java ME Embedded runtime. The AMS in the CLDC emulators has a new look and new functionality (Install Application, Manage Certificate Authorities and Output Console). Support for JSR 228, the Information Module Profile-Next Generation API (IMP-NG). The IMP-NG platform is implemented as a subset of CLDC. Support includes: A new emulator for headless devices. Javadocs for the following Oracle APIs: Device Access API, Logging API, AMS API, and AccessPoint API. New demos for IMP-NG features can be run on the emulator or on a real device running the Oracle® Java ME Embedded runtime. New Custom Device Skin Creator. This tool provides a way to create and manage custom emulator skins. The skin appearance is generic, but the functionality, such as the JSRs supported or the device properties, are up to you. This utility only supported in NetBeans. Eclipse plugin for CLDC/MIDP. For the first time Oracle Java ME SDK is available as an Eclipse plugin. The Eclipse version does not support CDC, the Memory Monitor, and the Custom Device Skin Creator in this release. All Java ME tools are implemented as NetBeans plugins. As of the plugin integrates Java ME utilities into the standard NetBeans menus. Tools > Java ME menu is the place to launch Java ME utilities, including the new Skin Creator. Profile > Java ME is the place to work with the Network Monitor and the Memory Monitor. Use the standard NetBeans tools for debugging. Profiling, Network monitoring, and Memory monitoring are integrated with the NetBeans profiling tools. New network monitoring protocols are supported in this release: WMA, SIP, Bluetooth and OBEX, SATSA APDU and JCRMI, and server sockets. Java ME SDK Update Center. Oracle Java ME SDK can be updated or extended by new components. The Update Center can download, install, and uninstall plugins specific to the Java ME SDK. A plugin consists of runtime components and skins. Bug fixes and enhancements. This version comes with a few known problems. All of them have workarounds, so I hope you don't get stuck in these issues when you are using the product. 
You cannot watch static variables during an Eclipse debugging session, and sometimes the Variable view cannot show data. In the source code, move the mouse over the required variable to inspect the variable value. A real device shown in the Device Selector is deleted from the Device Manager, yet it still appears. Kill the device manager in the system tray, and relaunch it. Then you will see the device removed from the list. On-device profiling does not work on a device. CPU profiling, network monitoring, and memory monitoring do not work on the device, since the device runtime does not yet support it. Please do the profiling with your emulator first, and then test your application on the device. In the Device Selector, using Clean Database on a real external device causes a null pointer exception. External devices do not have a database recognized by the SDK, so you can disregard this exception message. Suspending the Emulator during a Memory Monitor session hangs the emulator. Do not use the Suspend option (F5) while the Memory Monitor is running. If the emulator is hung, open the Windows task manager and stop the emulator process (javaw). To switch to another application while the Memory Monitor is running, choose Application > AMS Home (F4), and select a different application. Please let us know how we can make it even better by sending us your feedback. -Java ME SDK Team

    Read the article

  • Running Windows Phone Developers Tools CTP under VMWare Player - Yes you can! - But do you want to?

    - by Liam Westley
    This blog is the result of a quick investigation of running the Windows Phone Developer Tools CTP under VMWare Player.  In the release notes for Windows Phone Developer Tools CTP it mentions that it is not supported under VirtualPC or Hyper-V.  Some developers have policies where ‘no non-production code’ can be installed on their development workstation and so the only way they can use a CTP like this is in a virtual machine. The dilemma here is that the emulator for Windows Phone itself is a virtual machine and running a virtual machine within another virtual machine is normally frowned upon.  Even worse, previous Windows Mobile emulators detected they were in a virtual machine and refused to run.  Why VMWare? I selected VMWare as a possible solution as it is possible to run VMWare ESXi under VMWare Workstation by manually setting configuration options in the VMX configuration file so that it does not detect the presence of a virtual environment. I actually found that I could use VMWare Player (the free version, that can now create VM images) and that there was no need for any editing of the configuration file (I tried various switches, none of which made any difference to performance). So you can run the CTP under VMWare Player, that’s the good news. The bad news is that it is incredibly slow, bordering on unusable.  However, if it’s the only way you can use the CTP, at least this is an option. VMWare Player configuration I used the latest VMWare Player, 3.0, running under Windows x64 on my HP 6910p laptop with an Intel T7500 Dual Core CPU running at 2.2GHz, 4Gb of memory and using a separate drive for the virtual machines. I created a machine in VMWare Player with a single CPU, 1536 Mb memory and installed Windows 7 x64 from an ISO image.  I then performed a Windows Update, installed VMWare Tools, and finally the Windows Phone Developer Tools CTP After a few warnings about performance, I configured Windows 7 to run with Windows 7 Basic theme rather than use Aero (which is available under VMWare Player as it has a WDDM driver). Timings As a test I first launched Microsoft Visual Studio 2010 Express for Windows Phone, and created a default Windows Phone Application project.  I then clicked the run button, which starts the emulator and then loads the default application onto the emulator. For the second test I left the emulator running, stopped the default application, added a single button to change the page title and redeployed to the already running emulator by clicking the run button.   Test 1 (1st run) Test 2 (emulator already running)   VMWare Player 10 minutes  1 minute   Windows x64 native 1 minute  < 10 seconds   Conclusion You can run the Windows Phone Developer Tools CTP under VMWare Player, but it’s really, really slow and you would have to have very good reasons to try this approach. If you need to keep a development system free of non production code, and the two systems aren’t required to run simultaneously, then I’d consider a boot from VHD option.  Then you can completely isolate the Windows Phone Developer Tools CTP and development environment into a single VHD separate from your main development system.

    Read the article

  • Dunod unveils its "Cookbook" collection: discover pro recipes for Google SEO and Android 4 development

    Dunod unveils its "Cookbook" collection: discover pro recipes for Google SEO and Android 4 development. "Computing is sometimes a bit like cooking: you have to combine a certain number of ingredients and steps in a very precise order," says Dunod, introducing its new Cookbook collection. The principle behind the Cookbook collection is to bring together in a single book a set of "recipes" that provide concrete answers...

    Read the article

  • Google I/O 2011: HTML5 versus Android: Apps or Web for Mobile Development?

    Google I/O 2011: HTML5 versus Android: Apps or Web for Mobile Development? Reto Meier, Michael Mahemoff Native apps or mobile web? It's often a hard choice when deciding where to invest your mobile development resources. While the mobile web continues to grow, native apps and App Stores are incredibly popular. We will present both perspectives in an app development smackdown. From: GoogleDevelopers Views: 13367 73 ratings Time: 01:01:35 More in Science & Technology

    Read the article

  • "Android, the most insecure mobile OS there is": who is to blame? A simple migration to Jelly Bean would be enough to solve several problems

    "Android, the most insecure mobile OS there is." Who is to blame? A simple migration to Jelly Bean 4.2 would be enough to solve most of the problems that users of the mobile OS run into. Google dominates the mobile ecosystem with its mobile OS. Unfortunately, it has become a prime target for hackers, who have made it their favourite playground. A recent report from players in the mobile security field (to which an entire article on DVP was devoted) reports an increase...

    Read the article

  • ifconfig : Command 'ifconfig' is available in '/sbin/ifconfig'

    - by Sahil Grover
    My question is related to another open question. My echo $PATH gives output like this: /home/sahil/.rvm/gems/ruby-1.9.3-p125/bin:/home/sahil/.rvm/gems/ruby-1.9.3-p125@global/bin:/home/sahil/.rvm/rubies/ruby-1.9.3-p125/bin:/home/sahil/.rvm/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/home/sahil/.rvm/bin{}:/home/android-sdks/{}:/home/android-sdks/platform-tools/{}:/home/android-sdks/tools/{}:/home/sahil/android-sdks/tools{}:/home/sahil/android-sdks/tools:/home/sahil/android-sdks/platform-tools/ But running ifconfig gives output like this: Command 'ifconfig' is available in '/sbin/ifconfig' The command could not be located because '/sbin' is not included in the PATH environment variable. This is most likely caused by the lack of administrative privileges associated with your user account. ifconfig: command not found After running the command given in the other question, export PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games", ifconfig works, but other Ruby, Rails and RVM commands stop working. I'm looking for help on how to resolve this, and also on why it happens.

    Read the article

  • Does Android violate the Linux kernel license? Experts point to a problem even more complex than the conflict with Oracle over Java

    Does Android violate the Linux kernel license? Experts warn of a problem that could be even more complex than the conflict with Oracle over Java. As if the lawsuit brought by Oracle were not enough, intellectual-property experts have looked into the Google mobile OS's use of the GPL Version 2 licensed code base of the Linux kernel. They warn of an issue that could be even more complex for Google than the standoff with Oracle...

    Read the article
