Search Results

Search found 16950 results on 678 pages for 'android graphics'.

Page 54/678 | < Previous Page | 50 51 52 53 54 55 56 57 58 59 60 61  | Next Page >

  • How To Use USB Drives With the Nexus 7 and Other Android Devices

    - by Chris Hoffman
    The Nexus 7 may not have a lot of storage space – especially the original 8 GB model – but you can connect a USB drive to it if you want to watch videos or access other files. Unfortunately, Android doesn’t automatically mount USB drives by default. You’ll need to root your device to enable support for USB drives.

    Read the article

  • ORE graphics using Remote Desktop Protocol

    - by Sherry LaMonica
    Oracle R Enterprise graphics are returned as raster, or bitmap graphics. Raster images consist of tiny squares of color information referred to as pixels that form points of color to create a complete image. Plots that contain raster images render quickly in R and create small, high-quality exported image files in a wide variety of formats. However, it is a known issue that the rendering of raster images can be problematic when creating graphics using a Remote Desktop connection. Raster images do not display in the windows device using Remote Desktop under the default settings. This happens because Remote Desktop restricts the number of colors when connecting to a Windows machine to 16 bits per pixel, and interpolating raster graphics requires many colors, at least 32 bits per pixel. For example, this simple embedded R image plot will be returned in a raster-based format using a standalone Windows machine:  R> library(ORE) R> ore.connect(user="rquser", sid="orcl", host="localhost", password="rquser", all=TRUE)  R> ore.doEval(function() image(volcano, col=terrain.colors(30))) Here, we first load the ORE packages and connect to the database instance using database login credentials. The ore.doEval function executes the R code within the database embedded R engine and returns the image back to the client R session. Over a Remote Desktop connection under the default settings, this graph will appear blank due to the restricted number of colors. Users who encounter this issue have two options to display ORE graphics over Remote Desktop: either raise Remote Desktop's Color Depth or direct the plot output to an alternate device. Option #1: Raise Remote Desktop Color Depth setting In a Remote Desktop session, all environment variables, including display variables determining Color Depth, are determined by the RDP-Tcp connection settings. For example, users can reduce the Color Depth when connecting over a slow connection. The different settings are 15 bits, 16 bits, 24 bits, or 32 bits per pixel. To raise the Remote Desktop color depth: On the Windows server, launch Remote Desktop Session Host Configuration from the Accessories menu. Under Connections, right click on RDP-Tcp and select Properties. On the Client Settings tab, either uncheck Limit Maximum Color Depth or set it to 32 bits per pixel. Click Apply, then OK, log out of the remote session and reconnect. After reconnecting, the Color Depth on the Display tab will be set to 32 bits per pixel. Raster graphics will now display as expected. For ORE users, the increased color depth results in slightly reduced performance during plot creation, but the graph will be created instead of displaying an empty plot. Option #2: Direct plot output to alternate device Plotting to a non-windows device is a good option if it's not possible to increase Remote Desktop Color Depth, or if performance is degraded when creating the graph. Several device drivers are available for off-screen graphics in R, such as postscript, pdf, and png. On-screen devices include windows, X11 and Cairo. Here we output to the Cairo device to render an on-screen raster graphic. The grid.raster function in the grid package is analogous to other grid graphical primitives - it draws a raster image within the current plot's grid.
R> options(device = "CairoWin") # use Cairo device for plotting during the session R> library(Cairo) # load Cairo, grid and png libraries  R> library(grid) R> library(png)  R> res <- ore.doEval(function()image(volcano,col=terrain.colors(30))) # create embedded R plot  R> img <- ore.pull(res, graphics = TRUE)$img[[1]] # extract image  R> grid.raster(as.raster(readPNG(img)), interpolate = FALSE) # generate raster graph R> dev.off() # turn off first device   By default, the interpolate argument to grid.raster is TRUE, which means that what is actually drawn by R is a linear interpolation of the pixels in the original image. Setting interpolate to FALSE uses a sample from the pixels in the original image. A list of graphics devices available in R can be found in the Devices help file from the grDevices package: R> help(Devices)

    Read the article

  • 3D architecture app for Android or iPhone

    - by Manixate
    I want to make an app for 3D modeling on iPhone/Android, but I cannot get a basic idea of how to get started. I have various options such as learning OpenGL ES, UDK or Unity3d, but I want to create models (e.g. architecture) in my app and then render them when the user is finished modeling. I do not know whether I can design models and then render them in the same app, with various effects, on the iPhone/Android using UDK or Unity3d. (Note: if you find this question unclear, please ask; I may have skipped some vital information.)

    Read the article

  • Snooker Android Application [closed]

    - by Rzarect
    I am currently working on my final year project / dissertation for university and I have a "crazy" idea for it. I was thinking of designing an Android app for snooker players, bars or tournaments: an app that will use the phone's camera to detect every movement and change on the table and at the same time keep the score for the players without any human input. I want to know whether it is an impossible thing. If it is plausible, I really need some ideas and advice on where to start. I should say that I have some experience in Android development and I have already started to read a lot of articles and projects about shape detection, color detection and edge detection.
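    A minimal sketch of the kind of per-frame processing such an app would start with, using the OpenCV Android bindings (the class name, blur kernel and Canny thresholds are illustrative assumptions, not a calibrated snooker detector):

        import org.opencv.android.Utils;
        import org.opencv.core.Mat;
        import org.opencv.core.Size;
        import org.opencv.imgproc.Imgproc;
        import android.graphics.Bitmap;

        public class TableFrameAnalyzer {
            // Turn a camera frame into an edge map, a common first step before
            // looking for the table cushions and ball outlines.
            public Mat detectEdges(Bitmap frame) {
                Mat rgba = new Mat();
                Utils.bitmapToMat(frame, rgba);                      // Bitmap -> OpenCV matrix
                Mat gray = new Mat();
                Imgproc.cvtColor(rgba, gray, Imgproc.COLOR_RGBA2GRAY);
                Imgproc.GaussianBlur(gray, gray, new Size(5, 5), 0); // reduce sensor noise
                Mat edges = new Mat();
                Imgproc.Canny(gray, edges, 50.0, 150.0);             // thresholds are illustrative
                return edges;
            }
        }

    Comparing the ball positions found in consecutive frames (for example with Hough circle detection plus color segmentation on the same Mat) is then what would drive the automatic scoring.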

    Read the article

  • Google accidentally revealed Nexus 5 phone in Android Kitkat launch video

    - by Gopinath
    Today Google officially announced KitKat as the name of the next version of Android OS (v4.4). They posted a video to showcase the unveiling of the KitKat statue at the company’s headquarters, but they accidentally revealed much more. 9to5Google spotted an unreleased Android smartphone in the video and posted screen grabs of Googlers using it to take photos. This unreleased phone could be the next version of the Google Nexus phone. A few days ago Google reduced the price of Nexus 4 smartphones by $100, and the 8 GB version is reported out of inventory. With all these signs, the launch of the next Google Nexus smartphone seems to be just weeks away. Soon after the news of the leaked smartphone spread on blogs, Google made the KitKat launch video private. But a few bloggers managed to save a copy of the video and uploaded it to other video-sharing websites. Check the embedded video.

    Read the article

  • Android Application for Final Year Project [closed]

    - by user1070241
    I hope this is the right place to post this question. Basically, I'm about to choose a Final Year Project for my third and final year in BSc Computer Science. I have worked with different apps and therefore I do have some experience with the Android SDK Platform in general. However, my question is this, how do you think an Android based project would go down with potential employers? I personally don't think the complexity of this project is lower than other projects proposed by my university. Please let me know what you think, and do share any experiences that you have had with this, if any. Thank you very much.

    Read the article

  • Android opengles 2.0 :different resolutions rendering and input

    - by kkan
    I'm currently developing a sprite-based 2D game for Android using OpenGL ES 2.0. I've got some basic rendering done that mimics the SpriteBatch functionality of XNA (draw sprite, rotation, color). All of this works for a fixed projection matrix, but Android has a lot of screen sizes. Q1) Would this be an okay method to scale the drawing up/down? 1) Draw the whole screen to a texture. 2) Draw that texture as a quad to the device. I found the above through some searching; I'm not sure if it's the best one. Are there any alternatives? Q2) How do you handle input for different resolutions? I currently get the position of a touch and use it raw. Would it be okay to get the position, scale it to the size of the texture used for rendering, and then perform calculations on it? Thanks.
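    On Q2, scaling the raw touch position into the fixed render-target resolution is a common approach. A minimal sketch (the virtual resolution numbers are illustrative assumptions):

        // Maps raw touch coordinates from the physical screen into the fixed
        // "virtual" resolution the game renders to, so game logic never sees
        // the real device size.
        public class TouchScaler {
            private static final float VIRTUAL_WIDTH  = 800f; // assumed design resolution
            private static final float VIRTUAL_HEIGHT = 480f;

            private final float screenWidth;
            private final float screenHeight;

            public TouchScaler(float screenWidth, float screenHeight) {
                this.screenWidth = screenWidth;
                this.screenHeight = screenHeight;
            }

            public float toVirtualX(float rawX) {
                return rawX * VIRTUAL_WIDTH / screenWidth;
            }

            public float toVirtualY(float rawY) {
                return rawY * VIRTUAL_HEIGHT / screenHeight;
            }
        }

    In onTouchEvent(), event.getX() and event.getY() would be passed through the scaler before the game logic uses them, so gameplay code works in the same coordinate space as the render texture.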

    Read the article

  • Starting android Development

    - by Tamim Ad Dari
    I am considering learning Android development. I have some basic knowledge of C++. I downloaded the ADT plugin and Eclipse. Now, while starting from http://developers.android.com, I see the code is in XML. So I googled for learning XML. The best site I found was http://www.w3schools.org, but there I found that to learn XML I have to learn HTML and CSS. So I learned the basics of HTML and CSS. But now I find that learning Java is a must. So can someone give me an idea of the sequence of languages I should study now? Should I learn PHP and MySQL too? BTW, I have a dream to work at Google :p

    Read the article

  • Ad-hoc network(hot spot created) not detected by android phone

    - by Nirmik
    I created a hotspot via the network menu's "Use as Hotspot" option. Other laptops successfully detected the hotspot and could connect and use it without any problem. But no Android phone could detect it! I tried with many Android phones, but none could. (I found out that even an ad-hoc network created in Windows is not detected by any phone.) So is there anything wrong I am doing (rather, I am not doing anything other than clicking the "Use as Hotspot" button, but still)? Or is it that the network cannot be shared with phones? And if not, how can I share it with phones?

    Read the article

  • Android loading screens blocking, good practice?

    - by Oren
    I've noticed many (if not all) Android games don't support the "back" button during their loading screens, which leads to some frustrating moments when a user accidentally starts up the game and has to wait for the long loading stage to end in order to close it. So my questions are: 1) Why is that? Is there a good reason to avoid something like asynchronous loading (or some other solution to this problem) in Android games? 2) If there is no good reason not to support this functionality, what would be the best way to accomplish it?
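    A minimal sketch of one way to keep the back button responsive: run the load in a cancellable AsyncTask so the UI thread never blocks (loadGameAssets() and startGame() are hypothetical placeholders, not from the question):

        import android.app.Activity;
        import android.os.AsyncTask;
        import android.os.Bundle;

        public class LoadingActivity extends Activity {
            private AsyncTask<Void, Void, Void> loadTask;

            @Override
            protected void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                // Load assets off the UI thread so the activity keeps handling input.
                loadTask = new AsyncTask<Void, Void, Void>() {
                    @Override
                    protected Void doInBackground(Void... params) {
                        loadGameAssets();   // hypothetical long-running load
                        return null;
                    }

                    @Override
                    protected void onPostExecute(Void result) {
                        startGame();        // hypothetical: switch to the game screen
                    }
                };
                loadTask.execute();
            }

            @Override
            public void onBackPressed() {
                // The UI thread is free, so back still works: stop the load and leave.
                if (loadTask != null) {
                    loadTask.cancel(true);
                }
                super.onBackPressed();
            }

            private void loadGameAssets() { /* ... */ }
            private void startGame() { /* ... */ }
        }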

    Read the article

  • Replace Android on HP Slate 21?

    - by user288626
    I want to replace Android with Ubuntu for Android on my new HP Slate 21. As far as I know, no one has ever tried it yet, and I would appreciate it if Ask Ubuntu would help out here. Would I need to unlock and root my machine first? (In which case I will need a second machine plugged into my Slate...) Or can I just insert the bootable USB with the distribution I need and go from there as in a normal non-optical-drive situation? Also, where can I find the downloadable file/image of Ubuntu for Android as described here? I see no link with the name on it. Should I just download the normal distro and choose from the options when installing it?

    Read the article

  • How many threads should an Android game use?

    - by kvance
    At minimum, an OpenGL Android game has a UI thread and a Renderer thread created by GLSurfaceView. Renderer.onDrawFrame() should be doing a minimum of work to get the highest FPS. The physics, AI, etc. don't need to run every frame, so we can put those in another thread. Now we have: Renderer thread - Update animations and draw polys Game thread - Logic & periodic physics, AI, etc. updates UI thread - Android UI interaction only Since you don't ever want to block the UI thread, I run one more thread for the game logic. Maybe that's not necessary though? Is there ever a reason to run game logic in the renderer thread?
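    A minimal sketch of a dedicated game-logic thread ticking at a fixed rate, independent of the GLSurfaceView renderer (the tick rate and update methods are illustrative assumptions; the renderer would read a snapshot of the resulting game state each frame):

        public class GameLogicThread extends Thread {
            private static final long TICK_MS = 33; // ~30 logic updates per second (assumed)
            private volatile boolean running = true;

            @Override
            public void run() {
                while (running) {
                    long start = System.currentTimeMillis();
                    updatePhysics(); // hypothetical game-state update
                    updateAI();      // hypothetical AI update
                    long sleep = TICK_MS - (System.currentTimeMillis() - start);
                    if (sleep > 0) {
                        try {
                            Thread.sleep(sleep);
                        } catch (InterruptedException e) {
                            running = false; // stop cleanly if interrupted
                        }
                    }
                }
            }

            public void shutdown() {
                running = false;
            }

            private void updatePhysics() { /* ... */ }
            private void updateAI() { /* ... */ }
        }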

    Read the article

  • Android game scrolling background

    - by Stevanicus
    Hi There, I'm just trying to figure out the best approach for running a scrolling background on an Android device. The method I have so far... it's pretty laggy. I use threads, which I believe is not the best bet for Android platforms: @Override public void run() { // Game Loop while(runningThread){ //Scroll background down bgY += 1; try { this.postInvalidate(); Thread.sleep(10); } catch (Exception e) { // TODO Auto-generated catch block e.printStackTrace(); } } } where postInvalidate() triggers the onDraw function, which simply pushes the background image down: canvas.drawBitmap(backgroundImage, bgX, bgY, null); Thanks in advance
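    A minimal sketch of the SurfaceView alternative, where a dedicated thread draws straight to the surface instead of calling postInvalidate() and waiting for the UI thread (names are illustrative; bitmap loading and lifecycle wiring are omitted):

        import android.content.Context;
        import android.graphics.Bitmap;
        import android.graphics.Canvas;
        import android.view.SurfaceHolder;
        import android.view.SurfaceView;

        public class ScrollingBackgroundView extends SurfaceView implements Runnable {
            private Thread renderThread;
            private volatile boolean running;
            private Bitmap backgroundImage; // assumed to be loaded elsewhere
            private float bgY = 0f;

            public ScrollingBackgroundView(Context context) {
                super(context);
            }

            @Override
            public void run() {
                SurfaceHolder holder = getHolder();
                while (running) {
                    if (!holder.getSurface().isValid()) {
                        continue; // surface not ready yet
                    }
                    Canvas canvas = holder.lockCanvas();
                    if (canvas == null) {
                        continue;
                    }
                    bgY += 1f; // scroll the background down one pixel per frame
                    canvas.drawBitmap(backgroundImage, 0f, bgY, null);
                    holder.unlockCanvasAndPost(canvas);
                }
            }

            public void resumeDrawing() {  // call from Activity.onResume()
                running = true;
                renderThread = new Thread(this);
                renderThread.start();
            }

            public void pauseDrawing() {   // call from Activity.onPause()
                running = false;
            }
        }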

    Read the article

  • Multiple weapons for android game

    - by Z3r0
    I am trying to make a 3D game for Android using the Rajawali engine to render the 3D graphics and Blender for designing my models (exporting as .md2), and I want my character to be able to change weapons, armor, helm, etc. Rendering every possible animation would be too much: if I had 10 different weapons, 10 armors and 10 helms, I would have to create 1000 animations for every possible combination of equipment, and if I add boots to the list it would be even worse. I read somewhere you can use bones for this, but in Android I only get the object itself to work with. Does anyone have an idea how I can solve this? If I make the weapon a separate object, how do I parent it to my models in my game?
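    Independent of the engine, the usual answer is to keep the weapon as a separate object and multiply its local transform by the hand bone's world transform every frame. A minimal sketch with plain android.opengl.Matrix arrays (how the bone matrix is obtained from Rajawali's animation data is left open as an assumption):

        import android.opengl.Matrix;

        public class WeaponAttachment {
            // World transform of the hand bone, updated by the character animation each frame.
            private final float[] handBoneWorld = new float[16];
            // Fixed offset of the weapon relative to the hand (grip position/rotation).
            private final float[] weaponLocal = new float[16];
            // Result: the matrix to render the weapon mesh with.
            private final float[] weaponWorld = new float[16];

            public WeaponAttachment() {
                Matrix.setIdentityM(handBoneWorld, 0);
                Matrix.setIdentityM(weaponLocal, 0);
                Matrix.translateM(weaponLocal, 0, 0.0f, 0.1f, 0.0f); // illustrative grip offset
            }

            // weaponWorld = handBoneWorld * weaponLocal
            public float[] computeWeaponWorld() {
                Matrix.multiplyMM(weaponWorld, 0, handBoneWorld, 0, weaponLocal, 0);
                return weaponWorld;
            }
        }

    With this, only the character's skeleton needs animating; each weapon, helm or boot mesh just follows its attachment matrix, so 10 of each means 30 meshes rather than 1000 baked animations.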

    Read the article

  • Wi-Fi triangulation using android smartphone

    - by user1887020
    How do I make an application for Wi-Fi triangulation on the Android platform? This project will be implemented inside a building. No GPS is needed, just Wi-Fi, doing triangulation to get the current position of the user inside the building. I have a minimum of 3 access points to implement it. But how do I start coding in Android and integrate the triangulation into the Android code? I have the algorithm to do it, but is there any chance that I can get it done? This project actually aims to replace the floor directory board with a smartphone floor directory, so that users can find their way to their room, for example to the lab. public class Triangulation { public Triangulation() { int dist_1, dist_2, dist_3; //variable for the distances int x1, x2, x3; //coordinates of x int y1, y2, y3; //coordinates of y int final_dist1, final_dist2; //final distance after calc int initial_a1, initial_a2, initial_b1, initial_b2; //row coefficients used below dist_1 = 1; dist_2 = 2; dist_3 = 3; x1 = 5; //test inputs x2 = 2; x3 = 4; y1 = 2; y2 = 2; y3 = 5; final_dist1 = ((dist_1 * dist_1) - (dist_2 * dist_2) - (x1 * x1) + (x2 * x2) - (y1 * y1) + (y2 * y2)) / 2; final_dist2 = ((dist_2 * dist_2) - (dist_3 * dist_3) - (x2 * x2) + (x3 * x3) - (y2 * y2) + (y3 * y3)) / 2; initial_a1 = x1 - x2; initial_a2 = x2 - x3; initial_b1 = y1 - y2; initial_b2 = y2 - y3; //-----------------------STEP 1-------------------------------------- int a1 = initial_a1 / initial_a1; int a2 = initial_a2 / initial_a1; int b1 = initial_b1 / initial_a1; int b2 = initial_b2 / initial_a1; final_dist1 /= initial_a1; final_dist2 /= initial_a1; //-----------------------STEP 2-------------------------------------- a2 = a2 - a2; final_dist2 = -(initial_a2) * final_dist1 + final_dist2; //-----------------------STEP 3-------------------------------------- a2 /= b2; final_dist2 = final_dist2 / b2; b2 /= b2; //-------------------------STEP 4----------------------------------- b1 = b1 - b1; final_dist1 = -(initial_b1) * final_dist2 + final_dist1; } }
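    A minimal sketch of the Android side of gathering the inputs: per-access-point signal strengths from WifiManager, which is where the dist_1..dist_3 values for the triangulation would come from (the RSSI-to-distance conversion below is a crude placeholder, not a calibrated model):

        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;
        import android.content.Context;
        import android.net.wifi.ScanResult;
        import android.net.wifi.WifiManager;

        public class ApDistanceEstimator {
            private final WifiManager wifiManager;

            public ApDistanceEstimator(Context context) {
                wifiManager = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
            }

            // Returns a rough distance estimate (metres) for each visible access point,
            // keyed by BSSID. Assumes a scan has already been started.
            public Map<String, Double> estimateDistances() {
                Map<String, Double> distances = new HashMap<String, Double>();
                List<ScanResult> results = wifiManager.getScanResults();
                for (ScanResult ap : results) {
                    distances.put(ap.BSSID, rssiToDistance(ap.level));
                }
                return distances;
            }

            // Free-space path-loss style estimate; the constants are illustrative assumptions.
            private double rssiToDistance(int rssiDbm) {
                double txPowerAtOneMetre = -40.0; // assumed RSSI at 1 m
                double pathLossExponent = 2.7;    // assumed indoor environment
                return Math.pow(10.0, (txPowerAtOneMetre - rssiDbm) / (10.0 * pathLossExponent));
            }
        }

    The three strongest known BSSIDs would then be matched to their surveyed (x, y) positions and fed into the Triangulation math above; the app also needs the ACCESS_WIFI_STATE (and, for triggering scans, CHANGE_WIFI_STATE) permission.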

    Read the article

  • Libgdx and android scaling

    - by petervaz
    Following my previous question, I decided to migrate my AndEngine game to libgdx to have the desktop option. The game assets were planned, at first, for a 1080x600 resolution, and I raised that to 1200x800, which is native for many tablets and would look better on monitors. I followed this blog's approach regarding aspect ratio, which worked nicely on the desktop version, but running the Android version (on my smaller tablet), the background would still appear at the original size, cropped by the smaller screen. How can I force a resize of the background (or, for that matter, of everything) on Android to fit the screen?
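    A minimal sketch of one common libgdx pattern: an OrthographicCamera with a fixed 1200x800 virtual resolution, so everything is drawn in virtual units and scaled to whatever screen it runs on (this simple form stretches rather than letterboxes; the asset name is an illustrative assumption):

        import com.badlogic.gdx.ApplicationAdapter;
        import com.badlogic.gdx.Gdx;
        import com.badlogic.gdx.graphics.GL20;
        import com.badlogic.gdx.graphics.OrthographicCamera;
        import com.badlogic.gdx.graphics.Texture;
        import com.badlogic.gdx.graphics.g2d.SpriteBatch;

        public class ScalingGame extends ApplicationAdapter {
            private static final float VIRTUAL_WIDTH = 1200f;
            private static final float VIRTUAL_HEIGHT = 800f;

            private OrthographicCamera camera;
            private SpriteBatch batch;
            private Texture background;

            @Override
            public void create() {
                camera = new OrthographicCamera();
                camera.setToOrtho(false, VIRTUAL_WIDTH, VIRTUAL_HEIGHT);
                batch = new SpriteBatch();
                background = new Texture("background.png"); // hypothetical asset name
            }

            @Override
            public void render() {
                Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
                camera.update();
                batch.setProjectionMatrix(camera.combined);
                batch.begin();
                // Drawn in virtual units, so it fills the scene regardless of screen size.
                batch.draw(background, 0, 0, VIRTUAL_WIDTH, VIRTUAL_HEIGHT);
                batch.end();
            }
        }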

    Read the article

  • Supporting early versions of Android

    - by Philip Sheard
    What policy do developers have when it comes to supporting earlier versions of Android? I still support Android 2.1 and above, but this means that I am unable to use features such as the action bar. Over 40% of my users are still running versions below 3.0, so I feel somewhat constrained about this. The problem is that 3.x was not very successful, so 2.3.x will be with us for some time. But all new devices will now be shipping with 4.x. I am wondering whether 4.x users are more likely to pay for an app, while most 2.3.x users are just looking.
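    On the practical side, the usual pattern for keeping 2.1 support while still using newer features is to gate them behind a runtime version check. A minimal sketch (the helper name is illustrative):

        import android.os.Build;

        public class CompatHelper {
            // True on Honeycomb (API 11) and later, where the native action bar exists;
            // older releases fall back to a custom title layout or a compatibility library.
            public static boolean hasNativeActionBar() {
                return Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB;
            }
        }

    With minSdkVersion 7 and a higher targetSdkVersion, code paths guarded this way are simply skipped on 2.x devices; on very old Dalvik versions the newer calls may also need to be isolated in their own classes to keep the verifier happy.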

    Read the article

  • android application that visualize real time data

    - by matarsak
    I want to build an Android app that visualizes real-time data. I set up a UDP channel that gets the data, and now I want to visualize it. I know that I can use OpenGL ES, but I have no background in it, and in a few weeks I don't think I'll be able to learn it. What about Android Processing? Could it be used for an extensive visualization task like this, or is it limited? I heard it's not hard to learn. Any other options?
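    If OpenGL ES feels like too much for the time available, a plain custom View with Canvas drawing handles a surprising amount of real-time plotting. A minimal sketch (the buffer size and vertical scaling are illustrative assumptions; the UDP reader thread would call addSample() as packets arrive):

        import android.content.Context;
        import android.graphics.Canvas;
        import android.graphics.Paint;
        import android.view.View;

        public class LineChartView extends View {
            private static final int MAX_SAMPLES = 200;        // assumed window size
            private final float[] samples = new float[MAX_SAMPLES];
            private int count = 0;
            private final Paint paint = new Paint();

            public LineChartView(Context context) {
                super(context);
                paint.setStrokeWidth(2f);
            }

            // Called from the UDP receive thread whenever a new value arrives.
            public void addSample(final float value) {
                post(new Runnable() {          // hop onto the UI thread
                    @Override
                    public void run() {
                        if (count < MAX_SAMPLES) {
                            samples[count++] = value;
                        } else {
                            System.arraycopy(samples, 1, samples, 0, MAX_SAMPLES - 1);
                            samples[MAX_SAMPLES - 1] = value;
                        }
                        invalidate();          // request a redraw
                    }
                });
            }

            @Override
            protected void onDraw(Canvas canvas) {
                float xStep = getWidth() / (float) MAX_SAMPLES;
                for (int i = 1; i < count; i++) {
                    canvas.drawLine((i - 1) * xStep, getHeight() - samples[i - 1],
                                    i * xStep, getHeight() - samples[i], paint);
                }
            }
        }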

    Read the article

  • Android best source codes [on hold]

    - by lynndragon
    1) I would like to know the best and simplest Android source code sites or forums for game development, especially for animation and graphics. 2) By the way, I'm now learning Adobe AIR for Android. Is it useful? I mean, Adobe AIR does not require much programming knowledge and it's simple, but the weakness of Adobe AIR apps is that AdobeAir.apk must be installed; if not, they cannot run. So what are your suggestions? Please answer me. Regards

    Read the article

  • Android–Supporting Multiple Devices Part 2

    - by rodelljr
    Originally posted on: http://geekswithblogs.net/rodelljr/archive/2014/06/03/androidsupporting-multiple-devices-part-2.aspx I would like to thank the KC Mobile App Developers group for allowing me to do my presentation, Supporting Multiple Devices Part 2. In this talk, I discussed a Master/Detail application that runs on Android 2.2 devices and forward. I used the ActionBar compat Android library to allow the older devices to have an action bar. I also used a SQLite database for my data. And just to add one last thing, I also incorporated the use of custom fonts. If you are interested in looking at this sample application, I have uploaded it to my GitHub account here. I also have a PowerPoint presentation, which you can get here. I will be doing this talk again at a later date this year, so when I have a date and location, I will post a follow-up blog.

    Read the article

  • Android: OutOfMemoryError while uploading video...

    - by AP257
    Hi all, I have the same problem as described here, but I will supply a few more details. While trying to upload a video in Android, I'm reading it into memory, and if the video is large I get an OutOfMemoryError. Here's my code: // get bytestream to upload videoByteArray = getBytesFromFile(cR, fileUriString); public static byte[] getBytesFromFile(ContentResolver cR, String fileUriString) throws IOException { Uri tempuri = Uri.parse(fileUriString); InputStream is = cR.openInputStream(tempuri); byte[] b3 = readBytes(is); is.close(); return b3; } public static byte[] readBytes(InputStream inputStream) throws IOException { ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream(); // this is storage overwritten on each iteration with bytes int bufferSize = 1024; byte[] buffer = new byte[bufferSize]; int len = 0; while ((len = inputStream.read(buffer)) != -1) { byteBuffer.write(buffer, 0, len); } return byteBuffer.toByteArray(); } And here's the traceback (the error is thrown on the byteBuffer.write(buffer, 0, len) line): 04-08 11:56:20.456: ERROR/dalvikvm-heap(6088): Out of memory on a 16775184-byte allocation. 04-08 11:56:20.456: INFO/dalvikvm(6088): "IntentService[UploadService]" prio=5 tid=17 RUNNABLE 04-08 11:56:20.456: INFO/dalvikvm(6088): | group="main" sCount=0 dsCount=0 s=N obj=0x449a3cf0 self=0x38d410 04-08 11:56:20.456: INFO/dalvikvm(6088): | sysTid=6119 nice=0 sched=0/0 cgrp=default handle=4010416 04-08 11:56:20.456: INFO/dalvikvm(6088): at java.io.ByteArrayOutputStream.expand(ByteArrayOutputStream.java:~93) 04-08 11:56:20.456: INFO/dalvikvm(6088): at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:218) 04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.readBytes(UploadService.java:199) 04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.getBytesFromFile(UploadService.java:182) 04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.doUploadinBackground(UploadService.java:118) 04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.onHandleIntent(UploadService.java:85) 04-08 11:56:20.456: INFO/dalvikvm(6088): at android.app.IntentService$ServiceHandler.handleMessage(IntentService.java:30) 04-08 11:56:20.456: INFO/dalvikvm(6088): at android.os.Handler.dispatchMessage(Handler.java:99) 04-08 11:56:20.456: INFO/dalvikvm(6088): at android.os.Looper.loop(Looper.java:123) 04-08 11:56:20.456: INFO/dalvikvm(6088): at android.os.HandlerThread.run(HandlerThread.java:60) 04-08 11:56:20.467: WARN/dalvikvm(6088): threadid=17: thread exiting with uncaught exception (group=0x4001b180) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): Uncaught handler: thread IntentService[UploadService] exiting due to uncaught exception 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): java.lang.OutOfMemoryError 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at java.io.ByteArrayOutputStream.expand(ByteArrayOutputStream.java:93) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:218) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.readBytes(UploadService.java:199) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.getBytesFromFile(UploadService.java:182) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.doUploadinBackground(UploadService.java:118) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at 
com.android.election2010.UploadService.onHandleIntent(UploadService.java:85) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.app.IntentService$ServiceHandler.handleMessage(IntentService.java:30) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.os.Handler.dispatchMessage(Handler.java:99) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.os.Looper.loop(Looper.java:123) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.os.HandlerThread.run(HandlerThread.java:60) 04-08 11:56:20.496: INFO/Process(4657): Sending signal. PID: 6088 SIG: 3 I guess that as @DroidIn suggests, I need to upload it in chunks. But (newbie question alert) does that mean that I should make multiple PostMethod requests, and glue the file together at the server end? Or can I load the bytestream into memory in chunks, and glue it together in the Android code? If anyone could give me a clue as to the best approach, I would be very grateful.
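    A minimal sketch of the streaming alternative: rather than building the whole byte[] in memory, copy from the ContentResolver InputStream straight to the connection's OutputStream in small chunks. This uses HttpURLConnection with chunked streaming instead of the HttpClient PostMethod from the question; the buffer size and URL handling are illustrative assumptions, and the server receives one ordinary request body, so nothing has to be glued together server-side:

        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import android.content.ContentResolver;
        import android.net.Uri;

        public class VideoUploader {
            // Streams the file at fileUri to the server without loading it all into memory.
            public static int upload(ContentResolver resolver, Uri fileUri, String targetUrl)
                    throws Exception {
                HttpURLConnection conn = (HttpURLConnection) new URL(targetUrl).openConnection();
                conn.setDoOutput(true);
                conn.setRequestMethod("POST");
                conn.setChunkedStreamingMode(8 * 1024); // send 8 KB chunks, no full buffering

                InputStream in = resolver.openInputStream(fileUri);
                OutputStream out = conn.getOutputStream();
                try {
                    byte[] buffer = new byte[8 * 1024];
                    int len;
                    while ((len = in.read(buffer)) != -1) {
                        out.write(buffer, 0, len);
                    }
                } finally {
                    out.close();
                    in.close();
                }
                return conn.getResponseCode(); // single request, so no server-side reassembly
            }
        }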

    Read the article

  • Failure in Yahoo Authentication in Android

    - by Jayson Tamayo
    I'm trying to integrate Yahoo into my application. I want them to login using their Yahoo accounts because I will be needing their names later in the application. But whenever I request for a token, I receive the following errors: getRequestToken() Exception: oauth.signpost.exception.OAuthCommunicationException: Communication with the service provider failed: Service provider responded in error: 400 (Bad Request) Here is my code (Request_Token_Activity.java): import oauth.signpost.OAuth; import oauth.signpost.OAuthConsumer; import oauth.signpost.OAuthProvider; import oauth.signpost.commonshttp.CommonsHttpOAuthConsumer; import oauth.signpost.commonshttp.CommonsHttpOAuthProvider; import oauth.signpost.signature.HmacSha1MessageSigner; import android.app.Activity; import android.content.Intent; import android.content.SharedPreferences; import android.content.SharedPreferences.Editor; import android.net.Uri; import android.os.Bundle; import android.preference.PreferenceManager; import android.util.Log; public class Request_Token_Activity extends Activity { private OAuthConsumer consumer; private OAuthProvider provider; private SharedPreferences prefs; @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); try { consumer = new CommonsHttpOAuthConsumer("my consumer key", "my consumer secret"); consumer.setMessageSigner(new HmacSha1MessageSigner()); provider = new CommonsHttpOAuthProvider( "http://api.login.yahoo.com/oauth/v2/get_request_token", "http://api.login.yahoo.com/oauth/v2/get_token", "http://api.login.yahoo.com/oauth/v2/request_auth"); } catch (Exception e) { Log.e("", "onCreate Exception: " + e.toString()); } getRequestToken(); } private void getRequestToken() { try { String url = provider.retrieveRequestToken(consumer, "yahooapi://callback"); Log.i("", "Yahoo URL: " + url); Intent intent = new Intent(Intent.ACTION_VIEW, Uri.parse(url)).setFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP | Intent.FLAG_ACTIVITY_NO_HISTORY | Intent.FLAG_FROM_BACKGROUND); this.startActivity(intent); } catch (Exception e) { Log.i("", "getRequestToken() Exception: " + e.toString()); } } @Override public void onNewIntent(Intent intent) { super.onNewIntent(intent); prefs = PreferenceManager.getDefaultSharedPreferences(this); final Uri uri = intent.getData(); if (uri != null && uri.getScheme().equals("yahooapi")) { getAccessToken(uri); } } private void getAccessToken(Uri uri) { final String oauth_verifier = uri.getQueryParameter(OAuth.OAUTH_VERIFIER); try { provider.retrieveAccessToken(consumer, oauth_verifier); final Editor edit = prefs.edit(); edit.putString("YAHOO_OAUTH_TOKEN", consumer.getToken()); edit.putString("YAHOO_OAUTH_TOKEN_SECRET", consumer.getTokenSecret()); edit.commit(); String token = prefs.getString("YAHOO_OAUTH_TOKEN", ""); String secret = prefs.getString("YAHOO_OAUTH_TOKEN_SECRET", ""); consumer.setTokenWithSecret(token, secret); Log.i("", "Yahoo OAuth Token: " + token); Log.i("", "Yahoo OAuth Token Secret: " + token); } catch (Exception e) { Log.i("", "getAccessToken Exception: " + e.toString()); } } } And this is a snapshot of my AndroidManifest.xml: <activity android:name="Request_Token_Activity" android:launchMode="singleTask"> <intent-filter> <action android:name="android.intent.action.VIEW" /> <category android:name="android.intent.category.DEFAULT" /> <category android:name="android.intent.category.BROWSABLE" /> <data android:scheme="yahooapi" android:host="callback" /> </intent-filter> </activity> I have set-up my Yahoo Project as a Web Application 
and put Read and Write access to Social and Contacts. What am I doing wrong?

    Read the article

  • Android onClickListener options and help on creating a non-static array adapter

    - by CoderInTraining
    I am trying to make an application that gets data dynamically and displays it in a listView. Here is my code that I have with static data. There are a couple of things I want to try and do and can't figure out how. MainActivity.java package text.example.project; import java.util.ArrayList; import java.util.Arrays; import java.util.Collections; import android.app.ListActivity; import android.os.Bundle; import android.view.View; import android.widget.AdapterView; import android.widget.AdapterView.OnItemClickListener; import android.widget.ArrayAdapter; import android.widget.ListView; import android.widget.TextView; import android.widget.Toast; public class MainActivity extends ListActivity { //declarations private boolean isItem; private ArrayAdapter<String> item1Adapter; private ArrayAdapter<String> item2Adapter; private ArrayAdapter<String> item3Adapter; /** Called when the activity is first created. */ @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); Collections.sort(ITEM1); Collections.sort(ITEM2); Collections.sort(ITEM3); item1Adapter = new ArrayAdapter<String>(this, R.layout.list_item, ITEM1); item2Adapter = new ArrayAdapter<String>(this, R.layout.list_item, ITEM2); item3Adapter = new ArrayAdapter<String>(this, R.layout.list_item, ITEM3); setListAdapter(item1Adapter); isItem = true; ListView lv = getListView(); lv.setTextFilterEnabled(true); lv.setOnItemClickListener(new OnItemClickListener() { @Override public void onItemClick(AdapterView<?> parent, View view, int position, long id) { // When clicked, show a toast with the TextView text if (isItem) { //ITEM1.add("another\n\t" + Calendar.getInstance().getTime()); Collections.sort(ITEM1); item2Adapter.notifyDataSetChanged(); setListAdapter(item2Adapter); isItem = false; } else { item1Adapter.notifyDataSetChanged(); setListAdapter(item1Adapter); isItem = true; } Toast.makeText(getApplicationContext(), ((TextView) view).getText(), Toast.LENGTH_SHORT).show(); } }); } // need to turn dynamic static ArrayList<String> ITEM1 = new ArrayList<String> (Arrays.asList( "ITEM1-1", "ITEM1-2" )); static ArrayList<String> ITEM2 = new ArrayList<String> (Arrays.asList( "ITEM2-1", "ITEM2-2" )); static ArrayList<String> ITEM3 = new ArrayList<String> (Arrays.asList("ITEM3-1", "ITEM3-2")); } list_item.xml <?xml version="1.0" encoding="utf-8"?> <TextView xmlns:android="http://schemas.android.com/apk/res/android" android:layout_width="fill_parent" android:layout_height="fill_parent" android:padding="10dp" android:textSize="16sp" > </TextView> What I want to do is first pull from a dynamic source. I need to do what is almost exactly like this tutorial... http://androiddevelopement.blogspot.in/2011/06/android-xml-parsing-tutorial-using.html ... however, I can't get the tutorial to work... I also would like to know how I can make the list item clicks so that if I click on "ITEM1-1" it goes to the menu "ITEM2-1" and "ITEM2-2". and if "ITEM1-2" is clicked, then it goes to the menu with "ITEM3-1" and "ITEM3-2"... I am totally a noob at this whole android development thing. So any suggestions would be great!
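    A minimal sketch of one way to fill the lists dynamically, using Android's built-in XmlPullParser; the "item" tag name is an illustrative assumption about the feed format:

        import java.io.InputStream;
        import java.util.ArrayList;
        import org.xmlpull.v1.XmlPullParser;
        import android.util.Xml;

        public class ItemFeedParser {
            // Collects the text of every <item> element in the XML stream.
            public static ArrayList<String> parse(InputStream in) throws Exception {
                ArrayList<String> items = new ArrayList<String>();
                XmlPullParser parser = Xml.newPullParser();
                parser.setInput(in, null);
                int eventType = parser.getEventType();
                while (eventType != XmlPullParser.END_DOCUMENT) {
                    if (eventType == XmlPullParser.START_TAG
                            && "item".equals(parser.getName())) {   // assumed tag name
                        items.add(parser.nextText());
                    }
                    eventType = parser.next();
                }
                return items;
            }
        }

    Running this off the UI thread (for example in an AsyncTask) and then calling notifyDataSetChanged() on the ArrayAdapter keeps the UI responsive; the click-to-submenu behaviour can stay as in the code above, swapping adapters based on which position was tapped.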

    Read the article

  • Rtsp Live Stream Android

    - by Filiz Gökçe
    I try to make live stream on android, I try lots of ways, but none of them doesnt work. Could you help me ? This is example of rtsp; mMediaPlayer = new MediaPlayer(); mMediaPlayer.setDataSource(KralStream.getTvStreamUrl().toString()); mMediaPlayer.setDisplay(holder); mMediaPlayer.prepareAsync(); mMediaPlayer.setOnBufferingUpdateListener(this); mMediaPlayer.setOnCompletionListener(this); mMediaPlayer.setOnPreparedListener(this); mMediaPlayer.setOnVideoSizeChangedListener(this); mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); mMediaPlayer.setLooping(true); Exeption :05-26 10:22:46.186: ERROR/MediaPlayerService(10157): create PVPlayer 05-26 10:23:06.382: ERROR/PlayerDriver(10157): Command PLAYER_INIT completed with an error or info -1 05-26 10:23:06.382: ERROR/MediaPlayer(23800): error (1, -1) 05-26 10:23:06.382: ERROR/MediaPlayer(23800): Error (1,-1) rtsp; VideoView videoView=(VideoView)findViewById(R.id.videoView1); Uri uri = Uri.parse("rtsp://strm-3.tr.medianova.tv/rkraltv/rkraltv"); videoView.setVideoURI(uri); videoView.start(); Gives message;"Sorry, this video connot ve played." Exeptions;05-26 10:40:08.979: ERROR/MediaPlayerService(10157): create PVPlayer 05-26 10:40:09.188: INFO/ActivityManager(10163): Displayed activity com.giantrabbit.nagare/.KralTvNow: 433 ms (total 433 ms) 05-26 10:40:11.702: WARN/PowerManagerService(10163): Timer 0x3-0x3|0x1 05-26 10:40:29.061: WARN/MediaPlayer(24284): info/warning (1, 26) 05-26 10:40:29.061: INFO/MediaPlayer(24284): Info (1,26) 05-26 10:40:29.100: ERROR/PlayerDriver(10157): Command PLAYER_INIT completed with an error or info -1 05-26 10:40:29.104: ERROR/MediaPlayer(24284): error (1, -1) 05-26 10:40:29.108: ERROR/MediaPlayer(24284): Error (1,-1) rtsp; mPreview = (SurfaceView) findViewById(R.id.surface); holder = mPreview.getHolder(); holder.addCallback(this); holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); extras = getIntent().getExtras(); public void play() { try { Uri video = KralStream.getTvStreamUrl(); Toast.makeText(this, video.toString(), Toast.LENGTH_SHORT).show(); mMediaPlayer = new MediaPlayer(); mMediaPlayer.setDataSource(path); mMediaPlayer.setDisplay(holder); mMediaPlayer.prepare(); mMediaPlayer.setOnBufferingUpdateListener(this); mMediaPlayer.setOnCompletionListener(this); mMediaPlayer.setOnPreparedListener(this); mMediaPlayer.setOnVideoSizeChangedListener(this); mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); } catch (Exception e) { Log.e(TAG, "error: " + e.getMessage(), e); } } Exeption ; 05-26 10:36:57.589: ERROR/MediaPlayerService(10157): create PVPlayer 05-26 10:37:20.542: ERROR/PlayerDriver(10157): Command PLAYER_INIT completed with an error or info -1 05-26 10:37:20.542: ERROR/MediaPlayer(24240): error (1, -1) 05-26 10:37:20.565: WARN/PlayerDriver(10157): PVMFInfoErrorHandlingComplete 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): error: Prepare failed.: status=0x1 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): java.io.IOException: Prepare failed.: status=0x1 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.media.MediaPlayer.prepare(Native Method) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at com.giantrabbit.nagare.KralTvNow.play(KralTvNow.java:162) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at com.giantrabbit.nagare.KralTvNow.surfaceCreated(KralTvNow.java:215) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.SurfaceView.updateWindow(SurfaceView.java:536) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at 
android.view.SurfaceView.dispatchDraw(SurfaceView.java:339) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.ViewGroup.drawChild(ViewGroup.java:1638) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.ViewGroup.dispatchDraw(ViewGroup.java:1367) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.ViewGroup.drawChild(ViewGroup.java:1638) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.ViewGroup.dispatchDraw(ViewGroup.java:1367) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.View.draw(View.java:6796) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.widget.FrameLayout.draw(FrameLayout.java:352) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.ViewGroup.drawChild(ViewGroup.java:1640) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.ViewGroup.dispatchDraw(ViewGroup.java:1367) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.View.draw(View.java:6796) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.widget.FrameLayout.draw(FrameLayout.java:352) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at com.android.internal.policy.impl.PhoneWindow$DecorView.draw(PhoneWindow.java:1894) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.ViewRoot.draw(ViewRoot.java:1407) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.ViewRoot.performTraversals(ViewRoot.java:1163) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.view.ViewRoot.handleMessage(ViewRoot.java:1727) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.os.Handler.dispatchMessage(Handler.java:99) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.os.Looper.loop(Looper.java:123) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at android.app.ActivityThread.main(ActivityThread.java:4627) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at java.lang.reflect.Method.invokeNative(Native Method) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at java.lang.reflect.Method.invoke(Method.java:521) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:871) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:629) 05-26 10:37:20.682: ERROR/MediaPlayerDemo(24240): at dalvik.system.NativeStart.main(Native Method) 05-26 10:37:20.737: INFO/MediaPlayer(24240): Info (1,26) 05-26 10:37:20.737: ERROR/MediaPlayer(24240): Error (1,-1) 05-26 10:37:20.868: INFO/ActivityManager(10163): Displayed activity com.giantrabbit.nagare/.KralTvNow: 25864 ms (total 25864 ms) 05-26 10:37:23.777: WARN/PowerManagerService(10163): Timer 0x3-0x3|0x1 This is example of http ; mMediaPlayer = new MediaPlayer(); mMediaPlayer.setDataSource("http://ikral.garantisistem.com:1935/ikral/smil:kral.smil/playlist.m3u8"); mMediaPlayer.setDisplay(holder); mMediaPlayer.prepareAsync(); mMediaPlayer.setOnBufferingUpdateListener(this); mMediaPlayer.setOnCompletionListener(this); mMediaPlayer.setOnPreparedListener(this); mMediaPlayer.setOnVideoSizeChangedListener(this); mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); mMediaPlayer.setLooping(true); Exeption: 05-26 10:16:24.276: ERROR/MediaPlayerService(10157): create PVPlayer 05-26 10:16:24.292: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferWriteDataStreamImpl 05-26 10:16:24.346: INFO/PlayerDriver(10157): buffering (100) 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of 
PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.346: ERROR/(10157): IIIIIII Inside Constructor of PVMFMemoryBufferReadDataStreamImpl 05-26 10:16:24.350: WARN/MediaPlayer(23736): info/warning (1, 26) 05-26 10:16:24.354: ERROR/PlayerDriver(10157): Command PLAYER_INIT completed with an error or info -10 05-26 10:16:24.354: ERROR/MediaPlayer(23736): error (-10, -10) 05-26 10:16:24.354: WARN/PlayerDriver(10157): PVMFInfoErrorHandlingComplete 05-26 10:16:24.393: INFO/MediaPlayer(23736): Info (1,26) 05-26 10:16:24.393: ERROR/MediaPlayer(23736): Error (-10,-10) Htttp; VideoView videoView=(VideoView)findViewById(R.id.videoView1); Uri uri = Uri.parse("http://ikral.garantisistem.com:1935/ikral/smil:kral.smil/playlist.m3u8"); videoView.setVideoURI(uri); videoView.start(); Gives message;"Sorry, this video connot ve played." Filiz Gökçe

    Read the article

  • Android beginner: Touch events in android gridview

    - by jja
    I am using the following code to do things with gridview(slightly modified from http://developer.android.com/resources/tutorials/views/hello-gridview.html). I want to replace the onClicklistener and the onClick() method with their "touch" equivalents i.e. touchlistener and onTouch() so that when i touch an element in the gridview the image of the element changes and a double touch on the same element takes it back to the orginal state. How do I do this? I can't get my code to do this. The clicklistener works to some extent but the touch isn't. Please help. public class ImageAdapter extends BaseAdapter { private Context mContext; public ImageAdapter(Context c) { mContext = c; } public int getCount() { return mThumbIds.length; } public Object getItem(int position) { return null; } public long getItemId(int position) { return 0; } // create a new ImageView for each item referenced by the Adapter public View getView(int position, View convertView, ViewGroup parent) { ImageView imageView; if (convertView == null) { // if it's not recycled, initialize some attributes imageView = new ImageView(mContext); imageView.setLayoutParams(new GridView.LayoutParams(85, 85)); imageView.setScaleType(ImageView.ScaleType.CENTER_CROP); imageView.setPadding(8, 8, 8, 8); imageView.setOnClickListener(new View.OnClickListener() { @Override public void onClick(View view) { if(position==0) { //do this } else { //do this } } }); } else { imageView = (ImageView) convertView; } imageView.setImageResource(mThumbIds[position]); return imageView; } // references to our images private Integer[] mThumbIds = { R.drawable.sample_2, R.drawable.sample_3, R.drawable.sample_4, R.drawable.sample_5, R.drawable.sample_6, R.drawable.sample_7, R.drawable.sample_0, R.drawable.sample_1, R.drawable.sample_2, R.drawable.sample_3, R.drawable.sample_4, R.drawable.sample_5, R.drawable.sample_6, R.drawable.sample_7, R.drawable.sample_0, R.drawable.sample_1, R.drawable.sample_2, R.drawable.sample_3, R.drawable.sample_4, R.drawable.sample_5, R.drawable.sample_6, R.drawable.sample_7 }; }
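    A minimal sketch of the touch equivalent, using a GestureDetector so each cell can tell a single touch from a double touch (the "selected" drawable name is an illustrative assumption; sample_0 is reused from the question's thumbnail array):

        import android.content.Context;
        import android.view.GestureDetector;
        import android.view.MotionEvent;
        import android.view.View;
        import android.widget.ImageView;

        public class ToggleTouchListener implements View.OnTouchListener {
            private final GestureDetector detector;

            public ToggleTouchListener(Context context, final ImageView target) {
                detector = new GestureDetector(context,
                        new GestureDetector.SimpleOnGestureListener() {
                    @Override
                    public boolean onSingleTapConfirmed(MotionEvent e) {
                        target.setImageResource(R.drawable.sample_selected); // hypothetical drawable
                        return true;
                    }

                    @Override
                    public boolean onDoubleTap(MotionEvent e) {
                        target.setImageResource(R.drawable.sample_0);        // back to the original
                        return true;
                    }
                });
            }

            @Override
            public boolean onTouch(View v, MotionEvent event) {
                return detector.onTouchEvent(event); // hand every touch to the detector
            }
        }

    In getView(), imageView.setOnTouchListener(new ToggleTouchListener(mContext, imageView)) would replace the existing setOnClickListener call; the class assumes it lives in the same package as the app's R class.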

    Read the article
