Search Results

Search found 15838 results on 634 pages for 'android layout'.


  • Office design and layout for agile development

    - by Adam Eberbach
    (moved from stackoverflow) I have found a lot of discussions here about which keyboard, desk, light or colored background is best - but I can't find one addressing the layout of the whole office. We are a company with about 20 employees moving to a new, larger place. There are two main development practices going on here that regularly combine: the back end people often need to work with the mobile people to arrange web services. There are about twice as many back end people as mobile people. About half of the back end developers are working on-site at any time, and while they are almost never all in the office at once, at least 5-10 spaces need to be provided - so most of the time the two groups are about equal. We have the chance to arrange desks, partitions and possibly even walls to make the space good. There won't be cash for dot-com frills like catering or massages, but now's the time to be planning to avoid ending up with a bunch of desks in a long line. Joel on Software's Bionic Office is an article I've remembered from way back and it has some good ideas, but I* (and more importantly the company's owners) are not completely sold on the privacy idea in an environment where we are supposed to be collaborating. This is another great link - The Ultimate Software Development Office Layout - I hadn't even remembered enclosed meeting rooms until reading this. Does the private office stand in the way of agile development? Is the scrum enough forced contact, and if you need to bug someone should you have to get up and knock on their door? What design layouts can you point to, and why would you recommend them? *I'm not against closed offices at all but would be happy if some other solution can do just as well. If it can't... well, that's what this question is all about.

    Read the article

  • How to use Android's CacheManager?

    - by punnie
    I'm currently developing an Android application that fetches images using HTTP requests. It would be quite swell if I could cache those images in order to improve performance and bandwidth use. I came across the CacheManager class in the Android reference, but I don't really know how to use it, or what it really does. I already looked through this example, but I need some help understanding it: /core/java/android/webkit/gears/ApacheHttpRequestAndroid.java Also, the reference states: "Network requests are provided to this component and if they can not be resolved by the cache, the HTTP headers are attached, as appropriate, to the request for revalidation of content." I'm not sure what this means or how it would work for me, since CacheManager's getCacheFile accepts only a String URL and a Map containing the headers. Not sure what the attachment mentioned means. An explanation or a simple code example would really make my day. Thanks! Update: here's what I have right now. I am clearly doing it wrong, I just don't know where.

      public static Bitmap getRemoteImage(String imageUrl) {
          URL aURL = null;
          URLConnection conn = null;
          Bitmap bmp = null;
          CacheResult cache_result = CacheManager.getCacheFile(imageUrl, new HashMap());
          if (cache_result == null) {
              try {
                  aURL = new URL(imageUrl);
                  conn = aURL.openConnection();
                  conn.connect();
                  InputStream is = conn.getInputStream();
                  cache_result = new CacheManager.CacheResult();
                  copyStream(is, cache_result.getOutputStream());
                  CacheManager.saveCacheFile(imageUrl, cache_result);
              } catch (Exception e) {
                  return null;
              }
          }
          bmp = BitmapFactory.decodeStream(cache_result.getInputStream());
          return bmp;
      }
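
    If the WebView-centric CacheManager turns out to be the wrong tool, a plain file cache in the app's own cache directory may be all that's needed. The sketch below is only an illustration of that alternative - the helper name diskCachedImage and the use of the URL's hash code as a file name are assumptions of mine, not anything from the question - and it must be called off the UI thread:

      import android.content.Context;
      import android.graphics.Bitmap;
      import android.graphics.BitmapFactory;
      import java.io.*;
      import java.net.URL;

      public class ImageDiskCache {
          // Returns a cached copy if present, otherwise downloads and stores it first.
          public static Bitmap diskCachedImage(Context context, String imageUrl) {
              File cacheFile = new File(context.getCacheDir(),
                      String.valueOf(imageUrl.hashCode()));
              if (!cacheFile.exists()) {
                  try {
                      InputStream in = new URL(imageUrl).openConnection().getInputStream();
                      OutputStream out = new FileOutputStream(cacheFile);
                      byte[] buffer = new byte[8192];
                      int read;
                      while ((read = in.read(buffer)) != -1) {
                          out.write(buffer, 0, read);      // copy the download to disk
                      }
                      out.close();
                      in.close();
                  } catch (IOException e) {
                      return null;                         // mirror the original's failure path
                  }
              }
              return BitmapFactory.decodeFile(cacheFile.getAbsolutePath());
          }
      }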

    Read the article

  • Scrolling a Canvas smoothly in Android

    - by prepbgg
    I'm new to Android. I am drawing bitmaps, lines and shapes onto a Canvas inside the onDraw(Canvas canvas) method of my view. I am looking for help on how to implement smooth scrolling in response to a drag by the user. I have searched but not found any tutorials to help me with this. The reference for Canvas seems to say that if a Canvas is constructed from a Bitmap (called bmpBuffer, say) then anything drawn on the Canvas is also drawn on bmpBuffer. Would it be possible to use bmpBuffer to implement a scroll ... perhaps copy it back to the Canvas shifted by a few pixels at a time? But if I use Canvas.drawBitmap to draw bmpBuffer back to the Canvas shifted by a few pixels, won't bmpBuffer be corrupted? Perhaps, therefore, I should copy bmpBuffer to bmpBuffer2 and then draw bmpBuffer2 back to the Canvas. A more straightforward approach would be to draw the lines, shapes, etc. straight into a buffer Bitmap and then draw that buffer (with a shift) onto the Canvas, but as far as I can see the various methods - drawLine(), drawShape() and so on - are not available for drawing to a Bitmap ... only to a Canvas. Could I have two Canvases? One would be constructed from the buffer bitmap and used simply for plotting the lines, shapes, etc., and then the buffer bitmap would be drawn onto the other Canvas for display in the View. I should welcome any advice! Answers to similar questions here (and on other websites) refer to "blitting". I understand the concept but can't find anything about "blit" or "bitblt" in the Android documentation. Are Canvas.drawBitmap() and Bitmap.copy() Android's equivalents?
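
    For what it's worth, the two-Canvas idea the poster describes is exactly how off-screen buffering is usually done: a second Canvas wrapped around the buffer Bitmap receives the drawLine()/drawCircle() calls, and onDraw() blits the buffer with an offset. A rough sketch of such a custom View follows - the field names (bufferBitmap, scrollX, scrollY), the fixed white background and the sample line are illustrative assumptions, not anything from the question:

      import android.content.Context;
      import android.graphics.*;
      import android.util.AttributeSet;
      import android.view.View;

      public class ScrollingDrawingView extends View {
          private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
          private Bitmap bufferBitmap;
          private Canvas bufferCanvas;
          private float scrollX, scrollY;     // updated from onTouchEvent() while dragging

          public ScrollingDrawingView(Context context, AttributeSet attrs) {
              super(context, attrs);
          }

          @Override
          protected void onSizeChanged(int w, int h, int oldw, int oldh) {
              super.onSizeChanged(w, h, oldw, oldh);
              // Off-screen buffer; anything drawn through bufferCanvas lands in bufferBitmap.
              bufferBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
              bufferCanvas = new Canvas(bufferBitmap);
          }

          @Override
          protected void onDraw(Canvas canvas) {
              bufferCanvas.drawColor(Color.WHITE);            // redraw the scene off-screen
              bufferCanvas.drawLine(0, 0, 200, 200, paint);   // lines, circles, bitmaps...
              canvas.drawBitmap(bufferBitmap, -scrollX, -scrollY, null);  // blit with offset
          }
      }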

    Read the article

  • Making a Text-To-Speech Wrapper in Android

    - by John Montgomery
    I am attempting to create a wrapper class for Google Android's Text-To-Speech functionality. However, I'm having trouble finding a way to have the system pause until after the onInit function has finished. Attached at the bottom is something of a solution I created based on what I found here: http://stackoverflow.com/questions/1160876/android-speech-how-can-you-read-text-in-android However, this solution does not seem to work. Any thoughts on why this might not be working, or what would be a good idea in order to make sure that any Speak() calls happen after my onInit() call?

      public class SpeechSynth implements OnInitListener {
          private TextToSpeech tts;
          static final int TTS_CHECK_CODE = 0;
          private int ready = 0;
          private ReentrantLock waitForInitLock = new ReentrantLock();

          SpeechSynth( Activity screen ) {
              ready = 0;
              tts = new TextToSpeech( screen, this );
              waitForInitLock.lock();
          }

          public void onInit(int status) {
              if (status == TextToSpeech.SUCCESS) {
                  ready = 1;
              }
              waitForInitLock.unlock();
          }

          public int Speak( String text ) {
              if( ready == 1 ) {
                  tts.speak(text, TextToSpeech.QUEUE_ADD, null);
                  return 1;
              } else {
                  return 0;
              }
          }
      }

    I have been able to make it so that I can pass a string of text through the constructor, then have it played in the onInit() function. However, I would really like to avoid having to destroy and re-create the whole text-to-speech engine every time I need to have my program say something different.
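
    One way around the "wait for onInit()" problem, rather than blocking the constructor with a lock, is simply to queue any text that arrives before initialisation and flush it when onInit() fires. A sketch of that idea (class and method names are mine, not from the question):

      import java.util.ArrayList;
      import java.util.List;
      import android.content.Context;
      import android.speech.tts.TextToSpeech;

      public class QueuedSpeech implements TextToSpeech.OnInitListener {
          private final TextToSpeech tts;
          private final List<String> pending = new ArrayList<String>();
          private boolean ready = false;

          public QueuedSpeech(Context context) {
              tts = new TextToSpeech(context, this);
          }

          @Override
          public synchronized void onInit(int status) {
              if (status == TextToSpeech.SUCCESS) {
                  ready = true;
                  for (String text : pending) {           // flush everything queued before init
                      tts.speak(text, TextToSpeech.QUEUE_ADD, null);
                  }
                  pending.clear();
              }
          }

          public synchronized void speak(String text) {
              if (ready) {
                  tts.speak(text, TextToSpeech.QUEUE_ADD, null);
              } else {
                  pending.add(text);                      // spoken as soon as onInit() arrives
              }
          }
      }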

    Read the article

  • Storing data on SD Card in Android

    - by BBoom
    Using the guide at Android Developers (http://developer.android.com/guide/topics/data/data-storage.html) I've tried to store some data to the SD card. This is my code:

      // Path to write files to
      String path = Environment.getExternalStorageDirectory().getAbsolutePath()
              + "/Android/data/" + ctxt.getString(R.string.package_name) + "/files/";
      String fname = "mytest.txt";

      // Current state of the external media
      String extState = Environment.getExternalStorageState();

      // External media can be written onto
      if (extState.equals(Environment.MEDIA_MOUNTED)) {
          try {
              // Make sure the path exists
              boolean exists = (new File(path)).exists();
              if (!exists) {
                  new File(path).mkdirs();
              }
              // Open output stream
              FileOutputStream fOut = new FileOutputStream(path + fname);
              fOut.write("Test".getBytes());
              // Close output stream
              fOut.flush();
              fOut.close();
          } catch (IOException ioe) {
              ioe.printStackTrace();
          }
      }

    When I create the new FileOutputStream I get a FileNotFoundException. I have also noticed that mkdirs() does not seem to create the directory. Can anyone tell me what I'm doing wrong? I'm testing on an AVD with a 2GB SD card and "hw.sdCard: yes"; the File Explorer of DDMS in Eclipse tells me that the only directory on the SD card is "LOST.DIR".
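
    A guess, since the manifest isn't shown: mkdirs() returning false plus a FileNotFoundException on external storage are the classic symptoms of a missing android.permission.WRITE_EXTERNAL_STORAGE entry in AndroidManifest.xml. With that in place, a fragment that actually checks the mkdirs() result might look like this (the log tag and the early return are illustrative):

      File dir = new File(Environment.getExternalStorageDirectory(),
              "Android/data/" + ctxt.getPackageName() + "/files");
      if (!dir.exists() && !dir.mkdirs()) {
          // Still failing here usually means the WRITE_EXTERNAL_STORAGE
          // permission is missing (an assumption - check the manifest).
          Log.e("Storage", "Could not create " + dir);
          return;
      }
      try {
          FileOutputStream fOut = new FileOutputStream(new File(dir, "mytest.txt"));
          fOut.write("Test".getBytes());
          fOut.close();
      } catch (IOException ioe) {
          ioe.printStackTrace();
      }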

    Read the article

  • android compile error: could not reserve enough space for object heap

    - by moonlightcheese
    I'm getting this error during compilation: "Error occurred during initialization of VM. Could not create the Java virtual machine. Could not reserve enough space for object heap." What's worse, the error occurs intermittently. Sometimes it happens, sometimes it doesn't. It seems to be dependent on the amount of code in the application. If I get rid of some variables or drop some imported libraries, it will compile. Then when I add more to it, I get the error again. I've included the following sources into the application in the [project_root]/src/ directory: org.apache.httpclient (I've stripped all references to log4j from the sources, so I don't need it), org.apache.codec (as a dependency), org.apache.httpcore (a dependency of httpclient) and my own activity code consisting of nothing more than an instance of HttpClient. I know this has something to do with the amount of memory necessary at compile time or some compiler options, and I'm not really stressing my system while I'm coding. I've got 2GB of memory on this Core Duo laptop and Windows reports only 860MB page file usage (I haven't used any other memory tools). I should have plenty of memory and processing power for this... and I'm only compiling some common HTTP libs... a total of 406 source files. What gives? Edit (4/30/2010 18:24): I just compiled some code where I got the above stated error. I closed some web browser windows and recompiled the same exact code with no edits, and it compiled with no issue. This is definitely a compiler issue related to memory usage. Any help would be great... because I have no idea where to go from here. Android API level: 5. Android SDK: rel 5. JDK version: 1.6.0_12. Sorry I had to repost this question, because regardless of whether I use the native HttpClient class in the Android SDK or my custom version downloaded from Apache, the error still occurs.
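
    The message itself comes from a forked JVM (javac or the dx step) that cannot grab the heap it asks for, which fits the observation that closing browser windows makes it go away. One commonly suggested thing to try - offered only as an illustration, since the exact values are guesses and depend on the machine - is to pin the heap settings of the Eclipse/build JVM in eclipse.ini:

      -vmargs
      -Xms128m
      -Xmx512m
      -XX:MaxPermSize=256m

    On a 32-bit JVM a smaller -Xmx can paradoxically help, because the VM has to reserve the whole maximum heap as one contiguous address range at start-up.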

    Read the article

  • Android: Can not send http post

    - by jpartogi
    Hi all, I've been banging my head trying to figure out how to send a POST request in Android. This is how my code looks:

      public class HomeActivity extends Activity implements OnClickListener {
          private TextView textView;

          @Override
          public void onCreate(Bundle savedInstanceState) {
              super.onCreate(savedInstanceState);
              setContentView(R.layout.main);
              textView = (TextView) findViewById(R.id.text);
              Button button = (Button) findViewById(R.id.button);
              button.setOnClickListener(this);
          }

          @Override
          public void onClick(View view) {
              HttpPost httpMethod = new HttpPost("http://www.example.com/");
              httpMethod.addHeader("Accept", "text/html");
              httpMethod.addHeader("Content-Type", "application/xml");
              AndroidHttpClient client = AndroidHttpClient.newInstance("Android");
              String result = null;
              try {
                  HttpResponse response = client.execute(httpMethod);
                  textView.setText(response.toString());
                  HttpEntity entity = response.getEntity();
                  Log.i(HomeActivity.class.toString(), result);
                  textView.setText("Invoked webservice");
              } catch (IOException e) {
                  e.printStackTrace();
                  Log.e(HomeActivity.class.toString(), e.getMessage());
                  textView.setText("Something wrong:" + e.getMessage());
              }
          }
      }

    What am I doing wrong here? Is there anything that I may need to configure in the Android emulator to get this working? Thank you for your help.
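
    A few things jump out, offered as guesses rather than a definitive diagnosis: the app needs android.permission.INTERNET in its manifest before the emulator will let it connect; result is still null when it is passed to Log.i(), which will itself blow up; and the response entity is never actually read. A sketch of the inside of onClick() with those points addressed (the tag string and logging choices are illustrative, and it needs import org.apache.http.util.EntityUtils):

      AndroidHttpClient client = AndroidHttpClient.newInstance("Android");
      try {
          HttpPost post = new HttpPost("http://www.example.com/");
          post.addHeader("Accept", "text/html");
          HttpResponse response = client.execute(post);
          // Read the body instead of logging a still-null variable.
          String body = EntityUtils.toString(response.getEntity());
          Log.i("HomeActivity", "Status " + response.getStatusLine() + ": " + body);
          textView.setText("Invoked webservice");
      } catch (IOException e) {
          Log.e("HomeActivity", "POST failed", e);
          textView.setText("Something wrong: " + e.getMessage());
      } finally {
          client.close();   // AndroidHttpClient must be closed explicitly
      }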

    Read the article

  • Android Scaled Drawing to ImageView

    - by user329999
    Newbie question, so there's probably a simple answer to this problem. I'm drawing some simple shapes using canvas.drawCircle(), canvas.drawLine() etc. I originally copied the code from: http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/graphics/DrawPoints.html which extends a View and draws directly to a canvas. It doesn't load a pre-drawn bitmap, because I need my application to turn data into a drawing and the user will enter the data. My changes work, but the drawing is too small (or big) and doesn't fill the screen using all the available space. Ideally I'd rather use something like an ImageView declared in XML, if that's possible. The documentation seems to imply that I want to set the scaleType on that ImageView, which seems like the simple way to do this. If using an ImageView in XML is a good idea, then I'm lost on how to draw to the ImageView and could use some guidance on doing that task. If that won't work, then I'll need to do some more thinking about how to get my drawing scaled on the screen; basically I'm lazy and would rather have Android do the work for me. Feel free to suggest some other way that's completely different if this is the wrong solution path. :) Thanks.
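
    One way to get the "let Android scale it" behaviour the poster is after: render the shapes into an off-screen Bitmap through a Canvas, hand that Bitmap to an ImageView, and let android:scaleType (fitCenter, for example) do the fitting. The snippet below is a sketch along those lines - the 400x400 logical size and the R.id.drawing_view id are invented for the example:

      // Draw at a fixed "logical" size; the ImageView scales it to the screen.
      Bitmap drawing = Bitmap.createBitmap(400, 400, Bitmap.Config.ARGB_8888);
      Canvas canvas = new Canvas(drawing);
      Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
      paint.setColor(Color.RED);
      canvas.drawCircle(200, 200, 150, paint);
      canvas.drawLine(0, 0, 400, 400, paint);

      ImageView imageView = (ImageView) findViewById(R.id.drawing_view);
      imageView.setImageBitmap(drawing);   // scaleType="fitCenter" set on the view in XML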

    Read the article

  • Android: Playing a bigger audio wav sound file produces a crash

    - by user187532
    Hi Android experts, I am trying to play a bigger audio wav file (about 20 MB) using the following code (AudioTrack) on my Android 1.6 HTC device, which basically has little memory. The device crashes as soon as it executes the reading, writing and playing, but the same code works fine and plays smaller audio wav files (10 KB, 20 KB files etc.) very well. P.S.: I need to play a PCM (.wav) buffer, which is the reason I use AudioTrack here. Given that my device has little memory, how would I read a bigger audio file byte by byte and play the sound, to avoid crashing due to memory constraints?

      private void AudioTrackPlayPCM() throws IOException {
          String filePath = "/sdcard/myWav.wav"; // 8 kb file
          byte[] byteData = null;
          File file = null;
          file = new File(filePath);
          byteData = new byte[(int) file.length()];
          FileInputStream in = null;
          try {
              in = new FileInputStream( file );
              in.read( byteData );
              in.close();
          } catch (FileNotFoundException e) {
              // TODO Auto-generated catch block
              e.printStackTrace();
          }
          int intSize = android.media.AudioTrack.getMinBufferSize(8000,
                  AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_8BIT);
          AudioTrack at = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
                  AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_8BIT,
                  intSize, AudioTrack.MODE_STREAM);
          at.play();
          at.write(byteData, 0, byteData.length);
          at.stop();
          at.release();
      }

    Could someone guide me on how to adapt the AudioTrack code for bigger wav files?
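
    Since the track is already created in MODE_STREAM, the whole file never needs to be in memory at once: reading it in buffer-sized chunks and writing each chunk to the AudioTrack keeps the footprint at one buffer. A sketch of that change (it still ignores the 44-byte WAV header, exactly as the original code does):

      private void playWavStreaming(String filePath) throws IOException {
          int bufferSize = AudioTrack.getMinBufferSize(8000,
                  AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_8BIT);
          AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
                  AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_8BIT,
                  bufferSize, AudioTrack.MODE_STREAM);
          track.play();

          FileInputStream in = new FileInputStream(filePath);
          byte[] chunk = new byte[bufferSize];
          int read;
          // write() blocks until the hardware buffer has room, so memory use stays
          // at one chunk no matter how large the file is.
          while ((read = in.read(chunk)) != -1) {
              track.write(chunk, 0, read);
          }
          in.close();
          track.stop();
          track.release();
      }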

    Read the article

  • android odbc connection

    - by Vijay Kumar
    I want to make a database connection from my Android application. In my program I am using Oracle Database 11g and my table name is sample. After I run the program, close the emulator and open the database, the values have not been stored. Please suggest a solution, or any changes to my program or connection string.

      package com.odbc;

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.PreparedStatement;
      import android.app.Activity;
      import android.os.Bundle;

      public class OdbcActivity extends Activity {
          /** Called when the activity is first created. */
          @Override
          public void onCreate(Bundle savedInstanceState) {
              super.onCreate(savedInstanceState);
              setContentView(R.layout.main);
              String first = "vijay";
              String last = "kumar";
              try {
                  DriverManager.registerDriver(new oracle.jdbc.driver.OracleDriver());
                  Connection con = DriverManager.getConnection("jdbc:oracle:thin:@localshot:1521:XE", "system", "vijay");
                  PreparedStatement pst = con.prepareStatement("insert into sample(first,last) values(?,?");
                  pst.setString(1, first);
                  pst.setString(2, last);
                  pst.executeUpdate();
              } catch(Exception e) {
                  System.out.println("Exception:" + e);
              }
          }
      }
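
    Two things in the snippet look like plain typos and are worth ruling out before anything else (corrected lines below): the host "localshot", which inside the emulator would in any case have to be 10.0.2.2 to reach the development machine rather than the device itself, and the missing closing parenthesis in the INSERT statement. A further caveat: the Oracle thin JDBC driver is not guaranteed to run on Dalvik at all, and the usual recommendation is to talk to the database through a small web service instead.

      // "localshot" -> the development machine, which the emulator sees as 10.0.2.2
      Connection con = DriverManager.getConnection(
              "jdbc:oracle:thin:@10.0.2.2:1521:XE", "system", "vijay");

      // The original SQL was missing the closing parenthesis after the placeholders.
      PreparedStatement pst =
              con.prepareStatement("insert into sample(first,last) values(?,?)");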

    Read the article

  • Android Multiple Handlers Design Question

    - by Soumya Simanta
    This question is related to an existing question I asked; I thought I'd ask a new question instead of replying to the other one, because I cannot comment on my previous question due to a word limit. Marc wrote there, quoting my statement that I have more than one Handler in an Activity: "Why? If you do not want a complicated handleMessage() method, then use post() (on Handler or View) to break the logic up into individual Runnables. Multiple Handlers makes me nervous." I'm new to Android. My question is: is having multiple handlers in a single activity a bad design? Here is a sketch of my current implementation. I have a MapActivity that creates a data thread (a UDP socket that listens for data). My first handler is responsible for sending data from the data thread to the activity. On the map I have a bunch of "dynamic" markers that are refreshed frequently. Some of these markers are video markers, i.e. if the user clicks a video marker, I add a view that extends android.opengl.GLSurfaceView to my map activity and display video on this new view. I use my second handler to send information about the marker that the user tapped, from ItemizedOverlay's onTap(int index) method. The user can close the video view by tapping on it; I use my third handler for this. I would appreciate it if people could tell me what's wrong with this approach and suggest better ways to implement it. Thanks.
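
    For what Marc's suggestion might look like in practice, here is a sketch that collapses the three handlers into one that switches on msg.what. The constants and helper methods such as updateMarkers() are invented for the illustration; post()-ing Runnables to a single Handler would be an equally valid reading of his advice:

      private static final int MSG_UDP_DATA   = 1;  // packet parsed by the data thread
      private static final int MSG_SHOW_VIDEO = 2;  // marker index carried in msg.arg1
      private static final int MSG_HIDE_VIDEO = 3;

      private final Handler uiHandler = new Handler() {
          @Override
          public void handleMessage(Message msg) {
              switch (msg.what) {
                  case MSG_UDP_DATA:   updateMarkers(msg.obj);  break;
                  case MSG_SHOW_VIDEO: showVideoView(msg.arg1); break;
                  case MSG_HIDE_VIDEO: hideVideoView();         break;
              }
          }
      };

      // From the data thread or the overlay's onTap():
      // uiHandler.obtainMessage(MSG_UDP_DATA, parsedPacket).sendToTarget();
      // uiHandler.obtainMessage(MSG_SHOW_VIDEO, index, 0).sendToTarget();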

    Read the article

  • Design problem with callback functions in android

    - by Franz Xaver
    Hi folks! I'm currently developing an app for Android that accesses wifi values; that is, the application needs to scan for all access points and their specific signal strengths. I know that I have to extend the class BroadcastReceiver, overriding the method BroadcastReceiver.onReceive(Context context, Intent intent), which is called when the values are ready. Perhaps there exist solutions provided by the Android system itself, but I'm relatively new to Android, so I could use some help. The problem I encountered is that I have one class (an activity, thus controlled by the user) that needs these scan results for two different things (first, to save the values in a database, or second, to use them for further calculations - but not both at the same moment!). So how do I design the callback system in order to "transport" the scan results from onReceive(Context context, Intent intent) to the operation intended by the user? My first solution was to define enums for each use case (save, or use for calculations) which the wlan-interested classes have to submit when querying for the values. But that would force the BroadcastReceiver-extending class to save the current enum and use it as a parameter in the callback function of the querying class (this querying class needs to know what it asked for when it gets called back). That seems kind of dirty to me ;) So, does anyone have a good idea for this?
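
    One alternative to the enum is to hand the receiver the action itself: the caller passes a small callback object, the receiver invokes it with the scan results, and nobody has to remember "what was asked for". A rough sketch, meant to live inside the Activity; the interface name and the one-shot registration are my own choices rather than anything from the question:

      public interface ScanCallback {
          void onScanResults(List<ScanResult> results);
      }

      // Give the receiver the thing to do with the next batch of results.
      private void requestScan(final ScanCallback callback) {
          final WifiManager wifi = (WifiManager) getSystemService(Context.WIFI_SERVICE);
          registerReceiver(new BroadcastReceiver() {
              @Override
              public void onReceive(Context context, Intent intent) {
                  unregisterReceiver(this);                      // one-shot receiver
                  callback.onScanResults(wifi.getScanResults());
              }
          }, new IntentFilter(WifiManager.SCAN_RESULTS_AVAILABLE_ACTION));
          wifi.startScan();
      }

      // Saving and calculating then become two different anonymous ScanCallbacks.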

    Read the article

  • Android: JUnit running parameterized test cases

    - by puneetinderkaur
    Hi, I want to run a single test case with different parameters. In plain Java this is possible through JUnit 4, with code like this:

      package com.android.test;

      import static org.junit.Assert.assertEquals;
      import java.util.Arrays;
      import java.util.Collection;
      import org.junit.Test;
      import org.junit.runner.RunWith;
      import org.junit.runners.Parameterized;
      import org.junit.runners.Parameterized.Parameters;

      @RunWith(Parameterized.class)
      public class ParameterizedTestExample {
          private int number;
          private int number2;
          private int result;
          private functioncall coun;

          // constructor takes parameters from the array
          public ParameterizedTestExample(int number, int number2, int result1) {
              this.number = number;
              this.number2 = number2;
              this.result = result1;
              coun = new functioncall();
          }

          // array with parameters
          @Parameters
          public static Collection<Object[]> data() {
              Object[][] data = new Object[][] { { 1, 4, 5 }, { 2, 7, 9 }, { 3, 7, 10 }, { 4, 9, 13 } };
              return Arrays.asList(data);
          }

          // running our test
          @Test
          public void testCounter() {
              assertEquals("Result add x y", result, coun.calculateSum(this.number, this.number2), 0);
          }
      }

    Now my question is: can we run the same with Android test cases, where we have to deal with a Context? When I try to run the code using the Android JUnit runner it gives me a NoClassDefFoundError.
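
    If I remember correctly, the Eclipse-era Android instrumentation runner is JUnit 3 based, so @RunWith(Parameterized.class) is simply never honoured, which would explain the failure. A low-tech workaround is to loop over the same parameter table inside an AndroidTestCase, where getContext() is available if the class under test needs one. A sketch (class name is mine, functioncall reused from the question):

      import android.test.AndroidTestCase;

      public class SumTest extends AndroidTestCase {

          // Same table as the JUnit 4 version: {a, b, expected}
          private static final int[][] DATA = { {1, 4, 5}, {2, 7, 9}, {3, 7, 10}, {4, 9, 13} };

          public void testCounterWithAllParameters() {
              functioncall coun = new functioncall();
              for (int[] row : DATA) {
                  // getContext() is available here if functioncall ever needs a Context
                  assertEquals("Result add " + row[0] + " " + row[1],
                          row[2], coun.calculateSum(row[0], row[1]));
              }
          }
      }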

    Read the article

  • Android: changing drawable states of option menu items seems to have side-effects

    - by pjv
    In my onCreateOptionsMenu() I have basically the following:

      public boolean onCreateOptionsMenu(Menu menu) {
          menu.add(Menu.NONE, MENU_ITEM_INSERT, Menu.NONE, R.string.item_menu_insert)
              .setShortcut('3', 'a')
              .setIcon(android.R.drawable.ic_menu_add);
          PackageManager pm = getPackageManager();
          if (pm.hasSystemFeature(PackageManager.FEATURE_CAMERA)
                  && pm.hasSystemFeature(PackageManager.FEATURE_CAMERA_AUTOFOCUS)) {
              menu.add(Menu.NONE, MENU_ITEM_SCAN_ADD, Menu.NONE,
                      ((Collectionista.DEBUG) ? "DEBUG Scan and add item"
                              : getString(R.string.item_menu_scan_add)))
                  .setShortcut('4', 'a')
                  .setIcon(android.R.drawable.ic_menu_add);
          }
          ...
      }

    And in onPrepareOptionsMenu, among others, the following:

      final boolean scanAvailable = ScanIntent.isInstalled(this);
      final MusicCDItemScanAddTask task = new MusicCDItemScanAddTask(this);
      menu.findItem(MENU_ITEM_SCAN_ADD).setEnabled(scanAvailable && (tasks == null || !existsTask(task)));

    As you can see, two option items are given the same drawable (android.R.drawable.ic_menu_add). Now, if in onPrepareOptionsMenu the second menu item gets disabled, its label and icon become gray, but the icon of the first menu item also becomes gray, while the label of that first menu item stays black and it remains clickable. What is causing this crosstalk between the two icons/drawables? Shouldn't the system handle things like mutate() in this case? I've included a screenshot.
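
    The usual explanation is that drawables loaded twice from the same resource id share one constant state, so the disabled alpha applied to one icon shows through on the other; and as far as I know the framework does not call mutate() for you here. A hedged sketch of doing it yourself at menu creation time (trimmed to the relevant lines):

      @Override
      public boolean onCreateOptionsMenu(Menu menu) {
          menu.add(Menu.NONE, MENU_ITEM_INSERT, Menu.NONE, R.string.item_menu_insert)
              .setIcon(android.R.drawable.ic_menu_add);
          menu.add(Menu.NONE, MENU_ITEM_SCAN_ADD, Menu.NONE, R.string.item_menu_scan_add)
              .setIcon(android.R.drawable.ic_menu_add);

          // Detach this item's copy of ic_menu_add from the shared constant state,
          // so disabling it later no longer grays out the other item's icon.
          menu.findItem(MENU_ITEM_SCAN_ADD).getIcon().mutate();
          return true;
      }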

    Read the article

  • How to stop Android GPS using "Mobile data"

    - by prepbgg
    My app requests location updates with "minTime" set to 2 seconds. When "Mobile data" is switched on (in the phone's settings) and GPS is enabled, the app uses mobile data at between 5 and 10 megabytes per hour. This is recorded in the ICS "Data usage" screen as usage by "Android OS". In an attempt to prevent this I have unticked Settings - "Location services" - "Google's location service". Does this refer to assisted GPS, or is it something more than that? Whatever it is, it seems to make no difference to my app's internet access. As further confirmation that it is the GPS usage by my app that is causing the mobile data access, I have observed that the internet data activity indicator on the status bar shows activity when, and only when, the GPS indicator is present. The only way to prevent this mobile data usage seems to be to switch "Mobile data" off, and GPS accuracy seems to be almost as good without the support of mobile data. However, it is obviously unsatisfactory to have to switch mobile data off. The only permissions in the manifest are android.permission.ACCESS_FINE_LOCATION (and android.permission.WRITE_EXTERNAL_STORAGE), so the app has no explicit permission to use internet data. The LocationManager code is:

      criteria.setAccuracy(Criteria.ACCURACY_FINE);
      criteria.setSpeedRequired(false);
      criteria.setAltitudeRequired(false);
      criteria.setBearingRequired(true);
      criteria.setCostAllowed(false);
      criteria.setPowerRequirement(Criteria.NO_REQUIREMENT);
      bestProvider = lm.getBestProvider(criteria, true);
      if (bestProvider != null) {
          lm.requestLocationUpdates(bestProvider, gpsMinTime, gpsMinDistance, this);
      }

    The reference for LocationManager.getBestProvider says: "If no provider meets the criteria, the criteria are loosened ... Note that the requirement on monetary cost is not removed in this process." However, despite setting setCostAllowed to false, the app still incurs a potential monetary cost. What else can I do to prevent the app from using mobile data?
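
    Two thoughts, both offered tentatively. First, data attributed to "Android OS" while GPS is active is quite possibly the system's own assisted-GPS downloads, which no app-level permission or Criteria setting controls. Second, if the intent is "GPS only, never the network provider", the Criteria/getBestProvider round trip can be skipped and the provider named explicitly, so nothing gets loosened behind the app's back:

      LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
      if (lm.isProviderEnabled(LocationManager.GPS_PROVIDER)) {
          // Request the GPS provider by name rather than via Criteria.
          lm.requestLocationUpdates(LocationManager.GPS_PROVIDER,
                  gpsMinTime, gpsMinDistance, this);
      }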

    Read the article

  • Using XPath on String in Android (JAVA)

    - by Rav
    I am looking for some examples of using XPath in Android, or if anyone can share their experiences. I have been struggling to make head or tail of this problem :-( I have a string that contains a standard XML file. I believe I need to convert that into an XML document. I have found this code, which I think will do the trick:

      public static Document stringToDom(String xmlSource) throws SAXException, ParserConfigurationException, IOException {
          DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
          DocumentBuilder builder = factory.newDocumentBuilder();
          return builder.parse(new InputSource(new StringReader(xmlSource)));
      }

    Next steps: assuming the code above is OK, I need to apply XPath to get values from cat: "/animal/mammal/feline/cat". I looked at the dev doc here: http://developer.android.com/reference/javax/xml/xpath/XPath.html and also looked online, but I am not sure where to start! I have tried to use the following code:

      XPathFactory xPathFactory = XPathFactory.newInstance(); // To get an instance of the XPathFactory object itself.
      XPath xPath = xPathFactory.newXPath(); // Create an instance of XPath from the factory class.
      String expression = "SomeXPathExpression";
      XPathExpression xPathExpression = xPath.compile(expression); // Compile the expression to get a XPathExpression object.
      Object result = xPathExpression.evaluate(xmlDocument); // Evaluate the expression against the XML Document to get the result.

    But I get "Cannot be resolved". Eclipse doesn't seem to be able to fix this import. I tried manually entering javax.xml.xpath.XPath, but this did not work. Does anyone know any good source code that I can utilise for the Android platform? (Android 1.5)
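
    A possible explanation for "cannot be resolved": if memory serves, javax.xml.xpath only shipped with Android from API level 8 (2.2), so on a 1.5 project the import genuinely does not exist and either a newer API level or a bundled XPath library is needed. On a platform that does have it, a complete end-to-end sketch looks like this (the class name and the bare Exception handling are illustrative):

      import java.io.StringReader;
      import javax.xml.parsers.DocumentBuilderFactory;
      import javax.xml.xpath.XPath;
      import javax.xml.xpath.XPathConstants;
      import javax.xml.xpath.XPathFactory;
      import org.w3c.dom.Document;
      import org.xml.sax.InputSource;

      public class CatFinder {
          public static String findCat(String xml) throws Exception {
              Document doc = DocumentBuilderFactory.newInstance()
                      .newDocumentBuilder()
                      .parse(new InputSource(new StringReader(xml)));
              XPath xpath = XPathFactory.newInstance().newXPath();
              // Returns the text content of the first node matching the expression.
              return (String) xpath.evaluate("/animal/mammal/feline/cat",
                      doc, XPathConstants.STRING);
          }
      }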

    Read the article

  • Android, phone call audio stream via wlan

    - by moppel
    I am planning on developing my own specific VoIP app for Android. Here's the scenario: when a phone call occurs I want to hear the caller on my local PC speakers, and I want to speak to them via my own PC microphone / headset. So I need to send the audio streams of both me and the person I am talking to over the WLAN. Something like this:

      ...
      onCallStateChanged(int state, String phoneNumber) {
          while (state == PhoneListener.CALL_STATE_OFFHOOK) { // while the phone call is happening
              // send incoming speech via wlan to pc
              // receive audio stream from pc microphone and direct it to the phone call
          }
      }
      ...

    Is this possible with the current Android API? (Actually it should be, since VoIP apps are available in the market.) I did some research in the Android API and all I found was AudioManager, which has a constant named

      public static final int STREAM_VOICE_CALL; // The audio stream for phone calls

    But I don't know how to use it, or how it should give me access to the actual audio streams that I could send over the network. How do I manage to do this? The connection would be realised by TCP sockets.

    Read the article

  • Getting auth token for dropbox account from accountmanager in android

    - by user1490880
    I am trying to get an auth token for a Dropbox account configured on the device, from the AccountManager. I am using accountManager.getAuthToken(account, "DROPBOX", null, Hello.this, new GetAuthTokenCallback(), null); where "account" is the Dropbox account. I see an Allow/Deny page and click Allow, but the callback is not invoked at all and I don't get the auth token. I did get an auth token for a Google account this way (with a different authTokenType). What am I missing? I am not sure about the authTokenType parameter for Dropbox. Also, are there any other parameters specific to Dropbox, like the bundle parameter, that I am missing? Is this approach possible for Dropbox at all? Check below for the function parameters:

      public AccountManagerFuture<Bundle> getAuthToken (Account account, String authTokenType,
              Bundle options, Activity activity, AccountManagerCallback<Bundle> callback, Handler handler)

    Link: http://developer.android.com/reference/android/accounts/AccountManager.html UPDATE: I assume that since we are able to create a Dropbox account in Android's Accounts and Sync settings, there must be a Dropbox authenticator that implements all the functions in AbstractAccountAuthenticator, including getAuthToken(). So Dropbox should support handing out an auth token, I think. Also, Dropbox uses OAuth 1, whereas AccountManager uses OAuth 2.0 - so is this an issue? Can anyone comment on this?

    Read the article

  • Eclipse Juno won't create Android Activity

    - by MikkoP
    I downloaded Eclipse Juno Java EE edition and installed ADT plugin. I created a new Android Application Project from Eclipse and in the wizard I created an activity called TaskariActivity. After I pressed finish, it created the project but not the activity and the wizard didn't close. I pressed cancel. No activity or anything in the src folder. I created a new activity by right clicking on src, selecting new -> other -> Android -> activity. I selected BlankActivity (as earlier when I was creating the project), selected the earlier created project in the Project combo box. I set the Activity name as TaskariActivity, layout name as activity_main and title as TaskariActivity. Next I selected navigation type and I set it to Tabs + Swipe (I thought this would make me tabs and everything I had to do would be just inserting the elements and the actions for them). I pressed next, nothing happened. Finish didn't also do anything. I pressed cancel and an activity wasn't created. So, how can I create an Android application like in the earlier version of Eclipse? It automatically created the activity and it was ready be run out of the box. Now it won't generate any files and the new activity wizard doesn't work. Help?

    Read the article

  • Using the hardware keyboard to simulate button press on Android

    - by Bevor
    Hello, it is difficult to test a game by using the mouse pointer on the Android buttons. I would like to control those buttons with the hardware keyboard. Actually I don't want to control the buttons themselves; I want to trigger the behaviour the buttons would also trigger. For example, I have 4 buttons in the Android application with "arrow up, down, left, right". I'd like to use the arrow keys of my hardware keyboard to control the same behaviour. How can I do that? Actually the question is: where can I set the listener? I tried something in my activity. I set this listener on the application button:

      button.setOnKeyListener(new OnKeyListener() {
          @Override
          public boolean onKey(View v, int keyCode, KeyEvent event) {
              if (keyCode == KeyEvent.KEYCODE_DPAD_DOWN) {
                  // scroll down
                  return true;
              }
              return false;
          }
      });

    The behaviour is the following: I can't scroll down with my hardware keyboard, but with the hardware keyboard I can select the Android buttons (they are highlighted when I move onto any button). After I have selected the button with the listener I can't select any other button any more, but then the listener comes into force: now I can scroll down with the hardware keyboard's arrow-down key. I would like to achieve this behaviour without selecting any button. So I thought about setting the listener on the layout container or some other layout, but this has no effect. Is there any other approach to achieve this?
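
    One approach that avoids focus altogether is to handle the keys at the Activity level, for example by overriding onKeyDown() (or dispatchKeyEvent() if the keys must never reach the buttons). A sketch - the moveUp()/moveDown()/... helpers are placeholders standing in for whatever the on-screen buttons already trigger:

      @Override
      public boolean onKeyDown(int keyCode, KeyEvent event) {
          // Reached whenever no focused view consumes the key first.
          switch (keyCode) {
              case KeyEvent.KEYCODE_DPAD_UP:    moveUp();    return true;
              case KeyEvent.KEYCODE_DPAD_DOWN:  moveDown();  return true;
              case KeyEvent.KEYCODE_DPAD_LEFT:  moveLeft();  return true;
              case KeyEvent.KEYCODE_DPAD_RIGHT: moveRight(); return true;
              default:
                  return super.onKeyDown(keyCode, event);
          }
      }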

    Read the article

  • Developing Air (Flex) Applications for Android and Desktop

    - by Roaders
    I am an experienced Flex and Air Developer and love Android having owned a G1, a milestone (Droid), a Nexus One, a Galaxy S and now a Nexus S. Understandably I am interested in developing Flex applications for Android. I have just started working through the flex for android in 90 mins tutorial here: http://coenraets.org/flexandroid90/FlexAndroid90Minutes.pdf The very first step says that I have to create a Flex Mobile Project. I was under the impression that the whole point of Air is that the same application could run on many different platforms. I was intending on creating an air app with different skins that could be swapped in and out depending on the platform it was running on. This seems to imply that I will have to compile my Air app once for desktop and once for mobile. This isn't the end of the world but it's not quite how I expected it to work. I suppose that if I am creating mobile specific skins then I may as well create a mobile specific app. Is it possible to create one Air app that will run on both mobile and desktop? Is this a good idea?

    Read the article

  • Android NDK import-module / code reuse

    - by Graeme
    Morning! I've created a small NDK project which allows dynamic serialisation of objects between Java and C++ through JNI. The logic works like this: Bean - JavaCInterface.java - JavaCInterface.cpp - JavaCInterface.java - Bean. The problem is I want to use this functionality in other projects. I separated out the test code from the project and created a "Tester" project. The tester project sends a Java object through to C++, which then echoes it back to the Java layer. I thought linking would be pretty simple ("simple" in terms of NDK/JNI usually means a day of frustration). I added the JNIBridge project as a source project and added the following lines to the makefiles:

      NDK_MODULE_PATH=.../JNIBridge/jni/

      JNIBridge/jni/JavaCInterface/Android.mk:
      ...
      include $(BUILD_STATIC_LIBRARY)

      JNITester/jni/Android.mk:
      ...
      include $(BUILD_SHARED_LIBRARY)
      $(call import-module, JavaCInterface)

    This all works fine. The C++ files which rely on headers from the JavaCInterface module work fine. Also the Java classes can happily use interfaces from the JNIBridge project. All the linking is happy. Unfortunately JavaCInterface.java, which contains the native method declarations, cannot see the JNI methods located in the static library. (Logically they are in the same project, but both are imported into the project where you wish to use them through the above mechanism.) My current solutions are as follows, and I'm hoping someone can suggest something that will preserve the modular nature of what I'm trying to achieve. I could include the JavaCInterface cpp files in the calling project like so:

      LOCAL_SRC_FILES := FunctionTable.cpp $(PATH_TO_SHARED_PROJECT)/JavaCInterface.cpp

    But I'd rather not do this, as it would mean updating each depending project if I changed the JavaCInterface architecture. Alternatively I could create a new set of JNI method signatures in each local project which then link to the imported modules. Again, this binds the implementations too tightly.
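
    A third possibility worth trying, offered as a guess rather than a verified fix: when a static library is linked into the final shared library, the linker may drop the JNI entry points because nothing inside the shared library references them by name. ndk-build's LOCAL_WHOLE_STATIC_LIBRARIES pulls in every object from the static module, which keeps those symbols around. A sketch of the tester's Android.mk with that change (the module names are assumptions based on the description):

      # JNITester/jni/Android.mk - sketch only
      include $(CLEAR_VARS)
      LOCAL_MODULE := jnitester
      LOCAL_SRC_FILES := FunctionTable.cpp
      # Link the whole static module so the Java_* JNI functions that nothing in
      # this library calls directly are not stripped by the linker.
      LOCAL_WHOLE_STATIC_LIBRARIES := JavaCInterface
      include $(BUILD_SHARED_LIBRARY)

      $(call import-module, JavaCInterface)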

    Read the article

  • Android Maven and Refresh Problem

    - by antonio Musella
    Hi all, I've got a strange problem with Maven and Android. I have 3 Android Maven projects and 2 normal Java Maven projects, divided up in this manner. Normal projects: a model project, packaged as a jar, containing Java POJO beans and interfaces; a DAO project, packaged as a jar, containing the DB logic and depending on the model project. Android Maven projects: a ContentProvider project, packaged as an apk, containing only ContentProviders and depending on the DAO project; an Editors project, packaged as an apk, containing only editors and depending on the DAO project; a MainApp project, packaged as an apk, containing my app and depending on the DAO. The problem is that if I modify the DAO project, then do a Maven clean and Maven install of all apk projects, then Run As - Android Application within Eclipse, I don't see the updated app on my emulator. Oddly, if I shut down my Ubuntu workstation and restart it, I can see the updated app on the emulator. Do you know a solution for this issue? Thanks and regards.

    Read the article

  • Android: Defined array resource not found ?!

    - by David
    Hi, I have a strange (?) error in my Android application. I have defined some arrays in values/arrays.xml in the following way:

      <?xml version="1.0" encoding="utf-8"?>
      <resources>
          <array name="perimeter">
              <item>10 miles</item>
              <item>20 miles</item>
              <item>30 miles</item>
          </array>
          <array name="regvalues">
              <item>1</item>
              <item>2</item>
              <item>3</item>
          </array>
      </resources>

    Now I want to use them in a ListPreference for a PreferenceActivity (defined by an XML file), so I set android:entries="@array/perimeter" and android:entryValues="@array/regvalues" on this ListPreference. When I try to use this on my device the app crashes (NullPointerException in Dialog.close()). If I try to use the regvalues items as the entries of the ListPreference, I get a NullPointerException in ArrayAdapter.createViewFromResource(int, View, ViewGroup, int), line 355. So there seems to be something wrong with the regvalues array. But what? Eclipse shows no errors at compile time, so everything in the XML file and Java code is written correctly and there are no typos. Where is the problem?
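
    A hedged guess at the cause: ListPreference expects both entries and entryValues to resolve to string arrays, and items in an untyped <array> that look like plain numbers may not be compiled as strings. Declaring both arrays explicitly as <string-array> is the usual remedy, roughly like this:

      <?xml version="1.0" encoding="utf-8"?>
      <resources>
          <string-array name="perimeter">
              <item>10 miles</item>
              <item>20 miles</item>
              <item>30 miles</item>
          </string-array>
          <!-- ListPreference stores the chosen value as a string in
               SharedPreferences, so keep these as strings and parse in code. -->
          <string-array name="regvalues">
              <item>1</item>
              <item>2</item>
              <item>3</item>
          </string-array>
      </resources>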

    Read the article
