Search Results

Search found 13243 results on 530 pages for 'android camera'.


  • Digital Blue Digital Movie Creator 3.0 driver

    - by user27977
    I'm having a complete 'mare of a time trying to use my school's Digital Blue cameras. We've got the model 3 ones, but I can't find the driver disc, and the Windows Hardware Installation Wizard gets me nowhere! Can you help me find the driver? When I used these cameras at my old school they came with a piece of software called Digital Movie Creator, which I've heard you can use to make stop-motion films, which is exactly what I want to do! This is what it looks like: http://www.amazon.co.uk/Digital-Movie-Creator-1GB-Card/dp/B000LP30LA/ref=sr_1_2?ie=UTF8&s=software&qid=1265928833&sr=1-2

    Read the article

  • Software to clean up photos of whiteboards and documents?

    - by Norman Ramsey
    I take a lot of photos of whiteboards, blackboards, and so on for teaching purposes (examples online through May 2010). I'm interested in cleaning them up for archival purposes, preferably using Linux. Commercial products ClearBoard and PhotoNote are priced a little aggressively for my purposes, plus my students would like to have this capability too. Does anyone know of any good, open source software for Converting photographs to images with just a few colors? Eliminating perspective distortion? Removing unwanted junk from around the edges of an image? or anything like that? I'm imagining that I start out with a picture of my whiteboard using red and black markers, and I end up with a three-color image using just white, red, and black. Or I photograph a laser-printed document and end up with a clean black-and-white image. I have tried standard tools that reduce the number of colors in an image, and they do a terrible job—probably because they are trying to reproduce the uneven illumination of the original image. Command-line Linux tools would be ideal.
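
    One way to see why generic colour-reduction tools struggle here is that the illumination varies across the board, so no single global threshold separates ink from background. A minimal, hypothetical Java sketch of local (adaptive) thresholding, which compares each pixel against the mean of its neighbourhood instead of a global cutoff, is shown below; the file names, window size, and bias are illustrative assumptions, not part of the original question.

        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.ImageIO;

        // Hypothetical sketch: adaptive thresholding of a whiteboard photo.
        // Each pixel is compared to the mean brightness of a local window,
        // which compensates for uneven illumination across the board.
        public class WhiteboardClean {
            public static void main(String[] args) throws Exception {
                BufferedImage in = ImageIO.read(new File("whiteboard.jpg")); // assumed input
                int w = in.getWidth(), h = in.getHeight();
                int radius = 15;    // window half-size; tune for image resolution
                double bias = 0.92; // pixel must be darker than 92% of the local mean to count as ink

                // Grayscale copy
                double[][] gray = new double[h][w];
                for (int y = 0; y < h; y++) {
                    for (int x = 0; x < w; x++) {
                        int rgb = in.getRGB(x, y);
                        int r = (rgb >> 16) & 0xff, g = (rgb >> 8) & 0xff, b = rgb & 0xff;
                        gray[y][x] = 0.299 * r + 0.587 * g + 0.114 * b;
                    }
                }

                BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
                for (int y = 0; y < h; y++) {
                    for (int x = 0; x < w; x++) {
                        // Local mean over the window (slow but simple; an integral image would be faster)
                        double sum = 0;
                        int n = 0;
                        for (int dy = -radius; dy <= radius; dy++) {
                            for (int dx = -radius; dx <= radius; dx++) {
                                int yy = y + dy, xx = x + dx;
                                if (yy >= 0 && yy < h && xx >= 0 && xx < w) { sum += gray[yy][xx]; n++; }
                            }
                        }
                        boolean ink = gray[y][x] < (sum / n) * bias;
                        out.setRGB(x, y, ink ? 0x000000 : 0xffffff);
                    }
                }
                ImageIO.write(out, "png", new File("whiteboard-clean.png")); // assumed output
            }
        }

    The same idea (threshold against a blurred copy of the image) is what lets a red/black whiteboard photo collapse to a few flat colours without the background gradient surviving; perspective correction and edge cropping would still be separate steps.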

    Read the article

  • ICS as guest in Ubuntu 12.04 host

    - by oshirowanen
    I have installed Android as a guest OS via VirtualBox, following this guide: http://www.android-x86.org/documents/virtualboxhowto and using the following ISO: android-x86-4.0-RC1-eeepc.iso. But I am unable to connect to the internet from within the Android virtual machine. The host OS is Ubuntu 12.04, where the internet works fine; I have internet access via a USB wireless connection to the home router. If I install Ubuntu 12.04 as a guest on the same Ubuntu 12.04 host, the guest OS's internet works fine out of the box. But for some reason I can't get the Android guest's internet to work out of the box. Anyone know what I am doing wrong?

    Read the article

  • How can I connect integrated webcam with virtualbox

    - by Mike Stumpf
    I am trying to use a Windows XP VM for VirtualBox on my Windows 8.1 laptop. I have tried the usual attaching of the USB device, but I get an error saying "USB device is busy with previous request". My webcam is not active in any applications, and this happens after a clean reboot of the host, the guest, and VirtualBox. Here are the details:

    Host:
    - HP Pavilion 17 Notebook PC (stock)
    - Windows 8.1
    - AMD A10-5750M APU
    - HP Truevision HD (integrated webcam)

    VM:
    - I got the VM here: http://www.modern.ie/en-us/virtualization-tools

    VirtualBox:
    - VirtualBox 4.3.12 installed
    - VirtualBox Extension Pack installed
    - Guest Additions are installed for 4.3.12
    - Enable USB Controller is checked
    - It does not matter if Enable USB 2.0 Controller is checked or not
    - It does not matter if a USB device filter is set up for the webcam or not

    Here is the error message:

        Failed to attach the USB device DDFEQ01G45BFBV HP Truevision HD [0004] to the virtual machine IE8 - WinXP.
        USB device 'DDFEQ01G45BFBV HP Truevision HD' with UUID {7a2e2a45-974d-482b-9b4e-9f9abbcd0ebb} is busy with a previous request. Please try again later.
        Result Code: E_INVALIDARG (0x80070057)
        Component: HostUSBDevice
        Interface: IHostUSBDevice {173b4b44-d268-4334-a00d-b6521c9a740a}
        Callee: IConsole {8ab7c520-2442-4b66-8d74-4ff1e195d2b6}

    I read on some VirtualBox forums that disabling USB 2.0 support in the host BIOS solved their issue, but I wanted to know if there were any other ideas before I muck around in there. Thanks

    Read the article

  • rsync per-site configuration file?

    - by Scott
    I know how to configure a per-site entry for ssh, but is there any kind of client configuration for rsync that allows per-site options and aliases, or similar shortcuts to .ssh/config? I'm curious because I have a minimal ssh server installed on my Android phone, and I also have a minimal rsync tool on it. I'm getting tired of having to log in to the phone as root and symlink both tools to the standard places the Android OS looks for executables, since the ssh server is bare-bones and ships a typical multi-call binary for the basic Unix commands (one that does not include rsync). I end up having to include --rsync-path=/path/to/rsync/android/files/rsync every time I want to do any rsyncing of the files on my phone, even though this path is always the same. I've gotten around it in the meantime with a glob approach in a shell script wrapper, but this sometimes limits the customization I can do with the rsync call. I'm just wondering if there is anything similar to the .ssh/config file where I can create an alias for my phone (e.g. 'android'), so that specifying rsync android:/mnt/sdcard will automatically assume --rsync-path=/blah/blah/blah --no-g --no-p --no-t etc. Tre`
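
    For what it's worth, the host-alias half of this can already live in .ssh/config, since rsync runs over ssh; only the rsync-specific flags still have to be supplied on the command line or by a wrapper. A minimal sketch, with the address, port, and paths as made-up placeholders:

        # ~/.ssh/config (hypothetical values)
        Host android
            HostName 192.168.1.50   # phone's address on the LAN
            User root
            Port 2222               # whatever port the phone's sshd listens on

        # rsync resolves the alias through ssh, but the rsync-only options
        # still need to be passed explicitly (or via a wrapper/alias):
        rsync --rsync-path=/path/to/rsync/android/files/rsync --no-g --no-p --no-t \
            android:/mnt/sdcard/ ./sdcard-backup/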

    Read the article

  • How to control DVR connected cameras as IP Cameras

    - by Ajay
    We have analog IR cameras (not IP cameras) connected to a DVR (model: DVR H264, 16 Channel - ECOR264-16X1). We assigned a static IP to the DVR and are able to view all the cameras connected to it. Our requirement is to view individual cameras, and to record, from a remote location (the recorded data should be stored remotely), just as with IP cameras. Is this possible? If not, is there any DVR model which supports this?

    Read the article

  • USB connection is unstable with Nexus S 2.3.4 on AMD 64 running 64-bit Windows 7, but works with 32-bit Windows Vista

    - by Mike
    The USB connection is unstable with the Nexus S (Android 2.3.4) on an AMD 64 machine running 64-bit Windows 7, but it works with 32-bit Windows Vista.

    Problem description: On the 64-bit Windows 7 machine my Nexus S appears to connect, but then it disconnects moments later. Neither accessing USB storage nor loading an Android application package file (APK) using the Android Debug Bridge (ADB) works. On 32-bit Windows Vista, using the same USB cable, USB storage works. I haven't tried the ADB on 32-bit Windows Vista.

    Reproduction steps for USB storage (I have provided the reproduction steps for USB storage and not ADB, because if one isn't working, then the other isn't working either, and the USB storage reproduction steps are shorter to document):
    1. Connect the USB cable to the Nexus S and my Windows 7 machine. Effect: The "USB Mass Storage, USB Connected" dialog appears with the button "Turn on USB storage."
    2. Click "Turn on USB Storage". Effect: The "working circle" appears. A dialog briefly appears saying "USB storage in use," then it either returns me to step 1 (now that I am running 2.3.4) or is replaced with the Nexus S's application homepage (while I was running 2.3.3). I'm not sure if the version matters, but I mention it for completeness.

    On the 32-bit Windows Vista machine the connection is stable. I am able to navigate through the Nexus S file system and create, read, update, and delete files, etc. I haven't tried connecting with the ADB.

    Troubleshooting summary:
    Tried and failed:
    - Uninstalling and reinstalling the Android USB drivers, including removing the files.
    - Uninstalling my custom software.
    - Pulling the Nexus S's battery.
    - Restarting the Nexus S.
    - Restarting 64-bit Windows 7.
    - Changing USB ports on the 64-bit Windows 7 box.
    - Compared the dates and file sizes of the DLLs in my google-usb_driver\amd64 directory and the windows\System32 directory. They match. The sizes for the google-usb_driver\i386 directory do not match (expected).
    - Turning off debugging mode on the Nexus S does not resolve the problem.
    - Searching Google.
    Tried and succeeded:
    - Connecting to another machine (Windows Vista) using the same USB cable and Nexus S phone.

    Troubleshooting observations: I notice that uninstalling the device drivers and deleting the files, then reinstalling the drivers, then rebooting 64-bit Windows 7, then unplugging the Nexus S and plugging it back in occasionally helps for a short amount of time (minutes to hours, not days). When it is working, I can both access the Nexus S's drive and load/test applications using the ADB. I have observed some wonky behavior in the Device Manager that I haven't tracked down. Sometimes the black Nexus S image appears in the list of devices. Sometimes the image displays as a computer with a green ISA card. Sometimes it neither appears at the top level of devices nor under "other devices," but it does appear under "disk drives" as "Android UMS Composite USB Device."

    System configuration: The Nexus S is running Android OS 2.3.4; "Settings > About phone > System updates" indicates that it is up to date as of May 21st 2011. Both 32-bit Windows Vista and 64-bit Windows 7 are up to date. The Windows Vista system is running on an Intel 32-bit processor; Windows 7 is running on an AMD 64-bit processor. I have done Android development on both systems, but I usually develop on the 64-bit Windows 7 machine.

    Read the article

  • The project estimates the installation of external and internal surveillance. [closed]

    - by Zhasulan Berdybekov
    The project is an estimate for installing external and internal surveillance. Here are our sites, listed as (1) number of cameras, (2) the site, (3) distance from the site to the Situation Centre:
    - 11 - New Alphabet - 1.5 km
    - 11 - New Alphabet - 1 km
    - 19 - New Alphabet - 800 m
    - 19 - New Alphabet - 1 km
    - 35 - The building - 200 m
    - 35 - The building - 100 m
    - 18 - The building - 100 m
    - 22 - Outdoor video surveillance - 50 m to 1 km
    Please tell us how many DVRs are needed, and whether they should be placed at each site or at the Situation Centre. How should the footage be brought back to the Situation Centre, and what cables are needed? Your advice and comments are welcome. Thank you for your efforts!

    Read the article

  • Uploading User Made Videos to iPhone (from Vista)

    - by Darren E.
    Once a user downloads a video created with the iPhone 3GS and then deletes it from the iPhone, that video cannot be uploaded back to the iPhone... according to Apple. Videos are not treated as photos and are not allowed to sync to and from the iPhone freely. Has anyone discovered a program or tweak that allows one to upload video back to the iPhone? Thanks.

    Read the article

  • Android media thumbnails. Serious issues?

    - by Ralphleon
    I've been playing with Android's thumbnails for a while now, and I've seen some inconsistencies that make me want to scream. My goal is to have a simple list of all images (and a separate list for video) with the thumbnail and filename. Device: HTC Evo (fresh from Google I/O).

    First off: http://androidsamples.blogspot.com/2009/06/how-to-display-thumbnails-of-images.html That code doesn't seem to work at all; thumbnails are duplicated... some with the "mirror" effect and some without. Also some won't load and just display a black square. I've tried rebuilding the thumbnails by deleting the "alblum thumbs" directory from the SD card. HTC's gallery application seems to show everything fine.

    This approach seems to work:

        Bitmap thumb = MediaStore.Images.Thumbnails.getThumbnail(
                getContentResolver(), id,
                MediaStore.Video.Thumbnails.MICRO_KIND, null);
        imageView.setImageBitmap(thumb);

    where id is the original image's id and imageView is some ImageView. This is great! But, strangely, way too slow to be used inside a SimpleViewBinder.

    Next approach:

        String[] proj = { MediaStore.Images.Thumbnails._ID };
        Cursor c = managedQuery(MediaStore.Images.Thumbnails.EXTERNAL_CONTENT_URI,
                proj, MediaStore.Images.Thumbnails.IMAGE_ID + "=" + id, null, null);
        if (c != null && c.moveToFirst()) {
            Uri thumb = Uri.withAppendedPath(mThumbUri, c.getLong(0) + "");
            imageView.setImageURI(thumb);
        }

    I should explain that I feel the WHERE condition is required because there doesn't seem to be any guarantee that your URI will have the same ID for both a thumbnail and its parent image. This works for all of the current images, but as soon as I start adding pictures with the camera they show up as blank! Debugging shows a dreaded SkImageDecoder::Factory returned null error, and the URI is returned as invalid. These are the same images that work with the previous call. Can anyone either catch my logical failure or point me to some working code?
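
    Since the getThumbnail() call above is described as too slow for the view binder, one common workaround is to move that same call off the UI thread and set the bitmap when it arrives. A rough sketch, assuming one AsyncTask per view and ignoring list-view recycling for brevity (class and variable names are made up):

        import android.content.ContentResolver;
        import android.graphics.Bitmap;
        import android.os.AsyncTask;
        import android.provider.MediaStore;
        import android.widget.ImageView;

        // Hypothetical helper: loads one thumbnail in the background and
        // assigns it to the ImageView once decoding has finished.
        class ThumbnailTask extends AsyncTask<Long, Void, Bitmap> {
            private final ImageView target;
            private final ContentResolver resolver;

            ThumbnailTask(ImageView target, ContentResolver resolver) {
                this.target = target;
                this.resolver = resolver;
            }

            @Override
            protected Bitmap doInBackground(Long... ids) {
                // Same call as in the question, just kept off the UI thread.
                return MediaStore.Images.Thumbnails.getThumbnail(
                        resolver, ids[0],
                        MediaStore.Images.Thumbnails.MICRO_KIND, null);
            }

            @Override
            protected void onPostExecute(Bitmap bitmap) {
                if (bitmap != null) {
                    target.setImageBitmap(bitmap);
                }
            }
        }

        // Usage from the adapter/binder:
        // new ThumbnailTask(imageView, getContentResolver()).execute(id);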

    Read the article

  • FolderClosed Exception in Javamail

    - by SikhWarrior
    I'm trying to create a simple mail client in Android, and I have the Android version of JavaMail compiling and running in my app. However, whenever I try to connect and receive mail, I get a FolderClosedException, seen below.

        10-23 12:12:13.484: W/System.err(6660): javax.mail.FolderClosedException
        10-23 12:12:13.484: W/System.err(6660): at com.sun.mail.imap.IMAPMessage.getProtocol(IMAPMessage.java:149)
        10-23 12:12:13.484: W/System.err(6660): at com.sun.mail.imap.IMAPMessage.loadBODYSTRUCTURE(IMAPMessage.java:1262)
        10-23 12:12:13.484: W/System.err(6660): at com.sun.mail.imap.IMAPMessage.getDataHandler(IMAPMessage.java:616)
        10-23 12:12:13.484: W/System.err(6660): at javax.mail.internet.MimeMessage.getContent(MimeMessage.java:1398)
        10-23 12:12:13.484: W/System.err(6660): at com.teamzeta.sfu.Util.MailHelper.getMessageHTML(MailHelper.java:60)
        10-23 12:12:13.484: W/System.err(6660): at com.teamzeta.sfu.GetAsyncEmails.onPostExecute(EmailActivity.java:31)
        10-23 12:12:13.484: W/System.err(6660): at com.teamzeta.sfu.GetAsyncEmails.onPostExecute(EmailActivity.java:1)
        10-23 12:12:13.484: W/System.err(6660): at android.os.AsyncTask.finish(AsyncTask.java:631)
        10-23 12:12:13.484: W/System.err(6660): at android.os.AsyncTask.access$600(AsyncTask.java:177)
        10-23 12:12:13.484: W/System.err(6660): at android.os.AsyncTask$InternalHandler.handleMessage(AsyncTask.java:644)
        10-23 12:12:13.484: W/System.err(6660): at android.os.Handler.dispatchMessage(Handler.java:99)
        10-23 12:12:13.484: W/System.err(6660): at android.os.Looper.loop(Looper.java:137)
        10-23 12:12:13.484: W/System.err(6660): at android.app.ActivityThread.main(ActivityThread.java:5227)
        10-23 12:12:13.484: W/System.err(6660): at java.lang.reflect.Method.invokeNative(Native Method)
        10-23 12:12:13.484: W/System.err(6660): at java.lang.reflect.Method.invoke(Method.java:511)
        10-23 12:12:13.484: W/System.err(6660): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:795)
        10-23 12:12:13.484: W/System.err(6660): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:562)
        10-23 12:12:13.494: W/System.err(6660): at dalvik.system.NativeStart.main(Native Method)

    My code is as follows:

        public static Message[] getAllMail(String user, String pwd) {
            String host = "imap.sfu.ca";
            final Message[] NO_MESSAGES = {};
            Properties properties = System.getProperties();
            properties.setProperty("mail.imap.socketFactory.class",
                    "javax.net.ssl.SSLSocketFactory");
            properties.setProperty("mail.imap.socketFactory.port", "993");
            Session session = Session.getDefaultInstance(properties);
            try {
                Store store = session.getStore("imap");
                store.connect(host, user, pwd);
                Folder folder = store.getFolder("inbox");
                folder.open(Folder.READ_ONLY);
                Message[] messages = folder.getMessages();
                folder.close(true);
                store.close();
                Log.d("####TEAM ZETA DEBUG####", "Content: " + messages.length);
                return messages;
            } catch (NoSuchProviderException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            } catch (MessagingException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
            Log.d("####TEAM ZETA DEBUG####", "Returning NO_MESSAGES");
            return NO_MESSAGES;
        }

        public static String getMessageHTML(Message message) {
            Object msgContent;
            try {
                msgContent = message.getContent();
                if (msgContent instanceof Multipart) {
                    Multipart mp = (Multipart) msgContent;
                    for (int i = 0; i < mp.getCount(); i++) {
                        BodyPart bp = mp.getBodyPart(i);
                        if (Pattern.compile(Pattern.quote("text/html"), Pattern.CASE_INSENSITIVE)
                                .matcher(bp.getContentType()).find()) {
                            // found html part
                            return (String) bp.getContent();
                        } else {
                            // some other bodypart...
                        }
                    }
                }
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            } catch (MessagingException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
            return "Something went wrong";
        }

    I couldn't find anything helpful on the web; does anyone have any ideas why this is happening? This is called in:

        class GetAsyncEmails extends AsyncTask<String, Integer, Message[]> {
            @Override
            protected Message[] doInBackground(String... args) {
                // TODO Auto-generated method stub
                Message[] messages = MailHelper.getAllMail(args[0], args[1]);
                return messages;
            }

            protected void onPostExecute(Message[] result) {
                if (result.length > 1) {
                    Message message = result[0];
                    String content = MailHelper.getMessageHTML(message);
                    System.out.println("####TEAM ZETA DEBUG####" + content);
                }
            }
        }
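
    For context on what the stack trace is pointing at: IMAP messages in JavaMail are fetched lazily, so calling getContent() in onPostExecute(), after getAllMail() has already run folder.close(), forces a fetch from a closed folder, which is what FolderClosedException signals. A minimal sketch of the alternative ordering, reading bodies while the folder is still open (names such as MailHelper and the host are taken from the question; the rest is illustrative, not a definitive fix):

        // Sketch: same flow as getAllMail(), but the message bodies are pulled
        // while the folder is still open, so nothing lazy is fetched after close().
        public static String[] getAllMailBodies(String user, String pwd) throws MessagingException {
            Properties props = System.getProperties();
            props.setProperty("mail.imap.socketFactory.class", "javax.net.ssl.SSLSocketFactory");
            props.setProperty("mail.imap.socketFactory.port", "993");
            Session session = Session.getDefaultInstance(props);

            Store store = session.getStore("imap");
            store.connect("imap.sfu.ca", user, pwd);
            Folder folder = store.getFolder("inbox");
            folder.open(Folder.READ_ONLY);
            try {
                Message[] messages = folder.getMessages();
                String[] bodies = new String[messages.length];
                for (int i = 0; i < messages.length; i++) {
                    bodies[i] = MailHelper.getMessageHTML(messages[i]); // content fetched here
                }
                return bodies;
            } finally {
                folder.close(true); // only closed after every body has been read
                store.close();
            }
        }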

    Read the article

  • Implementing fog of war in opengl es 2.0 game

    - by joxnas
    Hi game development community, this is my first question here! ;) I'm developing a real-time tactics/strategy Android game, and I've been wondering for some time what's the best way to implement an efficient and somewhat nice-looking fog of war for it. My experience with OpenGL and Android is not vast by any means, but I think it is sufficient for what I'm asking here. So far I have thought of some solutions:

    1. Draw white circles onto a dark background, corresponding to the units' visibility, render that to a texture, and then draw a quad with that texture with the blend mode set to multiply (see the sketch after this list). Will this approach be efficient? Will it take too much memory? (I don't know how to render to a texture and then use the texture. Is it too messy?)

    2. Have a grid object with a vertex shader which has an array of uniforms holding the coordinates of all units, and another array holding their visibility ranges. The number of units will very probably never be bigger than 100. The vertex shader needs to test, for each considered vertex, whether there is some unit which can see it. In order to do this, it will have to loop over the coordinate array and do some calculations based on distance. The efficiency of this is inversely proportional to the looks of it: a denser grid will result in a more beautiful fog of war... but will require a greater number of vertices to be checked. Is it possible to find a nice compromise, or is this a bad solution from the start?

    Which solution is the best? Are there better alternatives? Which ones? Thank you for your time.
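
    For the first approach, the multiply step itself is just a blend-state change when the fog texture is drawn over the scene. A minimal GLES20 sketch of that part only (the texture id, shader program and quad-drawing code are assumed to exist elsewhere):

        import android.opengl.GLES20;

        // Multiply blend: result = src * dst, so white areas of the fog texture
        // leave the scene untouched and dark areas darken it.
        public final class FogBlend {
            public static void drawFogTexture(int fogTextureId) {
                GLES20.glEnable(GLES20.GL_BLEND);
                GLES20.glBlendFunc(GLES20.GL_DST_COLOR, GLES20.GL_ZERO);

                GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, fogTextureId);

                // ... draw the full-screen quad with the usual textured shader here ...

                GLES20.glDisable(GLES20.GL_BLEND);
            }
        }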

    Read the article

  • Google play game services and Facebook integration in one game

    - by Ineentho
    We are creating a cross-platform game for iOS and Android, and we have thought about how, and with which services, we should integrate achievements and scoreboards. For the iOS part, we are pretty sure this is how we want to do it, in order from when the user opens the app for the first time:

    1. Connect with Game Center (this should be automatic; the user shouldn't even notice?). We will also get the player's nickname for public scoreboards here.
    2. Ask if the user wants to connect with Facebook so that we can compare the player's high scores with their friends'. We could add Google Play Game Services here as well, but I don't feel that adds anything to the experience for the end user.

    Now comes the tricky part: Android. We thought that we could do it just like on iOS, except that we replace Game Center with Google Play Game Services. However, unlike Game Center, Game Services will ask the user to log in to their Google+ account and allow us to access their account. So what we end up with is a double login, first with Google+ and then with Facebook. What will users think about that? Should we scrap Play Services entirely, just ask the user for a nickname within our app, and use Facebook for achievements?

    Read the article

  • Resolution Independence in libGDX

    - by ashes999
    How do I make my libGDX game resolution/density independent? Is there a way to specify image sizes as "absolute" regardless of the underlying density? I'm making a very simple kids game; just a bunch of sprites displayed on-screen, and some text for menus (options menu primarily). What I want to know is: how do I make my sprites/fonts resolution independent? (I have wrapped them in my own classes to make things easier.) Since it's a simple kids game, I don't need to worry about the "playable area" of the game; I want to use as much of the screen space as possible. What I'm doing right now, which seems super incorrect, is to simply create images suitable for large resolutions, and then scale down (or rarely, up) to fit the screen size. This seems to work okay (in the desktop version), even with linear mapping on my textures, but the smaller resolutions look ugly. Also, this seems to fly in the face of Android's "device independent pixels" (DPs). Or maybe I'm missing something and libGDX already takes care of this somehow? What's the best way to tackle this? I found this link; is this a good way of solving the problem?: http://www.dandeliongamestudio.com/2011/09/12/android-fragmentation-density-independent-pixel-dip/ It mentions how to control the images, but it doesn't mention how to specify font/image sizes regardless of density.
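
    One common way to get this in libGDX (independent of any wrapper classes) is to do all positioning and sizing in a fixed virtual resolution and let an OrthographicCamera map it to whatever the physical screen is. A rough sketch, with the virtual size as an arbitrary, assumed choice:

        import com.badlogic.gdx.graphics.OrthographicCamera;
        import com.badlogic.gdx.graphics.g2d.SpriteBatch;

        // Sketch: everything in the game is laid out in 800x480 "virtual" units,
        // regardless of the real screen size or density.
        public class VirtualResolution {
            public static final float VIRTUAL_WIDTH = 800f;  // assumed design size
            public static final float VIRTUAL_HEIGHT = 480f;

            private OrthographicCamera camera;
            private SpriteBatch batch;

            public void create() {
                camera = new OrthographicCamera();
                // y-up coordinate system spanning the virtual size
                camera.setToOrtho(false, VIRTUAL_WIDTH, VIRTUAL_HEIGHT);
                batch = new SpriteBatch();
            }

            public void render() {
                camera.update();
                batch.setProjectionMatrix(camera.combined);
                batch.begin();
                // draw sprites using virtual coordinates/sizes here,
                // e.g. batch.draw(texture, 100f, 50f, 128f, 128f);
                batch.end();
            }
        }

    Aspect ratio still has to be handled separately (letterboxing or adjusting the virtual width), and large downscales generally look better with linear-filtered or mipmapped textures than with point sampling.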

    Read the article

  • Should I be using Lua for game logic on mobile devices?

    - by Rob Ashton
    As above really, I'm writing an Android-based game in my spare time (Android because it's free and I've no real aspirations to do anything commercial). The game logic comes from a very typical component-based model whereby entities exist and have components attached to them, and messages are sent to and fro in order to make things happen. Obviously the layer for actually performing that is thin, and if I were to write an iPhone version of this app, I'd have to re-write the renderer and core driver (of this component-based system) in Objective-C. The entities are just flat files determining the names of the components to be added, and the components themselves are simple, single-purpose objects containing the logic for the entity. Now, if I write all the logic for those components in Java, then I'd have to re-write them in Objective-C if I decided to do an iPhone port. As the bulk of the application logic is contained within these components, they would, in an ideal world, be written in some platform-agnostic language/script/DSL which could then just be loaded into the app on whatever platform. I've been led to believe, however, that this is not an ideal world, that Lua performance etc. on mobile devices still isn't up to scratch, that the overhead is too much, and that I'd run into trouble later if I went down that route. Is this actually the case? Obviously this is just a hypothetical question. I'm happy writing them all in Java as it's simple and easy to get things off the ground, but say I actually enjoy making this game (unlikely, given how much I'm currently disliking having to deal with all those different mobile devices) and I wanted to make a commercially viable game - would I use Lua, or would I just take the hit when it came to porting and re-write all the code?
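
    To make the porting question concrete, the layer being discussed is the component bodies and the messages between them. A hypothetical Java sketch of that shape (all names invented, not taken from the project):

        import java.util.ArrayList;
        import java.util.List;

        // Hypothetical sketch of the component model described above:
        // an entity is a bag of components, and behaviour is driven by
        // messages dispatched to those components.
        interface Component {
            void receive(String message, Object payload);
        }

        class Entity {
            private final List<Component> components = new ArrayList<Component>();

            void attach(Component c) {
                components.add(c);
            }

            // Broadcast a message to every component on this entity.
            void send(String message, Object payload) {
                for (Component c : components) {
                    c.receive(message, payload);
                }
            }
        }

        // Example component: reacts to a "move" message by updating a position.
        class PositionComponent implements Component {
            float x, y;

            public void receive(String message, Object payload) {
                if ("move".equals(message) && payload instanceof float[]) {
                    float[] delta = (float[]) payload;
                    x += delta[0];
                    y += delta[1];
                }
            }
        }

    It is this layer that would either be duplicated in Objective-C for an iPhone port or pushed down into scripts loaded by both versions.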

    Read the article

  • DriveSafe.ly Reads Your Text Messages Aloud

    - by ETC
    DriveSafe.ly, a free application for Android and BlackBerry phones, reads your text messages, emails, and caller ID notifications aloud so you can stay connected while keeping your eyes on the road. DriveSafe.ly is a feature-packed application that reads your text messages, your emails, and your caller ID aloud. It's not the only SMS-to-speech application out there, but it sports the most features, including a customizable auto-responder (so you can let people know you heard their message and will respond as soon as you're off the road), the ability to customize the voice and the read rate, control over how much information is given (the sender's name or just the message, or the sender's name, subject, and message in the case of emails), and more. Upgrading to the $13.95-a-year premium version adds voice-to-text translation so you can respond verbally to your text messages and emails. Hit up the link below to read more and grab a copy for your Android or BlackBerry phone. DriveSafe.ly [via Addictive Tips]

    Read the article

  • Dalvik + Java licensing question

    - by Andrew Bate
    This is a licensing question about the Dalvik and J2SE core libraries. In particular the license governing java.util.concurrent.SynchronousQueue. The license header of the class in the JDK source states that it is GPLv2 only (see grepcode). However, the same file in the Dalvik core libraries seems to be governed by the Apache 2 license only (see android source). How is this possible? I didn't think you could take GPLv2 source and re-license it as Apache 2. (It's obvious they did: a comment above the Java Doc even says "removed link to collections framework docs"!) I'm asking because I have a GPLv3 project and would like to include a derivative work of some source from the core libraries (either Dalvik or J2SE) but publish it under GPLv3. I thought I could do this with Apache 2, but not GPLv2. I know that the J2SE class source is itself derivative work from public domain source, but the changes from the original are substantial. (The original is available at gee.cs.oswego.edu if you are interested.) Therefore the android source really is just a copy of the J2SE source, but published under Apache 2 instead of GPLv2. Is Google really allowed to do this?

    Read the article

  • Connecting a Samsung Galaxy S3 in Ubuntu 13.04

    - by Squishy
    In 13.04, whenever I connect an Android device, one of three things happens:

    1. It mounts successfully (maybe once out of 3 attempts).
    2. It fails to mount with the following error message: "Oops! Something went wrong. Unhandled error message: Unable to open MTP device".
    3. Occasionally: "Unhandled error message: No such interface `org.gtk.vfs.Mount' on object at path /org/gtk/vfs/mount/1".

    Regardless of activity (even when successfully mounted) it will continuously spam the following error message: "Unable to mount SAMSUNG_Android. Unable to open MTP Device '[usb:003,00x]'", where x seems to be an arbitrary number below 10 and continues counting up with each new error message until the device is unplugged. I've also just noticed that even if it mounts successfully, it unmounts after about 30 seconds and starts spamming the error message above. The Android device is unlocked, always on, and fully charged. ADB seems to function normally. Any suggestions?

    Further info: this happens on both a stock Samsung S3 and an Xperia Arc S running a custom AOSP-based ROM. I've also tried the steps outlined in this Stack Overflow answer, but the problem persists.

    UPDATE: After doing a dist-upgrade (May 8th 2013), the Xperia Arc S on the AOSP ROM now mounts and behaves normally. The S3, however, still behaves as described above.

    UPDATE: After careful observation, ADB does not, in fact, behave normally. If the error message above appears while sending an app to the device, the attempt is aborted with an error message saying that the device is unavailable.

    Read the article

  • Regarding sprite design and resolution for tablets and phones

    - by Dimitris P.
    I am about to start working on a game for android devices, in my spare time, to get familiar with android development. I'm more interested in using the best practices possible than getting a quick result, and that is why I need some guidance regarding graphics. I think the game is going to be fully sprite based. Everything is going to be in .bmp form, or something similar, and my question is: Should I design the sprites in a small resolution (ie for phone screens) and scale them up to fit into larger screens (tablet screens), should I do it vice-versa or should I consider a completely different approach? Would designing a different set of sprites for each of the most used resolution settings be worth it or are there simpler solutions to the problem with fewer drawbacks than the ones I mentioned above? (If I follow the first approach, for example, the larger the screen the worse the graphics will get, since every pixel of the original drawing will cover several pixels on the screen). Is there a standard approach for dealing with this kind of problems? If you need me to be more detailed or more clear about something I mentioned (or forgot to) please don't hesitate to ask. Also, excuse me for any inaccurate use of the English language. Thank you in advance for your input.
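
    Whichever direction the artwork is authored in, the scaling itself usually reduces to computing one factor from the design resolution versus the actual screen and drawing everything through it. A small, hypothetical Android sketch (the design size and class names are made up, not a recommendation of a specific asset strategy):

        import android.content.Context;
        import android.graphics.Bitmap;
        import android.graphics.Canvas;
        import android.graphics.Paint;
        import android.graphics.Rect;
        import android.util.DisplayMetrics;

        // Sketch: sprites are authored for a 1280-pixel-wide "design" screen and
        // scaled by one uniform factor to whatever the device actually has.
        public class SpriteScaler {
            private static final float DESIGN_WIDTH = 1280f; // assumed authoring width
            private final float scale;
            private final Paint paint = new Paint(Paint.FILTER_BITMAP_FLAG); // smoother scaling

            public SpriteScaler(Context context) {
                DisplayMetrics dm = context.getResources().getDisplayMetrics();
                scale = dm.widthPixels / DESIGN_WIDTH;
            }

            // Draws 'sprite' so that artwork placed at (x, y) in design units
            // lands in the right place, at the right size, on any screen.
            public void draw(Canvas canvas, Bitmap sprite, float x, float y) {
                int w = Math.round(sprite.getWidth() * scale);
                int h = Math.round(sprite.getHeight() * scale);
                int left = Math.round(x * scale);
                int top = Math.round(y * scale);
                canvas.drawBitmap(sprite, null, new Rect(left, top, left + w, top + h), paint);
            }
        }

    The softness you anticipate when one source size is stretched across very different screens is exactly what per-density asset sets avoid, so the two approaches can also be mixed: a couple of asset sizes plus a residual scale like the one above.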

    Read the article

  • Auto Save and Auto Load Game onto the Device's Storage Concept Question

    - by David Dimalanta
    I'm trying to make a simple app that will test saving and loading state. Is it a good idea to make an app that automatically saves and loads the game, without any prompt, every time the player opens it and continues it another day? I tried making a moving sprite that starts at the center; when I close and re-open the app, the sprite goes back to the center instead of the last coordinate where it landed (e.g. at the top). The sequence of saving and loading I want goes like this:

    1. I open the app. The sprite starts at the center, and the screen displays the sprite's coordinates plus the number of times the sprite has moved.
    2. I exit the app, which automatically saves the game without notice.
    3. Finally, when I re-open it, it automatically loads the game, retaining the number of times the sprite has moved, its coordinates, and the area where it landed.

    These steps are similar (though for my sprite-movement test app rather than a full game) to the sequence of saving and loading the game's level and record in Jewel Stackers for Android. Also, by default, if there is no SD card in a tablet or phone running Android, does it automatically save/load to internal storage, or to the APK file itself? And is an auto-save/auto-load feature also useful for protecting and fetching information (i.e. fastest time, the last coordinates where the sprite was located, etc.)?
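
    For the scale of data described here (a coordinate pair and a move counter), one plain-Android way to get the "save on exit, load on open" behaviour is SharedPreferences written in onPause() and read back in onCreate(); this lives in the app's internal storage, so no SD card is involved. A minimal sketch (the preference file, key names, and default coordinates are made up):

        import android.app.Activity;
        import android.content.SharedPreferences;
        import android.os.Bundle;

        // Sketch: persist the sprite's position and move count automatically,
        // with no explicit "save" step visible to the player.
        public class SpriteActivity extends Activity {
            private static final String PREFS = "sprite_state"; // hypothetical file name
            private float spriteX, spriteY;
            private int moveCount;

            @Override
            protected void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                // Auto-load: restore the last saved state, defaulting to the center.
                SharedPreferences prefs = getSharedPreferences(PREFS, MODE_PRIVATE);
                spriteX = prefs.getFloat("x", 240f);  // assumed center coordinates
                spriteY = prefs.getFloat("y", 400f);
                moveCount = prefs.getInt("moves", 0);
            }

            @Override
            protected void onPause() {
                super.onPause();
                // Auto-save: runs whenever the app goes to the background or exits.
                getSharedPreferences(PREFS, MODE_PRIVATE).edit()
                        .putFloat("x", spriteX)
                        .putFloat("y", spriteY)
                        .putInt("moves", moveCount)
                        .apply();
            }
        }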

    Read the article

  • Getting MTP to work with a Galaxy tab 2 7.0?

    - by Wouter
    I'm trying to get MTP working with the Galaxy Tab 2 7.0 on my Ubuntu installation, so that I can access its files. I tried to do what is described here: http://www.omgubuntu.co.uk/2011/12/how-to-connect-your-android-ice-cream-sandwich-phone-to-ubuntu-for-file-access However, I fail at executing the following commands:

        mtp-detect | grep idVendor
        mtp-detect | grep idProduct

    This fails:

        [20:42|0] $ mtp-detect | grep idVender
        Device 0 (VID=04e8 and PID=6860) is a Samsung GT-P7310/P7510/N7000/I9100/Galaxy Tab 7.7/10.1/S2/Nexus/Note.
        PTP_ERROR_IO: failed to open session, trying again after resetting USB interface
        LIBMTP libusb: Attempt to reset device
        LIBMTP PANIC: failed to open session on second attempt
        Unable to open raw device 0
        [20:44|0] $ mtp-detect | grep idProduct
        Device 0 (VID=04e8 and PID=6860) is a Samsung GT-P7310/P7510/N7000/I9100/Galaxy Tab 7.7/10.1/S2/Nexus/Note.
        PTP_ERROR_IO: failed to open session, trying again after resetting USB interface
        LIBMTP libusb: Attempt to reset device
        LIBMTP PANIC: failed to open session on second attempt
        Unable to open raw device 0

    My guess was that idVendor is the same as the VID (04e8) and idProduct is the same as the PID (6860), so I continued with those values and completed the tutorial. When finished, I tried android-connect, which returned:

        fuse: bad mount point `/media/GalaxyTab': Transport endpoint is not connected

    Does anybody have a clue what to do? I also want to note that when I connect my Galaxy Tab 2 7.0 I still get an Ubuntu pop-up saying a device was connected, and I can still see the folder structure; the problem, however, is that all the folders show 0 bytes and do not have any subfolders. I can only see the folders in the root. PS: I also checked a similar question and tried what is described in this answer: http://askubuntu.com/a/88630/27480

    Read the article

  • What Technology can Render Medium Scale 3d Environments in a Web-Browser

    - by JakeM
    I intend to make a web application that displays 3d environments that can be navigated by dragging (with a finger or mouse depending on the platform). The web app will render 3d environments of development sites, including contours, water pipeline locations, buildings, etc. I am trying to decide which technologies/libraries to use so that the web app will work in the Android web browser, iOS Safari, IE9, Safari, Firefox and Chrome, and also which technology will provide speed of development. I understand that this is 'asking for my cake and eating it too'/'asking for the moon', but I don't know all the technologies out there - there may be advanced libraries that can render 3d environments across many web browsers, including the main smartphone ones, that I don't know of. The 3d rendering would not be highly detailed buildings or water with effects, but rather simple 3d representations of these objects. The environment would be navigable by dragging around, and you could view the landscape in layers (view only contour lines, view only underground pipelines, view only sewerage pipes, etc.). Are there any 3d libraries for web browsers out there? Is there a way to run OpenGL (or OpenGL ES) through a web browser? What technology would you use if you were making this kind of app/web app that should work on desktop Windows, Android, iOS and Windows Phone? Is there any technology I have failed to mention that would be good for this kind of project? I am tending towards a browser-driven web app because I get that cross-platform ability (it even works on Linux and MacOS by using compatible web browsers). I also know of CSS3 transforms that can create cubes that rotate in 3d space (NOTE: this only works in WebKit browsers - so no IE :( ). But I don't know if CSS3 is robust enough to render whole 3d environments. Do you think it could? Maybe I could use HTML5 canvases for this? Can Google Maps create custom 3d maps?

    Read the article
