Search Results

Search found 31677 results on 1268 pages for 'android view'.

  • How can I view an R32G32B32 texture?

    - by bobobobo
    I have a texture with R32G32B32 floats. I create this texture in-program in D3D11, using DXGI_FORMAT_R32G32B32_FLOAT. Now I need to see the texture data for debugging, but it will not save to anything except DDS; the debug output shows the error "Can't find matching WIC format, please save this file to a DDS". So I write it to DDS, but then I can't open it! The DirectX Texture Tool says "An error occurred trying to open that file". I know the texture is working because I can read it on the GPU and the colors seem correct. How can I view an R32G32B32 texture in an image viewer?

    Read the article

  • Get aggregated view of data for entire website with Google Analytics

    - by crmpicco
    I have a website (www.ayrshireminis.com) which has three main sections under different directories: /forum, /galleries and /contact. I would like to have an aggregated view of the data for the whole website, but also for each section. What is the recommended approach for doing this? I believe I can create a web property that includes a profile for the entire website plus duplicate filtered profiles, each with an include filter for its section. This is my gut instinct, but I'd like to know if there is another (better) way to do it. Maybe by having one account that includes a profile for the whole site and another profile with an include filter for the individual sections?

    Read the article

  • View/ViewModel Interaction - Bindings, Commands and Triggers

    It looks like I have a set of posts on ViewModel, aka MVVM, that have organically emerged into a series or story of sorts. Recently, I blogged about The Case for ViewModel, and another on View/ViewModel Association using Convention and Configuration, and a long while back now, I posted an Introduction to the ViewModel Pattern as I was myself picking up this pattern, which has since become the natural way for me to program client applications. This installment adds to this on-going series. I've...

    Read the article

  • How customers view and interact with a company

    The Harvard Business Review article written by Rayport and Jaworski is aptly titled “Best Face Forward” because it sheds light on how customers view and interact with a company. In the past, most business interaction with customers took place in a face-to-face meeting where one party would present an item for sale and the other would decide whether to purchase it. In addition, if there was a problem with a purchased item, the customer would bring it back to the person who sold it for resolution. One of my earliest examples of witnessing this was when I was around 6 or 7 years old and was allowed to spend the summer in Tennessee with my grandparents. My grandfather had just written a book about the local history of his town and was selling copies to his friends and to local bookstores. I still remember he offered to pay me a small commission for every book I helped him sell because I was carrying the books around for him. Every sale he made was face to face with his customers, which allowed him to share his excitement for the book with everyone. In today’s modern world there is less and less human interaction, as the use of computers and other technologies allows us to communicate within seconds even though both parties may be across the globe or just next door. That being said, customers view a company through multiple access points, called faces, that represent the ability to interact without actually seeing a human face. As a software engineer I see this as both a good and a bad thing, because direct human interaction and technology-based interaction each have good and bad attributes depending on the customer. How organizations coordinate business and IT functions to provide quality service varies based on each individual business and the goals and directives put in place by its management. According to Rayport and Jaworski, the type of interaction used through a particular access point may lend itself to being people-dominant, machine-dominant, or a combination of both. The method by which a company communicates information through an access point is a strategic choice that relates costs and customer outcomes. To simplify this, the choice is based on what can give the customer the best experience interacting with the company when the cost of the interaction is also a factor. I personally see examples of this every day at work. The company website is machine-dominant, with people updating and maintaining information; our group’s department is people-dominant, because most of the customer interaction is done at the customer’s location and is backed up by machine-based data sources; and our sales/member service department is a hybrid, because employees work in tandem with machines to assist customers with signing up or any other issue they may have. The positive and negative aspects of human and machine interfaces are a key factor in deciding which interface, or combination of the two, to use when allowing customers to access a company. Rayport and Jaworski also used MIT professor Erik Brynjolfsson’s preliminary catalog of human and machine strengths. He stated that humans outperform machines in judgment, pattern recognition, exception processing, insight, and creativity. I have found this to be true based on how the sales and member service reps at my company handle a multitude of questions and situations with a lot of unknown variables.
    A machine interface could never effectively handle these scenarios, because there are too many variables to consider and it would not have the built-in logic to process each customer’s claims and needs. In addition, he also stated that machines outperform humans in collecting, storing, transmitting, and routine processing. An example of this would be my employer’s website. Customers can simply go online and purchase a product without even talking to a sales or member services representative. The information is then stored in a database so that the customer can always go back, review their order, and access their selected services. A human, no matter how smart, would never be able to keep track of hundreds of thousands of customers, let alone know what they purchased or how much they paid. In today’s technology-driven economy every company must offer its customers multiple methods of access in order to survive. The more opportunity a company has to create a positive experience for its customers, in my opinion, the more likely the customer will return to that company again. I have noticed this with my personal shopping habits and experiences. References: Rayport, J., & Jaworski, B. (2004). Best Face Forward. Harvard Business Review, 82(12), 47-58. Retrieved from Business Source Complete database.

    Read the article

  • Implementing top-view physics using Box2D

    - by humbleBee
    How can top-view physics games be done in Box2D? One idea I have is to set the linear velocity of an object manually, or to alter the linear and angular damping as my object moves over different surfaces. For example, if my object is over a wet surface it'll have less linear damping, and if it is over a rough surface it'll have more damping. And to see if my object has fallen over an edge, I'll try to use an AABB and check if it's still inside, or manually check whether object.x > boundary.x, etc. Is there any better way?
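
    One way to sketch the damping idea above, using JBox2D (the Java port of Box2D); the surface names, damping values, and boundary check are illustrative assumptions, not a definitive implementation:

        import org.jbox2d.common.Vec2;
        import org.jbox2d.dynamics.Body;

        // Rough sketch: per-surface "friction" via linear damping, plus a simple edge check.
        public class TopDownHelper {

            // Hypothetical damping values for different surfaces.
            public static final float DAMPING_WET   = 0.5f;
            public static final float DAMPING_ROUGH = 4.0f;

            // Call each step with the damping of whatever surface the body is currently over.
            public static void applySurfaceDamping(Body body, float surfaceDamping) {
                body.setLinearDamping(surfaceDamping);
            }

            // Axis-aligned boundary test: has the body left the playable area (fallen over an edge)?
            public static boolean isOffTheEdge(Body body, float minX, float minY,
                                               float maxX, float maxY) {
                Vec2 p = body.getPosition();
                return p.x < minX || p.x > maxX || p.y < minY || p.y > maxY;
            }
        }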

    Read the article

  • WHMCS Fatal error: Out of memory when viewing an invoice PDF

    - by prakash
    I can log into WHMCS and can access everything I should be able to access, but if I try to click View PDF Invoice, the following error occurs: Fatal error: Out of memory (allocated 67633152) (tried to allocate 76 bytes) in /home/xxxx/public_html/whmcs/includes/classes/class.tcpdf.php on line 8419 I have already set the memory limit to 256MB, but the error still occurs. At the time of the error, the process memory exceeds the allocation I set. I checked the log file and found the following errors: #2 /home/xxxxx/public_html/client/includes/classes/class.tcpdf.php(8453): TCPDF->Image('/home/xxxxx/...', 20, 25, 75, 17.5816023739, 'PNG', '', '', false, 300, '', false, 8) #3 /home/xxxxx/public_html/client/includes/classes/class.tcpdf.php(7881): TCPDF->ImagePngAlpha('/home/xxxxx/...', 20, 25, 337, 79, 75, 17.5816023739, 'PNG', '', '', false, 300, '', NULL) While I was investigating the issue above I also noticed the error condition pictured below:

    Read the article

  • Render a 3D scene in multiple windows - extended panoramic view

    - by teodron
    Is there any resource on how to render a 3D scene from an application or a game across multiple windows or monitors? Each window should continue drawing from where the neighbouring one left off (in the end, the result should be a mosaic of the scene). My idea is to use a camera for each window and have a reference position and orientation for a meta-camera object that is used to correctly offset each window's camera. Since there are quite a few elements to consider (window specs, viewport properties, position and orientation of each render camera), what is the correct way to update the individual cameras given the position and orientation of the central meta-camera? I currently cannot make the cameras present the scene contiguously (and I am reluctant to work out the transformations without checking whether this is the actual way of doing things).
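
    One common approach is to give every window the meta-camera's exact position and orientation and vary only the projection: each window renders an off-center slice of the full image plane, so the slices line up exactly. A minimal sketch of the per-window frustum bounds, assuming N equal windows tiled horizontally (the names are illustrative):

        // Off-axis projection bounds (at the near plane) for one window in a horizontal mosaic.
        public final class MosaicFrustum {

            // Returns { left, right, bottom, top } in view space, ready for a glFrustum-style matrix.
            public static float[] windowFrustum(float fovYRadians, float mosaicAspect, float near,
                                                int windowIndex, int windowCount) {
                float top        = near * (float) Math.tan(fovYRadians * 0.5); // vertical half-extent
                float bottom     = -top;
                float halfWidth  = top * mosaicAspect;        // half-extent of the whole mosaic
                float sliceWidth = (2f * halfWidth) / windowCount;
                float left       = -halfWidth + windowIndex * sliceWidth;
                float right      = left + sliceWidth;
                return new float[] { left, right, bottom, top };
            }
        }

    Because every window shares one view matrix, only the off-center projection differs per window, which is what keeps the mosaic contiguous.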

    Read the article

  • View Frustum Alternative

    - by Kuros
    I am working on a simulation project that requires me to have entities walking around in a 3D world. I have all that working: matrix transformations, etc. I'm at the point where I need what is essentially a view frustum, so I can give each entity a visible area. However, when looking over the calculations required to do it, it seems like a perspective frustum is only required to be able to project onto a 2D screen. Is there another, easier-to-code solution that would work better, such as an orthographic volume? Could I just define a shape mathematically and test whether the coordinates of the objects are inside or out? I am not really a 3D coder (and I am doing this all from scratch, not using an engine or anything), so I would like the simplest solution possible for my needs. Thank you!
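
    For entity visibility (as opposed to rendering), a projection frustum is indeed overkill; a distance-plus-angle test against the entity's forward direction is usually enough. A small sketch, with plain float arrays standing in for whatever vector type is in use:

        // "Field of view" test: target is visible if it is within maxDistance and within
        // halfAngle of the entity's (normalized) forward direction.
        public final class VisionCone {

            public static boolean canSee(float[] eye, float[] forward, float[] target,
                                         float maxDistance, float halfAngleRadians) {
                float dx = target[0] - eye[0];
                float dy = target[1] - eye[1];
                float dz = target[2] - eye[2];
                float dist = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
                if (dist == 0f) return true;              // standing on the target
                if (dist > maxDistance) return false;     // out of range
                // cosine of the angle between forward and the direction to the target
                float cosAngle = (dx * forward[0] + dy * forward[1] + dz * forward[2]) / dist;
                return cosAngle >= (float) Math.cos(halfAngleRadians);
            }
        }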

    Read the article

  • Google Analytics Dashboard: week-by-week view

    - by Silver Dragon
    Setting up the Google Analytics Dashboard allows webmasters to get a weekly progress report of marketing achievements and keep a finger on what's going on at their web properties. However, by default, the dashboard always displays a day-by-day report, which isn't actionable in markets where meaningful improvements happen on a week-by-week or month-over-month basis. Is there any way the default view (and the reports sent out via email) can be set to display week-level resolution, as opposed to day-level resolution? (i.e., repro: Analytics - site - Standard Reports - Audience - Overview - right side of the window, click "Week") Many thanks!

    Read the article

  • View the Replay of Fusion Applications - Financials Highlights Lunchtime Webinar

    - by Theresa Hickman
    If you missed the free one hour Webinar and training on Feb. 28 that I blogged about in a previous blog post, you can view the replay at your convenience. Fusion Applications – Financials Highlights URL: http://education.oracle.com/education/downloads/Fusion_Financials_LVCRecording/ If it asks you for a code, enter E1049. For those of you who pre-registered for this Webinar but due to the timezone difference could not attend, you should have also received an email from Oracle University with a link to the replay. Enjoy!  

    Read the article

  • Drag camera/view in a 3D world

    - by Dono
    I'm trying to make a draggable view in a 3D world. Currently I've made it using the mouse position on the screen, but when I move, the distance traveled by my mouse is not equal to the distance traveled in the 3D world. So I've tried to do this: compute a ray from the mouse position into the 3D world, calculate its intersection with the ground, take the difference between the old and new intersection points, and translate the camera by that difference. I've got a problem with this method: the ray is computed with the current camera's position; I move the camera; I compute the new ray with the new camera position; and the difference between the old ray and the new ray is now invalid. So, graphically, my camera keeps jumping back and forth between the previous and new positions. How can I make a draggable camera another way? Thanks!
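
    A sketch of the usual fix: remember the ground point that was under the cursor when the drag started, and each frame translate the camera so that same point ends up under the cursor again. The helper names and the y = 0 ground plane are assumptions:

        // "Grab the ground" drag: keeps the drag-start ground point pinned under the cursor.
        public final class GroundDrag {

            private float[] grabPoint;   // ground point under the cursor when the drag began

            public void beginDrag(float[] rayOrigin, float[] rayDir) {
                grabPoint = intersectGround(rayOrigin, rayDir);
            }

            // Ray cast from the *current* camera; returns the translation to apply to the camera.
            public float[] drag(float[] rayOrigin, float[] rayDir) {
                float[] hit = intersectGround(rayOrigin, rayDir);
                if (hit == null || grabPoint == null) return null;
                return new float[] { grabPoint[0] - hit[0], 0f, grabPoint[2] - hit[2] };
            }

            // Ray vs. the y = 0 plane.
            private static float[] intersectGround(float[] o, float[] d) {
                if (Math.abs(d[1]) < 1e-6f) return null;   // ray parallel to the ground
                float t = -o[1] / d[1];
                if (t < 0f) return null;                   // ground is behind the camera
                return new float[] { o[0] + t * d[0], 0f, o[2] + t * d[2] };
            }
        }

    Because the correction is computed from the ray before the camera moves each frame, there is no feedback between moving the camera and re-casting the ray, which is what causes the back-and-forth jumping described above.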

    Read the article

  • Execute a Managed bean from a JSF view in WEB-INF folder

    - by JonathanVila
    We are getting started with Spring + PrimeFaces development, and the first problem we have encountered concerns storing the xhtml pages in the WEB-INF folder. When we use a faces form in a view located inside the WEB-INF folder, the commandButton does not execute the managed bean method. Our bean: In fact, we think the problem is that with JSF the pages are rendered using a link to the same page as the action of the form, so if the page is located in WEB-INF it is not publicly accessible. We know that having all our xhtml views in the web folder instead of WEB-INF solves the issue, but we would like to keep those pages in WEB-INF. Thank you.

    Read the article

  • Make a Google Calendar agenda-view entry display initially as if it had been clicked

    - by aslum
    So I've got a Google Calendar embedded in my web page. It's set to agenda view, so when you click on an entry it expands and shows you more information about the entry. I'd like to be able to link to the page with the embedded calendar from elsewhere and have a specific entry already expanded (as if it had been clicked). Is this even possible? I'm not really sure where to start. PS: I don't have enough rep on this SE to create tags... and there isn't already a tag for "google-calendar"...

    Read the article

  • Associating Palettes with View TopComponents

    - by Geertjan
    A side effect of changes in NetBeans Platform 7.1 in the NetBeans window system is that palettes can now be created for view TopComponents. Below, a TopComponent in the "explorer" position is selected, which results in the Palette opening and showing items that can be dragged into the TopComponent: And here, in the same application, a TopComponent in the "output" position is now selected, which causes the Palette to open, showing items that are relevant to the TopComponent: I copied code used in the above two TopComponents from this blog entry: https://blogs.oracle.com/geertjan/entry/simple_component_palette Related issue: http://netbeans.org/bugzilla/show_bug.cgi?id=209238

    Read the article

  • How to change Dolphin's selection behavior for a file/folder (in detail view mode)

    - by Daniel
    Since Nautilus has removed its dual-pane view, I have turned to Dolphin. I am wondering how to change the default selection behavior in detail view mode. Currently, to select a file/folder my cursor has to hover directly over that file/folder before a left click takes effect, but I much prefer the Nautilus way: as long as the cursor is in the same row as the file/folder, a left click picks up that item. I think this is more robust for user selection, and it is faster. Any ideas how to achieve this in Dolphin?

    Read the article

  • Can AdSense crawler view pages that require cookies?

    - by moomoochoo
    Details I require users to agree to terms and conditions before they can view several pages on my site. Once they have agreed a cookie is set and they can proceed to the webpage. If a user somehow manages to end up on the webpage without a cookie they will not be able to access the page's content. My question(s) Is the AdSense crawler able to set the cookie and visit these pages? If yes, how will it know to agree to the TOS? Is there some way to allow it access to the pages even if it couldn't use cookies?

    Read the article

  • Miss Open World? View this Roadmap Presentation

    - by PeopleTools Strategy
    If you were unable to attend Oracle OpenWorld in September, you missed out on some important PeopleSoft messages. Don't despair! You now have a chance to receive an update on PeopleSoft's presence at Oracle OpenWorld 2013 and the key messages delivered there. You can view the "PeopleSoft Update and Roadmap" webcast found here on the Quest Users Group site. (Note: this is available with a FREE subscriber account. Anyone can sign up here at no cost.) This webcast recording presents the significant adoption and momentum behind PeopleSoft 9.2. Viewers will also learn about the new release model for continuously delivering new capabilities to PeopleSoft customers at a lower cost, enabled by the new PeopleSoft Update Manager. There are also compelling live demonstrations of the major investment areas for PeopleSoft, including a new PeopleSoft user experience enabling mobile solutions as well as In-Memory PeopleSoft applications.

    Read the article

  • Why can't I view a specific website?

    - by user79263
    I am a network administrator at a small company (22 PCs) and I have a problem, and I don't know what the cause could be. I can't access a specific website from the LAN. I have a server which has 2 NICs (one to the ISP and one to the LAN), and I have installed a BIND DNS server and a DHCP server on it (Debian 6). I want to mention that "nslookup mcel.co.mz" and "dig my_site" are not working properly; I can't access the site from inside the LAN, and I can't send emails to people located on that domain. When I dig mcel.co.mz @8.8.8.8 the domain is resolved. Why can't I view a specific website (www.mcel.co.mz)? Why can't I send email to people on some domains (@mcel.co.mz)? I don't have any rule configured on the Debian server which would block access to the site. I am the only one responsible for the IT department.

    Read the article

  • Open in explorer view not working SOMETIMES !!

    - by H(at)Ni
    Hello, As weird as it seems to anyone who has used it before, most of the time Explorer view does not work unless certain steps are followed; but in my case it was working and sometimes randomly not working! After spending hours troubleshooting and collecting logs, network traces, Fiddler traces, etc., I reached the solution from the network trace. Although it seems strange, it was sending a PROPFIND request to the root directory "/", which had actually been deleted. So I came across this important article, which states that you must have a root site collection in your SharePoint web application in order to keep it in a supported state. http://support.microsoft.com/kb/2590564 And that actually explained it and resolved the strange behavior as well. Cheers,

    Read the article

  • Voice Recognition Connection problem

    - by user244190
    I'm trying to work through and test a voice recognition example based on the VoiceRecognition.java example at http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/app/VoiceRecognition.html but when I click on the button to create the activity, I get a dialog that says Connection problem. My manifest file is using the Internet permission, and I understand the speech is passed to the Google servers. Do I need to do anything else to use this? Code below. UPDATE 2: Thanks to Steve, I have been able to install the USB driver and debug the app directly on my Droid. Here is the LogCat output from clicking on my mic button: 03-08 18:36:45.686: INFO/ActivityManager(1017): Starting activity: Intent { act=android.speech.action.RECOGNIZE_SPEECH cmp=com.google.android.voicesearch/.IntentApiActivity (has extras) } 03-08 18:36:45.686: WARN/ActivityManager(1017): Activity is launching as a new task, so cancelling activity result. 03-08 18:36:45.787: DEBUG/NetworkLocationProvider(1017): setMinTime: 120000 03-08 18:36:45.889: INFO/ActivityManager(1017): Displayed activity com.google.android.voicesearch/.IntentApiActivity: 135 ms (total 135 ms) 03-08 18:36:45.905: DEBUG/NetworkLocationProvider(1017): onCellLocationChanged [802,0,0,4192,3] 03-08 18:36:45.951: INFO/MicrophoneInputStream(1429): Starting voice recognition with audio source VOICE_RECOGNITION 03-08 18:36:45.998: DEBUG/AudioHardwareMot(990): Codec sampling rate already 16000 03-08 18:36:46.092: INFO/RecognitionService(1429): ssfe url=http://www.google.com/m/voice-search 03-08 18:36:46.092: WARN/RecognitionService(1429): required parameter 'calling_package' is missing in IntentAPI request 03-08 18:36:46.115: DEBUG/AudioHardwareMot(990): Codec sampling rate already 16000 03-08 18:36:46.131: WARN/InputManagerService(1017): Starting input on non-focused client com.android.internal.view.IInputMethodClient$Stub$Proxy@4487d240 (uid=10090 pid=3132) 03-08 18:36:46.131: WARN/IInputConnectionWrapper(3132): showStatusIcon on inactive InputConnection 03-08 18:36:46.248: WARN/MediaPlayer(1429): info/warning (1, 44) 03-08 18:36:46.334: DEBUG/dalvikvm(3206): GC freed 3682 objects / 369416 bytes in 293ms 03-08 18:36:46.358: WARN/MediaPlayer(1429): info/warning (1, 44) 03-08 18:36:46.412: WARN/MediaPlayer(1429): info/warning (1, 44) 03-08 18:36:46.444: WARN/MediaPlayer(1429): info/warning (1, 44) 03-08 18:36:46.475: WARN/MediaPlayer(1429): info/warning (1, 44) 03-08 18:36:46.506: WARN/MediaPlayer(1429): info/warning (1, 44) 03-08 18:36:46.514: INFO/MediaPlayer(1429): Info (1,44) 03-08 18:36:46.514: INFO/MediaPlayer(1429): Info (1,44) 03-08 18:36:46.514: INFO/MediaPlayer(1429): Info (1,44) 03-08 18:36:46.514: INFO/MediaPlayer(1429): Info (1,44) 03-08 18:36:46.514: INFO/MediaPlayer(1429): Info (1,44) 03-08 18:36:46.514: INFO/MediaPlayer(1429): Info (1,44) The line that concerns me is the warning about the missing parameter calling_package. UPDATE: OK, I was able to replace my emulator image with one from HTC that appears to come with Google Voice Search; however, now when I run from the emulator, I'm getting an Audio Problem message with Speak Again or Cancel buttons. It appears to make it back to onActivityResult(), but the resultCode is 0.
Here is the LogCat output: 03-07 20:21:25.396: INFO/ActivityManager(578): Starting activity: Intent { action=android.speech.action.RECOGNIZE_SPEECH comp={com.google.android.voicesearch/com.google.android.voicesearch.RecognitionActivity} (has extras) } 03-07 20:21:25.406: WARN/ActivityManager(578): Activity is launching as a new task, so cancelling activity result. 03-07 20:21:25.968: WARN/ActivityManager(578): Activity pause timeout for HistoryRecord{434f7850 {com.ikonicsoft.mileagegenie/com.ikonicsoft.mileagegenie.MileageGenie}} 03-07 20:21:26.206: WARN/AudioHardwareInterface(554): getInputBufferSize bad sampling rate: 16000 03-07 20:21:26.256: ERROR/AudioRecord(819): Recording parameters are not supported: sampleRate 16000, channelCount 1, format 1 03-07 20:21:26.696: INFO/ActivityManager(578): Displayed activity com.google.android.voicesearch/.RecognitionActivity: 1295 ms 03-07 20:21:29.890: DEBUG/dalvikvm(806): threadid=3: still suspended after undo (s=1 d=1) 03-07 20:21:29.896: INFO/dalvikvm(806): Uncaught exception thrown by finalizer (will be discarded): 03-07 20:21:29.896: INFO/dalvikvm(806): Ljava/lang/IllegalStateException;: Finalizing cursor android.database.sqlite.SQLiteCursor@435d3c50 on ml_trackdata that has not been deactivated or closed 03-07 20:21:29.896: INFO/dalvikvm(806): at android.database.sqlite.SQLiteCursor.finalize(SQLiteCursor.java:596) 03-07 20:21:29.896: INFO/dalvikvm(806): at dalvik.system.NativeStart.run(Native Method) 03-07 20:21:31.468: DEBUG/dalvikvm(806): threadid=5: still suspended after undo (s=1 d=1) 03-07 20:21:32.436: WARN/IInputConnectionWrapper(806): showStatusIcon on inactive InputConnection I'm still not sure why I'm getting the Connection problem on the Droid. I can use Voice Search OK. I also tried clearing the cache and data as described in some posts, but it's still not working. /** * Fire an intent to start the speech recognition activity. */ private void startVoiceRecognitionActivity() { Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH); intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM); intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speech recognition demo"); startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE); } /** * Handle the results from the recognition activity. */ @Override protected void onActivityResult(int requestCode, int resultCode, Intent data) { if (requestCode == VOICE_RECOGNITION_REQUEST_CODE && resultCode == RESULT_OK) { // Fill the list view with the strings the recognizer thought it could have heard ArrayList<String> matches = data.getStringArrayListExtra( RecognizerIntent.EXTRA_RESULTS); mList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1, matches)); } super.onActivityResult(requestCode, resultCode, data); }
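
    The LogCat line "required parameter 'calling_package' is missing in IntentAPI request" suggests one cheap thing to try: add the standard RecognizerIntent.EXTRA_CALLING_PACKAGE extra when building the intent. Whether this alone clears the "Connection problem" dialog is an assumption, but the extra itself is part of the public RecognizerIntent API:

        // Same intent as in the code above, plus the calling-package extra the warning refers to.
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, getPackageName());
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speech recognition demo");
        startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE);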

    Read the article

  • creating a Menu from SQLite values in Java

    - by shanahobo86
    I am trying to create a ListMenu using data from an SQLite database to define the name of each MenuItem. So in a class called menu.java I have defined the array String classes [] = {}; which should hold each menu item name. In a DBAdapter class I created a function so the user can insert info to a table (This all works fine btw). public long insertContact(String name, String code, String location, String comments, int days, int start, int end, String type) { ContentValues initialValues = new ContentValues(); initialValues.put(KEY_NAME, name); initialValues.put(KEY_CODE, code); initialValues.put(KEY_LOCATION, location); initialValues.put(KEY_COMMENTS, comments); initialValues.put(KEY_DAYS, days); initialValues.put(KEY_START, start); initialValues.put(KEY_END, end); initialValues.put(KEY_TYPE, type); return db.insert(DATABASE_TABLE, null, initialValues); } It would be the Strings inserted into KEY_NAME that I need to populate that String array with. Does anyone know if this is possible? Thanks so much for the help guys. If I implement that function by Sam/Mango the program crashes, am I using it incorrectly or is the error due to the unknown size of the array? DBAdapter db = new DBAdapter(this); String classes [] = db.getClasses(); edit: I should mention that if I manually define the array: String classes [] = {"test1", "test2", "test3", etc}; It works fine. The error is a NullPointerException Here's the logcat (sorry about the formatting). I hadn't initialized with db = helper.getReadableDatabase(); in the getClasses() function but unfortunately it didn't fix the problem. 11-11 22:53:39.117: D/dalvikvm(17856): Late-enabling CheckJNI 11-11 22:53:39.297: D/TextLayoutCache(17856): Using debug level: 0 - Debug Enabled: 0 11-11 22:53:39.337: D/libEGL(17856): loaded /system/lib/egl/libGLES_android.so 11-11 22:53:39.337: D/libEGL(17856): loaded /system/lib/egl/libEGL_adreno200.so 11-11 22:53:39.357: D/libEGL(17856): loaded /system/lib/egl/libGLESv1_CM_adreno200.so 11-11 22:53:39.357: D/libEGL(17856): loaded /system/lib/egl/libGLESv2_adreno200.so 11-11 22:53:39.387: I/Adreno200-EGLSUB(17856): <ConfigWindowMatch:2078>: Format RGBA_8888. 11-11 22:53:39.407: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x5c66d000 size:36593664 offset:32825344 fd:65 11-11 22:53:39.417: E/(17856): Can't open file for reading 11-11 22:53:39.417: E/(17856): Can't open file for reading 11-11 22:53:39.417: D/OpenGLRenderer(17856): Enabling debug mode 0 11-11 22:53:39.477: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x5ecd3000 size:40361984 offset:36593664 fd:68 11-11 22:53:40.507: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x61451000 size:7254016 offset:3485696 fd:71 11-11 22:53:41.077: I/Adreno200-EGLSUB(17856): <ConfigWindowMatch:2078>: Format RGBA_8888. 
11-11 22:53:41.077: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x61c4c000 size:7725056 offset:7254016 fd:74 11-11 22:53:41.097: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x623aa000 size:8196096 offset:7725056 fd:80 11-11 22:53:41.937: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x62b7b000 size:8667136 offset:8196096 fd:83 11-11 22:53:41.977: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x61c4c000 size:7725056 offset:7254016 11-11 22:53:41.977: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x623aa000 size:8196096 offset:7725056 11-11 22:53:41.977: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x62b7b000 size:8667136 offset:8196096 11-11 22:53:42.167: I/Adreno200-EGLSUB(17856): <ConfigWindowMatch:2078>: Format RGBA_8888. 11-11 22:53:42.177: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x61c5d000 size:17084416 offset:13316096 fd:74 11-11 22:53:42.317: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x63853000 size:20852736 offset:17084416 fd:80 11-11 22:53:42.357: D/OpenGLRenderer(17856): Flushing caches (mode 0) 11-11 22:53:42.357: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x5c66d000 size:36593664 offset:32825344 11-11 22:53:42.357: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x5ecd3000 size:40361984 offset:36593664 11-11 22:53:42.367: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x61451000 size:7254016 offset:3485696 11-11 22:53:42.757: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x5c56d000 size:24621056 offset:20852736 fd:65 11-11 22:53:44.247: D/AndroidRuntime(17856): Shutting down VM 11-11 22:53:44.247: W/dalvikvm(17856): threadid=1: thread exiting with uncaught exception (group=0x40ac3210) 11-11 22:53:44.257: E/AndroidRuntime(17856): FATAL EXCEPTION: main 11-11 22:53:44.257: E/AndroidRuntime(17856): java.lang.RuntimeException: Unable to instantiate activity ComponentInfo{niall.shannon.timetable/niall.shannon.timetable.menu}: java.lang.NullPointerException 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:1891) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:1992) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.access$600(ActivityThread.java:127) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1158) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.os.Handler.dispatchMessage(Handler.java:99) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.os.Looper.loop(Looper.java:137) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.main(ActivityThread.java:4441) 11-11 22:53:44.257: E/AndroidRuntime(17856): at java.lang.reflect.Method.invokeNative(Native Method) 11-11 22:53:44.257: E/AndroidRuntime(17856): at java.lang.reflect.Method.invoke(Method.java:511) 11-11 22:53:44.257: E/AndroidRuntime(17856): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:823) 11-11 22:53:44.257: E/AndroidRuntime(17856): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:590) 11-11 22:53:44.257: E/AndroidRuntime(17856): at dalvik.system.NativeStart.main(Native Method) 11-11 22:53:44.257: E/AndroidRuntime(17856): Caused by: java.lang.NullPointerException 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.content.ContextWrapper.openOrCreateDatabase(ContextWrapper.java:221) 11-11 22:53:44.257: E/AndroidRuntime(17856): at 
android.database.sqlite.SQLiteOpenHelper.getWritableDatabase(SQLiteOpenHelper.java:157) 11-11 22:53:44.257: E/AndroidRuntime(17856): at niall.shannon.timetable.DBAdapter.getClasses(DBAdapter.java:151) 11-11 22:53:44.257: E/AndroidRuntime(17856): at niall.shannon.timetable.menu.<init>(menu.java:15) 11-11 22:53:44.257: E/AndroidRuntime(17856): at java.lang.Class.newInstanceImpl(Native Method) 11-11 22:53:44.257: E/AndroidRuntime(17856): at java.lang.Class.newInstance(Class.java:1319) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.Instrumentation.newActivity(Instrumentation.java:1023) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:1882) 11-11 22:53:44.257: E/AndroidRuntime(17856): ... 11 more 11-11 22:53:46.527: I/Process(17856): Sending signal. PID: 17856 SIG: 9
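
    Reading the stack trace above, the NullPointerException comes out of openOrCreateDatabase() while the menu Activity is still being constructed, which usually means the DBAdapter is created and queried in a field initializer, before the Activity's Context is ready; moving the new DBAdapter(this) and getClasses() calls into onCreate() is the usual cure. For the query itself, a possible getClasses() inside DBAdapter, reusing the constants from insertContact() (the exact query shape and the assumption that an open()-style helper has already populated the db field are mine, not from the original code):

        // Inside DBAdapter: pull every KEY_NAME value into a String[] for the menu.
        // Assumes android.database.Cursor is imported and the SQLiteDatabase field 'db'
        // has already been opened (e.g. via the adapter's open() helper); otherwise query() NPEs.
        public String[] getClasses() {
            Cursor cursor = db.query(DATABASE_TABLE, new String[] { KEY_NAME },
                                     null, null, null, null, null);
            String[] names = new String[cursor.getCount()];
            int i = 0;
            while (cursor.moveToNext()) {
                names[i++] = cursor.getString(0);
            }
            cursor.close();
            return names;
        }

    In the Activity, the calls would then look something like db = new DBAdapter(this); db.open(); String[] classes = db.getClasses(); all inside onCreate() rather than at field-initialization time (assuming the adapter has the usual open()/close() helpers from this DBAdapter pattern).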

    Read the article

  • Why is the vSphere console view so slow?

    - by blade
    Hi, Why is the console view in the vSphere Client so slow? It's a real shame to have to establish an RDP session every time you work on one of the VMs just because of the speed of the console (I saw a tool to right-click and open an RDP session to a VM in vSphere Client/ESX, but it was not reliable). The Workstation console view is very smooth, so I'd expect the vSphere Client console view to be just as smooth. Thanks

    Read the article

  • How to view .docx and .doc files on Mac OS X 10.6.5

    - by Harri
    I want to view .docx and .doc files on my Mac OS X 10.6.5. When I open this type of file, it shows only the text in a text editor. The .docx file also has some images that I am not able to see. Is there a default application on my Mac to view these two file types, or do I need to download an application to view them? If so, what applications are really needed for these files? Can anyone help me? Thanks in advance...

    Read the article

  • How to connect to a PEAP GTC wifi network with Android 2.2 on a nexus one?

    - by Glen
    Hi, I recently updated my Nexus One to 2.2. Now I can't connect to my uni's wifi. They use PEAP with GTC. I had it working fine on 2.1, and it also works fine on my Ubuntu laptop. I have entered my uni number (user name) in the identity box. I have entered my password in the password box. I have emailed the certificate that works on Ubuntu to myself and installed it on the Nexus One. I have enabled secure credentials. What am I doing wrong? Thanks, Glen.

    Read the article
