Search Results

Search found 17625 results on 705 pages for 'techno log'.


  • how to set layout_weight programmatically for alert dialog button?

    - by Are
    Hi, I am planning to create 3 buttons with layout_weight=1, and I am not interested in a custom dialog, so I have written the code below. It is not working: the Yes button always comes back null. What is wrong with this code?

        AlertDialog dialog = new AlertDialog.Builder(this).create();
        dialog.setIcon(R.drawable.alert_icon);
        dialog.setTitle("title");
        dialog.setMessage("Message");
        dialog.setButton(AlertDialog.BUTTON_POSITIVE, "Yes", new DialogInterface.OnClickListener() {
            @Override
            public void onClick(DialogInterface arg0, int arg1) {
            }
        });
        Button yesButton = dialog.getButton(AlertDialog.BUTTON_POSITIVE);
        Log.w("Button", "" + yesButton); // here getting null
        LinearLayout.LayoutParams layoutParams = new LinearLayout.LayoutParams(
                LayoutParams.FILL_PARENT, LayoutParams.WRAP_CONTENT, 1f);
        yesButton.setLayoutParams(layoutParams);
        dialog.show();

    Regards, Android developer.
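
    Note: AlertDialog creates its buttons only when it is shown, so getButton() returns null until after show(). A minimal sketch of the usual workaround (show first, then fetch the button; this assumes the stock dialog's button bar is a LinearLayout, as it is in the platform layout of that era):

        import android.app.Activity;
        import android.app.AlertDialog;
        import android.content.DialogInterface;
        import android.os.Bundle;
        import android.widget.Button;
        import android.widget.LinearLayout;

        public class DialogDemoActivity extends Activity {
            @Override
            protected void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);

                AlertDialog dialog = new AlertDialog.Builder(this)
                        .setTitle("title")
                        .setMessage("Message")
                        .setPositiveButton("Yes", new DialogInterface.OnClickListener() {
                            @Override
                            public void onClick(DialogInterface d, int which) { /* no-op */ }
                        })
                        .create();

                // The buttons exist only after the dialog has been shown.
                dialog.show();

                Button yesButton = dialog.getButton(AlertDialog.BUTTON_POSITIVE);
                yesButton.setLayoutParams(new LinearLayout.LayoutParams(
                        LinearLayout.LayoutParams.MATCH_PARENT,
                        LinearLayout.LayoutParams.WRAP_CONTENT,
                        1f));
            }
        }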

    Read the article

  • How to get the action argument of a wp-login.php request?

    - by Bruno De Barros
    I am trying to integrate my custom user system with WordPress, and I recently asked a question on how to redirect requests for wp-login.php to my own login/registration page. While working on the pluggable functions, I realized that a request to wp-login.php can be for login, registration, or logout; this is set in the action argument of the request. What I am trying to figure out is how to get this action argument, so I can redirect the request to the right custom page. Is there any way of doing this? Thank you in advance.

    Read the article

  • How can I profile a subroutine without using modules?

    - by Zaid
    I'm tempted to relabel this question 'Look at this brick. What type of house does it belong to?' Here's the situation: I've effectively been asked to profile some subroutines while having access to neither profilers (not even Devel::DProf) nor Time::HiRes. The purpose of this exercise is to 'locate' bottlenecks. At the moment, I'm sprinkling print statements at the beginning and end of each sub that log entries and exits to file, along with the result of the time function. Not ideal, but it's the best I can do given the circumstances. At the very least it'll allow me to see how many times each sub is called. The code is running under Unix. The closest thing I can see to what I need is perlfaq8, but that doesn't seem to help (I don't know how to make a syscall, and I'm wondering whether it would affect the code timing unpredictably). Not your typical everyday SO question...

    Read the article

  • Image loader can't load my live image URL

    - by Bindhu
    In my application I need to load images into a list view. When I use a local (IP-based) URL there is no problem and all images load properly, but when I use the live URL the images do not load. My image loader class:

        public class ImageLoader {
            MemoryCache memoryCache = new MemoryCache();
            FileCache fileCache;
            private Map<ImageView, String> imageViews = Collections
                    .synchronizedMap(new WeakHashMap<ImageView, String>());
            ExecutorService executorService;

            public ImageLoader(Context context) {
                fileCache = new FileCache(context);
                executorService = Executors.newFixedThreadPool(5);
            }

            final int stub_id = R.drawable.appointeesample;

            public void DisplayImage(String url, ImageView imageView) {
                imageViews.put(imageView, url);
                Bitmap bitmap = memoryCache.get(url);
                if (bitmap != null)
                    imageView.setImageBitmap(bitmap);
                else {
                    Log.d("stub", "stub" + stub_id);
                    queuePhoto(url, imageView);
                    imageView.setImageResource(stub_id);
                }
            }

            private void queuePhoto(String url, ImageView imageView) {
                PhotoToLoad p = new PhotoToLoad(url, imageView);
                executorService.submit(new PhotosLoader(p));
            }

            private Bitmap getBitmap(String url) {
                File f = fileCache.getFile(url);

                // from SD cache
                Bitmap b = decodeFile(f);
                if (b != null)
                    return b;

                // from web
                try {
                    Bitmap bitmap = null;
                    URL imageUrl = new URL(url);
                    HttpURLConnection conn = (HttpURLConnection) imageUrl
                            .openConnection();
                    conn.setConnectTimeout(30000);
                    conn.setReadTimeout(30000);
                    conn.setInstanceFollowRedirects(true);
                    InputStream is = conn.getInputStream();
                    BufferedInputStream bis = new BufferedInputStream(is, 81960);
                    BitmapFactory.Options opts = new BitmapFactory.Options();
                    opts.inJustDecodeBounds = true;
                    OutputStream os = new FileOutputStream(f);
                    Utils.CopyStream(bis, os);
                    os.close();
                    bitmap = decodeFile(f);
                    Log.d("bitmap", "Bit map" + bitmap);
                    return bitmap;
                } catch (Exception ex) {
                    ex.printStackTrace();
                    return null;
                }
            }

            // decodes image and scales it to reduce memory consumption
            private Bitmap decodeFile(File f) {
                try {
                    try {
                        BitmapFactory.Options o = new BitmapFactory.Options();
                        o.inJustDecodeBounds = true;
                        BitmapFactory.decodeStream(new FileInputStream(f), null, o);
                        final int REQUIRED_SIZE = 200;
                        int scale = 1;
                        while (o.outWidth / scale / 2 >= REQUIRED_SIZE
                                && o.outHeight / scale / 2 >= REQUIRED_SIZE)
                            scale *= 2;
                        BitmapFactory.Options o2 = new BitmapFactory.Options();
                        o2.inSampleSize = scale;
                        return BitmapFactory.decodeStream(new FileInputStream(f), null, o2);
                    } catch (FileNotFoundException e) {
                    } finally {
                        System.gc();
                    }
                    return null;
                } catch (Exception e) {
                }
                return null;
            }

            // Task for the queue
            private class PhotoToLoad {
                public String url;
                public ImageView imageView;

                public PhotoToLoad(String u, ImageView i) {
                    url = u;
                    imageView = i;
                }
            }

            class PhotosLoader implements Runnable {
                PhotoToLoad photoToLoad;

                PhotosLoader(PhotoToLoad photoToLoad) {
                    this.photoToLoad = photoToLoad;
                }

                @Override
                public void run() {
                    if (imageViewReused(photoToLoad))
                        return;
                    Bitmap bmp = getBitmap(photoToLoad.url);
                    memoryCache.put(photoToLoad.url, bmp);
                    if (imageViewReused(photoToLoad))
                        return;
                    BitmapDisplayer bd = new BitmapDisplayer(bmp, photoToLoad);
                    Activity a = (Activity) photoToLoad.imageView.getContext();
                    a.runOnUiThread(bd);
                }
            }

            boolean imageViewReused(PhotoToLoad photoToLoad) {
                String tag = imageViews.get(photoToLoad.imageView);
                if (tag == null || !tag.equals(photoToLoad.url))
                    return true;
                return false;
            }

            // Used to display bitmap in the UI thread
            class BitmapDisplayer implements Runnable {
                Bitmap bitmap;
                PhotoToLoad photoToLoad;

                public BitmapDisplayer(Bitmap b, PhotoToLoad p) {
                    bitmap = b;
                    photoToLoad = p;
                }

                public void run() {
                    if (imageViewReused(photoToLoad))
                        return;
                    if (bitmap != null)
                        photoToLoad.imageView.setImageBitmap(bitmap);
                    else
                        photoToLoad.imageView.setImageResource(stub_id);
                }
            }

            public void clearCache() {
                memoryCache.clear();
                fileCache.clear();
            }
        }

    My live image URL, for example: https://goappointed.com/images_upload/3330Torana_Logo.JPG I have searched Google but no solution is working. Thanks a lot in advance.
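
    One thing worth checking here (a hypothesis, not a confirmed diagnosis): the live URL is https, and on older Android versions HttpURLConnection will not follow a redirect that hops between http and https, and it rejects certificates it cannot validate, so conn.getInputStream() throws and getBitmap() silently returns null through the catch block. The stack trace printed there is the place to look. A minimal, self-contained sketch that surfaces the response code and follows one cross-protocol redirect by hand:

        import java.io.BufferedInputStream;
        import java.io.InputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        import android.graphics.Bitmap;
        import android.graphics.BitmapFactory;
        import android.util.Log;

        public final class RemoteImage {

            // Sketch only: fetch a bitmap over http/https, logging the status code
            // and following a single redirect manually, since HttpURLConnection
            // does not redirect between http and https on its own.
            public static Bitmap fetch(String url) {
                try {
                    HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
                    conn.setConnectTimeout(30000);
                    conn.setReadTimeout(30000);

                    int code = conn.getResponseCode();
                    Log.d("RemoteImage", "HTTP " + code + " for " + url);
                    if (code == HttpURLConnection.HTTP_MOVED_PERM
                            || code == HttpURLConnection.HTTP_MOVED_TEMP) {
                        String location = conn.getHeaderField("Location");
                        conn.disconnect();
                        conn = (HttpURLConnection) new URL(location).openConnection();
                    }

                    InputStream in = new BufferedInputStream(conn.getInputStream(), 8192);
                    try {
                        return BitmapFactory.decodeStream(in);
                    } finally {
                        in.close();
                    }
                } catch (Exception e) {
                    Log.e("RemoteImage", "failed to load " + url, e);
                    return null;
                }
            }
        }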

    Read the article

  • IE sending through multiple unrequested requests for parent URL

    - by andyjeffries
    We have a rather large/complex web site (so it's not easy to extract portions of it). The problem we're having is that IE6/7/8 (IE6 being the worst) send multiple requests for the parent of a URL (which then fail, but that is a separate issue). As an example, if you visited (dummy domain name, obviously): http://www.example.com/view/event/123/456/78 we'd see 8-20 requests for http://www.example.com/view/event/123/456/ appear in the Apache log. We've tried disabling all plugins (including toolbars, Flash, etc.) and disabling scripting (e.g. JavaScript), and it still sends at least one erroneous request. We've viewed the requests through ieHttpHeaders, so the browser is aware of them (i.e. it's not prefetching by the network's proxy). Any ideas? What else could cause this?

    Read the article

  • Data 'logger' turning off when phone goes into standby mode

    - by Marko Järvenpää
    I'm creating a data logger which logs the GPS data and the sensor data of the phone. I've just hit a strange problem: if the phone is not touched for a few minutes it goes into standby mode (the screen goes black), and that causes the logger to stop working. Actually it is the file writing in the logger that stops working. The GPS resumes fine after the screen comes back on, but when I check the log I created, it only shows saved points for the first few minutes. Does anyone have an idea what is causing this?
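
    The usual explanation is that the CPU is allowed to sleep once the screen turns off, which suspends background threads and file writes. The common remedies are to run the logger in a (foreground) service and to hold a partial wake lock while logging. A minimal sketch of the wake-lock part, assuming the WAKE_LOCK permission is declared in the manifest:

        import android.content.Context;
        import android.os.PowerManager;

        public class LoggerWakeLock {
            private final PowerManager.WakeLock wakeLock;

            public LoggerWakeLock(Context context) {
                PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
                // PARTIAL_WAKE_LOCK keeps the CPU running while the screen is off.
                wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "DataLogger");
            }

            public void start() {
                wakeLock.acquire();        // call when logging begins
            }

            public void stop() {
                if (wakeLock.isHeld()) {
                    wakeLock.release();    // call when logging ends, to save battery
                }
            }
        }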

    Read the article

  • Both Flash and Alternate Displaying In Firefox, Nothing but Errors in IE (sIFR)

    - by Tom Davison
    Hi guys, I've just installed and setup sIFR on my Magento store. An example page can be seen here: http://www.mint-creative.co.uk/shop And you can see it loads the heading brilliantly, but still displays the alternate text-only headline. I've quadruple checked the .css files and everything's loading ok I think. Also, in IE it doesn't display at all and every .css and .js file has an error according to the error log in IE8. Any help on this would be great as it's an urgent project! Cheers.

    Read the article

  • How do I send the CKEditor config using a jQuery service?

    - by syn4k
    My service is built and it sends the config variable to my JS file:

        [['SpellChecker','-','Undo','Redo','-','Bold','Italic','Underline','NumberedList','BulletedList']]

    The above is assigned in my JavaScript like so:

        var config = "<?= stripslashes($_REQUEST['config']) ?>";

    I can alert out the config just fine: console.log(config) does show the correct data... However, an error is thrown in my console!:

        v is undefined
        [Break On This Error] var u=n.toolbox.toolbars,v=n.config.to...aximize','ShowBlocks','-','About']];

    If I comment out the config line (//CKEDITOR.config.toolbar = config;), everything works fine, but then the configuration doesn't exist, of course...

    Read the article

  • SQL Profiler showing high activity

    - by Wong Chi
    I am running my application locally -- i.e. no external traffic and a very low number of queries, fully under my control. I see tons of 'Audit Login' and 'Audit Logout' events. What are these, and where are they actually stored (i.e. where is this audit log)? Are they a hint of a problem with connections? I have only a simple connection string within my app and thought that the connection would remain active throughout the operation of my app (i.e. a single login at launch, and then a single logout when terminating).

    Read the article

  • Why is hierarchyviewer not working for Samsung Galaxy TAB 7.0?

    - by FireAndIce
    I've used hierarchyviewer before, but on the Android emulator, where it works absolutely fine. However, it does not work with the Samsung Galaxy TAB 7.0 running Android 2.3.4. This is the log that I get:

        11:04:22 E/hierarchyviewer: Unable to get view server version from device 30359964881B00EC
        11:04:22 E/hierarchyviewer: Unable to get view server protocol version from device 30359964881B00EC
        11:04:24 E/hierarchyviewer: Unable to debug device 30359964881B00EC
        11:05:05 E/hierarchyviewer: Unable to get view server version from device 30359964881B00EC
        11:05:05 E/hierarchyviewer: Unable to get view server protocol version from device 30359964881B00EC
        11:05:07 E/hierarchyviewer: Unable to debug device 30359964881B00EC
        11:09:38 E/hierarchyviewer: Unable to get view server version from device 30359964881B00EC
        11:09:38 E/hierarchyviewer: Unable to get view server protocol version from device 30359964881B00EC
        11:09:40 E/hierarchyviewer: Unable to debug device 30359964881B00EC

    I'm also not using hierarchyviewer in debug mode, just running the application. Thanks.

    Read the article

  • newline in Rackspace Cron Job email

    - by senloe
    I'm running backups against multiple databases hosted at Rackspace, and that part works fine. The problem I'm running into is with the results email. I'm using Response.Write to write a message to the web page, which is used for logging and is also consumed by the results mail sent out by the job. The problem is I can't seem to get newlines to appear between log messages. The log file stored on the server is correct, but only the first newline shows up in the email. The mail is in plain-text format, so I tried using "\n" and System.Environment.NewLine, and neither works. I also tried using <br/> with no luck. Does anybody have any ideas?

    Read the article

  • Can I use part of MD5 hash for data identification?

    - by sharptooth
    I use an MD5 hash for identifying files of unknown origin. There is no attacker here, so I don't care that MD5 has been broken and that one can intentionally generate collisions. My problem is that I need to provide logging so that different problems are easier to diagnose. If I log every hash as a full hex string, that's too long, inconvenient, and ugly, so I'd like to shorten the hash string. Now, I know that just taking a small part of a GUID is a very bad idea - GUIDs are designed to be unique, but parts of them are not. Is the same true for MD5? Can I take, say, the first 4 bytes of the MD5 and assume that I only get a higher collision probability due to the reduced number of bytes compared to the original hash?
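
    For what it's worth, the usual reasoning is: MD5 output is well mixed, so any 4-byte slice behaves like a uniform 32-bit value, and the collision risk follows the birthday bound - roughly a 50% chance of at least one collision once about 2^16 (~65,000) distinct files have been logged, which may or may not be acceptable for a log identifier. A small Java sketch of the idea (Java here only for illustration; the original language is not specified):

        import java.security.MessageDigest;
        import java.security.NoSuchAlgorithmException;

        public final class ShortHash {

            // Hex of the first 4 bytes of the MD5 digest, for compact log lines.
            // A 32-bit tag reaches ~50% collision probability around 2^16
            // distinct inputs (birthday bound), so treat it as a loose identifier.
            public static String shortMd5(byte[] data) {
                try {
                    byte[] digest = MessageDigest.getInstance("MD5").digest(data);
                    StringBuilder hex = new StringBuilder();
                    for (int i = 0; i < 4; i++) {
                        hex.append(String.format("%02x", digest[i]));
                    }
                    return hex.toString();
                } catch (NoSuchAlgorithmException e) {
                    throw new IllegalStateException("MD5 not available", e);
                }
            }
        }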

    Read the article

  • How to update a table in database using LINQ in F#?

    - by sudaly
    I have seen plenty of examples of how to query the database, but nothing on how to update records. Below is the simple code that I wrote to retrieve a table, but can someone explain how I can modify a field, say lastActiveDate, and update the table in the database? Thank you, suday

        open System
        open Microsoft.FSharp.Linq

        let connString = "Server=localhost;Database=myDb;Trusted_Connection=True;"
        let db = new MyDb(connString)
        db.Log <- System.Console.Out

        let res =
            Query.query <@ seq { for users in db.userAccounts do yield users } @>
            |> List.ofSeq

        printfn "Total users: %d" res.Length

    Read the article

  • Problem with session based login after moving relevant files to site root

    - by YsoL8
    Hello. I have a site which I have been testing in a sub-folder of my client's site root. I had no login problems during testing, but then I moved the new site files from the sub-directory to the main site root, and now I'm losing my logged-in state after almost every page refresh in secure areas. I am running a $_SESSION-based login system that refreshes the session id on every page load, with a comparison value stored in the MySQL database. Does anyone have suggestions for what could be causing this problem?

    Read the article

  • jQuery hover() event on div element within a button element

    - by jakeisonline
    I can't seem to get jQuery to notice the div within the following markup:

        <button class="button submit positive right" id="omnisubmit" type="submit">
            <div class="label">Submit</div>
            <div class="controller">&nbsp;</div>
        </button>

    And here is the jQuery I'm currently using:

        $("button#omnisubmit div.controller").hover(function () {
            console.log("Hover...");
        });

    However, jQuery doesn't seem to pick up when the mouse is hovering over that div; $("button#omnisubmit div.controller").hover( works correctly, of course. I have a feeling it's because putting divs inside buttons may not be standard HTML?

    Read the article

  • cron stops running when processing multiple items

    - by James
    Cron stops running (with no visible error) after a few hours when I add more than 5 jobs to my crontab. Each job runs every minute (it polls a webpage for information, which takes 1 second). I tried putting all of my PHP jobs in a shell script and calling that shell script instead, but the same problem occurs. Cron stops running, with no error in the log file and no error email sent out either. Has anyone encountered this before? Where/how can I debug this?

    Read the article

  • Who deleted my SQL table rows?

    - by vikasde
    I have an SQL table from which data is being deleted, and nobody knows how. I added a trigger and now know the time of the deletions, but no jobs are running that would delete the data. I also added a trigger that fires whenever rows are deleted from this table, and it inserts the deleted rows and SYSTEM_USER into a log table, but that doesn't help much. Is there anything better I can do to find out who is deleting the data and how? Would it be possible to get the server id or something? Thanks for any advice. Sorry: I am using SQL Server 2000.

    Read the article

  • Android: Read contents of a URL (content missing in result)

    - by josnidhin
    I have the following code that reads the content of a URL:

        public static String DownloadText(String url) {
            StringBuffer result = new StringBuffer();
            try {
                URL jsonUrl = new URL(url);
                InputStreamReader isr = new InputStreamReader(jsonUrl.openStream());
                BufferedReader in = new BufferedReader(isr);
                String inputLine;
                while ((inputLine = in.readLine()) != null) {
                    result.append(inputLine);
                }
            } catch (Exception ex) {
                result = new StringBuffer("TIMEOUT");
                Log.e(Util.AppName, ex.toString());
            }
            in.close();
            isr.close();
            return result.toString();
        }

    The problem is that content is missing after 4065 characters in the returned result. Can someone help me solve this problem? Note: the URL I am trying to read returns a JSON response, so everything is on one line; I think that's why some content is missing.
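
    One possibility worth ruling out (a guess, since it depends on how the result is being inspected): the download may actually be complete, because Android's logcat and many console views truncate a single message at roughly 4,000 characters, so a long one-line JSON string only looks cut off when printed. Logging the length and the string in chunks makes it easy to tell:

        import android.util.Log;

        public final class LogChunks {

            // Logcat truncates an individual message at roughly 4k characters, so
            // long strings can appear cut off even when they were read in full.
            // Logging the total length plus the string in 4,000-character chunks
            // shows whether the download itself is complete.
            public static void logLong(String tag, String text) {
                Log.d(tag, "length=" + text.length());
                for (int start = 0; start < text.length(); start += 4000) {
                    int end = Math.min(start + 4000, text.length());
                    Log.d(tag, text.substring(start, end));
                }
            }
        }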

    Read the article

  • Python - Memory Leak

    - by Dave
    I'm working on solving a memory leak in my Python application. Here's the thing - it really only appears to happen on Windows Server 2008 (not R2) but not earlier versions of Windows, and it also doesn't look like it's happening on Linux (although I haven't done nearly as much testing on Linux). To troubleshoot it, I set up debugging on the garbage collector: gc.set_debug(gc.DEBUG_UNCOLLECTABLE | gc.DEBUG_INSTANCES | gc.DEBUG_OBJECTS) Then, periodically, I log the contents of gc.garbage. Thing is, gc.garbage is always empty, yet my memory usage goes up and up and up. Very puzzling.

    Read the article

  • Search Server 2008 Express working with WSS 3.0 (error when crawling second web application (website) s

    - by tberube
    My Search Server 2008 Express installation crawls 3 SharePoint servers and 1 Windows file server. The file server and 2 of the SharePoint servers crawl all content, master and sub-sites, just fine. On the 3rd SharePoint server the default site crawls just fine... The second web application (content database) crawls the top site and all sub-sites, BUT the sub-sites get the below error in the crawl log: Deleted by the gatherer (The start address or content source that contained this item was deleted and hence this item was deleted.) I have checked the security rights (OK). I have checked the setting to crawl a SharePoint site (if that were wrong, the top site would not work)... ???? Last part: I can crawl 3 other SharePoint sites (web applications/other content databases) on the same server... It seems to be just this one site...

    Read the article

  • Using locks inside a loop

    - by Xaqron
        // Member variable
        private readonly object _syncLock = new object();

        // Now inside a static method
        foreach (var lazyObject in plugins)
        {
            if ((string)lazyObject.Metadata["key"] = "something")
            {
                lock (_syncLock)
                {
                    if (!lazyObject.IsValueCreated)
                        lazyObject.value.DoSomething();
                }
                return lazyObject.value;
            }
        }

    Here I need synchronized access per loop iteration. Many threads iterate this loop and, based on the key they are looking for, a lazy instance is created and returned. lazyObject should not be created more than once. Although the Lazy class is meant for exactly that, and despite the lock used, under heavy threading I get more than one instance created (I track this with Interlocked.Increment on a volatile shared int and log it somewhere). The problem is that I don't have access to the definition of Lazy, and MEF defines how the Lazy class creates objects. My questions: 1) Why doesn't the lock work? 2) Should I use an array of locks instead of one lock to improve performance?
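
    Two side observations, offered tentatively: as posted, the `=` inside the if would not compile in C# (an if condition must be a bool), so presumably that is a transcription slip for `==`; and a single shared lock only prevents double creation if every thread that can create the value goes through that same lock object. For comparison only - this is a Java sketch of the underlying pattern, not a fix for the MEF code - the "at most one instance per key" goal can also be expressed as an atomic check-and-create on a concurrent map:

        import java.util.concurrent.ConcurrentHashMap;
        import java.util.function.Function;

        // One instance per key, created at most once even under heavy
        // concurrency: the map performs the check-and-create atomically,
        // so no hand-rolled lock is needed around the creation step.
        public final class PluginRegistry<T> {
            private final ConcurrentHashMap<String, T> instances = new ConcurrentHashMap<>();
            private final Function<String, T> factory;

            public PluginRegistry(Function<String, T> factory) {
                this.factory = factory;
            }

            public T get(String key) {
                // computeIfAbsent invokes the factory at most once per key.
                return instances.computeIfAbsent(key, factory);
            }
        }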

    Read the article

  • How does a proxy server work with TCP/HTTP connections?

    - by Vivek
    Since I am a beginner in the world of internet/networking, I always get tangled up in these kinds of doubts while programming ;) ... My questions are: While working behind a proxy, how do my requests and responses travel? I mean, my request headers and data first reach the proxy server, then the proxy server sends them (the same headers and data) to the corresponding server, and the server responds with a response header and body to the proxy server, which then sends it on to my computer. Right? While using WebSockets we upgrade our HTTP connection to TCP. What is happening at the proxy server at that moment? Does the proxy server also upgrade its connection to plain TCP? After such TCP connections are opened, is the proxy server able to track/log those socket messages? And most importantly, is the proxy server transparent, or does it act like the original server in front of the client? Thanks for any answers or helpful links in advance.
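
    A rough sketch of the second part, for intuition rather than as a definitive answer: for plain HTTP a proxy does see (and may log or rewrite) each request and response, but for end-to-end TCP traffic - which is what HTTPS and many WebSocket setups use - the client first asks the proxy to open a tunnel with an HTTP CONNECT request, and after the proxy answers "200 Connection established" it simply relays bytes in both directions without interpreting them. The Java snippet below just performs that CONNECT handshake by hand (the proxy and target host names are placeholders):

        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.Socket;
        import java.nio.charset.StandardCharsets;

        public final class ConnectTunnelDemo {
            public static void main(String[] args) throws Exception {
                // Placeholder proxy address; a real one would come from configuration.
                Socket proxy = new Socket("proxy.example.com", 3128);

                // Ask the proxy to open a raw tunnel to the target host and port.
                OutputStream out = proxy.getOutputStream();
                out.write(("CONNECT echo.example.org:443 HTTP/1.1\r\n"
                        + "Host: echo.example.org:443\r\n\r\n")
                        .getBytes(StandardCharsets.US_ASCII));
                out.flush();

                // Read the proxy's status line; after a 2xx reply the socket carries
                // the end-to-end byte stream (e.g. a TLS or WebSocket handshake).
                InputStream in = proxy.getInputStream();
                StringBuilder status = new StringBuilder();
                int b;
                while ((b = in.read()) != -1 && b != '\n') {
                    status.append((char) b);
                }
                System.out.println("Proxy replied: " + status.toString().trim());
                proxy.close();
            }
        }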

    Read the article

  • Implementing a server side push for a small number of clients

    - by Helper Method
    For a web application I am working on, I have the following requirements: clients need to be able to log in via a web browser; after logging in, they will be able to change configurations (normal request/response) and will be able to receive alarms sent by the server (a server-side push). Now, the question is how to implement the alarms. I first thought of using some long-polling approach (Comet), but as the number of clients will definitely be limited to 5-10, I'm now thinking of going with a simpler approach. What options do I have? Would it be okay to just let the clients poll the server? Important aspects are: alarms should be delivered in (nearly) real time, and alarms must not get lost (a lost alarm could cause harm to real people).
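
    With that few clients, plain polling every second or two is usually a workable answer; the "must not get lost" requirement is better handled by the protocol than by the transport, e.g. each client asks for "all alarms after the last id I have processed", so a missed or delayed poll postpones delivery instead of dropping it. A minimal client-side sketch in Java (the endpoint URL and the one-alarm-per-line response format are assumptions, not a known API):

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.nio.charset.StandardCharsets;

        public final class AlarmPoller {
            private long lastSeenId = 0;

            // Poll once: fetch every alarm newer than the last one processed, so
            // nothing is lost if a poll is skipped; call this from a timer loop.
            public void pollOnce() throws Exception {
                URL url = new URL("https://server.example.com/alarms?after=" + lastSeenId);
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setConnectTimeout(5000);
                conn.setReadTimeout(5000);

                BufferedReader in = new BufferedReader(
                        new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8));
                try {
                    String line;
                    while ((line = in.readLine()) != null) {
                        // Assumed line format: "<id>\t<message>"
                        String[] parts = line.split("\t", 2);
                        long id = Long.parseLong(parts[0]);
                        handleAlarm(id, parts.length > 1 ? parts[1] : "");
                        lastSeenId = Math.max(lastSeenId, id);
                    }
                } finally {
                    in.close();
                }
            }

            private void handleAlarm(long id, String message) {
                System.out.println("alarm " + id + ": " + message);
            }
        }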

    Read the article

  • [Ext.tree.TreePanel] getNodeById does not work

    - by Moon
    Hi, I have an Ext TreePanel loaded with JSON:

        var tree = new Ext.tree.TreePanel({
            renderTo: 'tree-container',
            title: 'Category',
            height: 300,
            width: 400,
            useArrows: true,
            autoScroll: true,
            animate: true,
            enableDD: true,
            containerScroll: true,
            rootVisible: false,
            frame: true,
            root: {
                text: 'Category',
                draggable: false,
                id: '0'
            },
            // auto create TreeLoader
            dataUrl: $("#web").val() + "/category/index/get-nodes",
            listeners: {
                'checkchange': function(node, checked){
                    if(checked){
                        categoryManager.add(node.id);
                        //node.getUI().addClass('complete');
                    }else{
                        categoryManager.remove(node.id);
                        // node.getUI().removeClass('complete');
                    }
                }
            }
        });

    dataUrl loads the following JSON code:

        [{"text":"Code Snippet","id":"1","cls":"folder","checked":false,"children":[{"text":"PHP","id":"3","cls":"file","checked":false,"children":[]},{"text":"Javascript","id":"4","cls":"file","checked":false,"children":[]}]}]

    When I try to find a node with console.log( tree.getNodeByid(3) ), it shows that it is undefined. Do I have a problem with my code?

    Read the article

  • [WordPress 3.1.3] Screen option is disabled when a plugin is activated

    - by RNorbe
    I'm pretty new to WordPress. I was assigned to create a custom plugin for one of our projects here. The plugin worked as expected and there is no problem activating/deactivating it. But when I was exploring the admin panel, I noticed that the screen options tab is disabled. I read on a blog somewhere that you should deactivate the plugins one by one to check which one caused this. I did just that and found out that the custom plugin I created was the cause. My question is: is there a way to check what caused this? Some log file I can look into? There is no error message or warning when I activate the plugin, and it gives the required output. This is my first plugin, so any advice will be helpful. By the way, this plugin displays comments (most recent shown first) in a widget, with prev/next navigation to go through the rest of the comments. Thanks, RNorbe

    Read the article
