Search Results

Search found 2328 results on 94 pages for 'callback'.

Page 26/94

  • JQuery UI Autocomplete Syntax

    - by djs22
    Can someone help me understand the following code? I found it here. It takes advantage of the JQuery UI Autocomplete with a remote source. I've commented the code as best I can and a more precise question follows it. $( "#city" ).autocomplete({ source: function( request, response ) { //request is an object which contains the user input so far // response is a callback expecting an argument with the values to autocomplete with $.ajax({ url: "http://ws.geonames.org/searchJSON", //where the script is located dataType: "jsonp", //type of data we send the script data: { //what data do we send the script featureClass: "P", style: "full", maxRows: 12, name_startsWith: request.term }, success: function( data ) { //CONFUSED! response( $.map( data.geonames, function( item ) { return { label: item.name+(item.adminName1 ? ","+item.adminName1:"")+","+item.countryName, value: item.name } } ) ); } }); } }); As you can see, I don't understand the use of the success function and the response callback. I know the success function literal is an AJAX option which is called when the AJAX query returns. In this case, it seems to encapsulate a call to the response callback? Which is defined where? I thought by definition of a callback, it should be called on its own? Thanks!
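    For context, the response argument is a function that the autocomplete widget itself defines and passes into source; calling it hands the suggestions back to the widget. A minimal sketch of the same pattern with a local array (the city list and the prefix filtering here are made up for illustration, not part of the original question):

        // response() is supplied by the jQuery UI widget; source() just calls it
        // with the array of suggestions, either synchronously or from an AJAX handler.
        $("#city").autocomplete({
            source: function (request, response) {
                // request.term is what the user has typed so far
                var cities = ["Berlin", "Bern", "Boston", "Bordeaux"]; // hypothetical data
                var matches = $.grep(cities, function (city) {
                    return city.toLowerCase().indexOf(request.term.toLowerCase()) === 0;
                });
                response(matches); // hand the suggestions back to the widget
            }
        });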

    Read the article

  • How to refactor this Javascript anonymous function?

    - by HeavyWave
    We have this anonymous function in our code, which is part of the jQuery's Ajax object parameters and which uses some variables from the function it is called from. this.invoke = function(method, data, callback, error, bare) { $.ajax({ success: function(res) { if (!callback) return; var result = ""; if (res != null && res.length != 0) var result = JSON2.parse(res); if (bare) { callback(result); return; } for (var property in result) { callback(result[property]); break; } } }); } I have omitted the extra code, but you get the idea. The code works perfectly fine, but it leaks 4 Kbs on each call in IE, so I want to refactor it to turn the anonymous function into a named one, like this.onSuccess = function(res) { .. }. The problem is that this function uses variables from this.invoke(..), so I cannot just take it outside of its body. How do I correctly refactor this code, so that it does not use anonymous functions and parent function variables?
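    One way to end up with a named function is to have a small factory build the handler from the values it actually needs, so the success callback no longer closes over the whole body of invoke(). A rough sketch under that assumption (JSON2 and the omitted $.ajax options are taken from the question as-is):

        // Hypothetical refactor: a named factory returns the success handler,
        // capturing only callback and bare rather than invoke()'s entire scope.
        function makeOnSuccess(callback, bare) {
            return function onSuccess(res) {
                if (!callback) return;
                var result = "";
                if (res != null && res.length != 0) result = JSON2.parse(res);
                if (bare) { callback(result); return; }
                for (var property in result) { callback(result[property]); break; }
            };
        }

        this.invoke = function (method, data, callback, error, bare) {
            $.ajax({
                // ...other options omitted, as in the question...
                success: makeOnSuccess(callback, bare)
            });
        };

    Whether this actually removes the IE leak depends on what else the omitted options capture.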

    Read the article

  • XMLHttpRequest leak in javascript. please help.

    - by Raja
    Hi everyone, below is my JavaScript code snippet. It's not running as expected; please help me with this. <script type="text/javascript"> function getCurrentLocation() { console.log("inside location"); navigator.geolocation.getCurrentPosition(function(position) { insert_coord(new google.maps.LatLng(position.coords.latitude,position.coords.longitude)); }); } function insert_coord(loc) { var request = new XMLHttpRequest(); request.open("POST","start.php",true); request.onreadystatechange = function() { callback(request); }; request.setRequestHeader("Content-Type","application/x-www-form-urlencoded"); request.send("lat=" + encodeURIComponent(loc.lat()) + "&lng=" + encodeURIComponent(loc.lng())); return request; } function callback(req) { console.log("inside callback"); if(req.readyState == 4) if(req.status == 200) { document.getElementById("scratch").innerHTML = "callback success"; window.setTimeout("getCurrentLocation()",5000); } } getCurrentLocation(); //called on body load </script> What I'm trying to achieve is to send my current location to the PHP page every 5 seconds or so. I can see a few of the coordinates in my database, but after some time it gets weird. Firebug shows very strange logs, like simultaneous POSTs at irregular intervals. Here's the Firebug screenshot: Is there a leak in the program? Please help.
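    A pattern often used for this kind of polling is to schedule the next run only after the previous request has fully completed, and to drop the handler reference once it has fired, so each cycle holds on to exactly one XMLHttpRequest. A sketch along those lines, reusing the names from the question (not a verified fix for the behaviour shown in the screenshot):

        function insert_coord(loc) {
            var request = new XMLHttpRequest();
            request.open("POST", "start.php", true);
            request.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
            request.onreadystatechange = function () {
                if (request.readyState !== 4) return;
                if (request.status === 200) {
                    document.getElementById("scratch").innerHTML = "callback success";
                }
                request.onreadystatechange = null; // let the closure be collected
                request = null;
                window.setTimeout(getCurrentLocation, 5000); // pass the function, not a string
            };
            request.send("lat=" + encodeURIComponent(loc.lat()) +
                         "&lng=" + encodeURIComponent(loc.lng()));
        }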

    Read the article

  • Why does jQuery.ajax() add a parameter to the url?

    - by FK82
    Hi, I have a data fetching method that uses jQuery.ajax() to fetch xml files. /* */data: function() { /* debug */try { var url = arguments[0] ; var type = arguments[1] ; var scope = arguments[2] ; var callback = arguments[3] ; var self = this ; if(this.cache[url]) { callback(this.cache[url]) ; } else if(!this.cache[url]) { $.ajax({ type: "GET" , url: url , dataType: type , cache: false , success: function(data) { if(type == "text/xml") { var myJson = AUX.json ; var jsonString = myJson.build(data,scope,null) ; var jsonObject = $.parseJSON(jsonString) ; self.cache[url] = jsonObject ; callback(url) ; } else if(type == "json") { self.cache[url] = data ; callback(url) ; } } , error: function() { throw "Ajax call failed." ; } }) ; } /* debug */} catch(e) { /* debug */ alert("- caller: signTutor.data\n- " + e) ; /* debug */} } , My problem is: jQuery somehow adds a parameter (?_=1272708280072) to the url if there are escaped (hexadecimal notation) or unescaped utf-8 characters outside of the ASCII range -- i believe -- in the file name. It all works well if the file name does not contain characters in that range. Type is set to xml so there should not be a confusion of types. Headers of the xml files are also set adequately. I can see from the console that jQuery throws an error, but I'm not sure as to where the problem really is. Probably a problem with file name formatting, but I did not find any resources on the web as to AJAX file name specifications. Any ideas? Thanks for you help!
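    For what it's worth, the ?_=1272708280072 parameter is jQuery's cache-busting timestamp: it is appended whenever cache: false is in effect (and by default for script/jsonp requests). A minimal sketch of the same call with the cache-buster left off, assuming caching is acceptable here; note also that jQuery's dataType values are "xml"/"json" rather than MIME types such as "text/xml":

        $.ajax({
            type: "GET",
            url: url,
            dataType: "xml",      // jQuery dataType, not the MIME type
            // cache: false,      // this option is what appends ?_=<timestamp>
            success: function (data) { callback(data); },
            error: function () { throw "Ajax call failed."; }
        });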

    Read the article

  • Jquery - custom countdown

    - by matthewsteiner
    So I found this countdown at http://davidwalsh.name/jquery-countdown-plugin, I altered it a little bit: jQuery.fn.countDown = function(settings,to) { settings = jQuery.extend({ duration: 1000, startNumber: $(this).text(), endNumber: 0, callBack: function() { } }, settings); return this.each(function() { //where do we start? if(!to && to != settings.endNumber) { to = settings.startNumber; } //set the countdown to the starting number $(this).text(to); //loopage $(this).animate({ 'fontSize': settings.endFontSize },settings.duration,'',function() { if(to > settings.endNumber + 1) { $(this).text(to - 1).countDown(settings,to - 1); } else { settings.callBack(this); } }); }); }; Then I have this code: $(document).ready(function(){ $('.countdown').countDown({ callBack: function(me){ $(me).text('THIS IS THE TEXT'); } }); }); I don't mind taking everything out of the "animate" loop; I'd prefer that since nothing needs to be animated. (I don't need the font size to change). So everything's working to a point. I have a span with class countdown and whatever is in it when the page is refreshed goes down second by second. However, I need it to be formatted in M:S format. So, my two questions: 1) What can I use instead of animate to take care of the loop yet maintain the callback 2) How (where in the code should I) can I play with the time format? Thanks.
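    Since nothing needs to be animated, one option is to drive the loop with setTimeout and format the remaining seconds as minutes and seconds when writing the text. A sketch of that idea (not the plugin author's code; the format() helper and the one-second interval are assumptions):

        jQuery.fn.countDown = function (settings, to) {
            settings = jQuery.extend({
                startNumber: parseInt($(this).text(), 10),
                endNumber: 0,
                callBack: function () {}
            }, settings);

            function format(totalSeconds) {          // e.g. 125 -> "2:05"
                var m = Math.floor(totalSeconds / 60);
                var s = totalSeconds % 60;
                return m + ":" + (s < 10 ? "0" + s : s);
            }

            return this.each(function () {
                var el = this;
                if (to === undefined) { to = settings.startNumber; }
                $(el).text(format(to));
                if (to > settings.endNumber) {
                    setTimeout(function () { $(el).countDown(settings, to - 1); }, 1000);
                } else {
                    settings.callBack(el);
                }
            });
        };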

    Read the article

  • Json HTTP Module stream issue

    - by Justin
    Hey, I have an HTTP Module that I use to clean up the JSON returned by my web service (see http://www.codeproject.com/KB/webservices/ASPNET_JSONP.aspx?msg=3400287#xx3400287xx for an example of this.) Basically it relates to calling cross-domain JSON web services from javascript. There is this JsonHttpModule which uses a JsonResponseFilter Stream class to write out the JSON and the overloaded Write method is supposed to wrap the name of the callback function around the JSON, otherwise the JSON errors out as needing a label. However, if the JSON is really long, the Write method in the Stream class is called multiple times, causing the callback function to incorrectly get inserted midway through the JSON. Is there a way in the Stream class to wrap the callback function around the stream at the end or to specify that it write all of the JSON in 1 Write method instead of in chunks?? Here's where it calls the JsonResponseFilter in the JsonHttpModule: public void OnReleaseRequestState(object sender, EventArgs e) { HttpApplication app = (HttpApplication)sender; if (!_Apply(app.Context.Request)) return; // apply response filter to conform to JSONP app.Context.Response.Filter = new JsonResponseFilter(app.Context.Response.Filter, app.Context); } Here's the Write method in the JsonResponseFilter Stream class that gets called multiple times: public override void Write(byte[] buffer, int offset, int count) { var b1 = Encoding.UTF8.GetBytes(_context.Request.Params["callback"] + "("); _responseStream.Write(b1, 0, b1.Length); _responseStream.Write(buffer, offset, count); var b2 = Encoding.UTF8.GetBytes(");"); _responseStream.Write(b2, 0, b2.Length); } Thanks for any help! Justin

    Read the article

  • pass object from JS to PHP and back

    - by Radu
    This is something that I think can't be done, or at least can't be done easily. Think of this: you have a button inside a div in HTML; when you click it, you call a PHP function via AJAX. I would like to send the element that started the click event (or any element as a parameter) to PHP and BACK to JS again, in a way like serialize() in PHP, to be able to restore the element in JS. Let me give you a simple example: PHP: function ajaxCall(element){ return element; } JS: callbackFunction(el){ el.color='red'; } HTML: <div id="id_div"> <input type="button" value="click Me" onClick="ajaxCall(this, callbackFunction);" /> </div> So I'm thinking of 3 methods. Method 1: I can give each element in the page an ID, so the call to Ajax would look like this: ajaxCall(this.id, callbackFunction); and the callback function would be: document.getElementById(el).color='red'; This method I think is hard, because in a big page it is hard to keep track of all IDs. Method 2: I think this could be done using XPath. If I can get the exact path of an element, the callback function can evaluate that path to reach the element. This method needs some googling; it is just an idea. Method 3: Modify my AJAX functions so they retain the element that started the event and pass it to the callback function as an argument when something returns from PHP, so my AJAX call would look like this: eval(callbackFunction(argumentsFromPhp, element)); and the callback function would be: callbackFunction(someArgsFromPhp, el){ el.color='red'; // parse someArgsFromPhp } I think the third option is my choice to start this experiment. Does any of you have a better idea how I can accomplish this? Thank you.
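    A sketch of what method 3 might look like, assuming jQuery is available for the AJAX layer (the "ajax.php" URL and the posted data are placeholders): the element never travels to PHP at all; it simply stays in the closure on the JS side and is handed to the callback along with whatever PHP returns.

        function ajaxCall(element, callbackFunction) {
            $.post("ajax.php", { action: "something" }, function (dataFromPhp) {
                // element was kept on the JS side the whole time
                callbackFunction(dataFromPhp, element);
            });
        }

        function callbackFunction(someArgsFromPhp, el) {
            el.style.color = "red";   // style.color rather than color on a DOM element
            // parse someArgsFromPhp here
        }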

    Read the article

  • [PERL Tk] printing Line number in Text widget

    - by ungalnanban
    I use the following code for printing the line number in a Text widget. my $c=0; my $r=0; $txt = $mw->Text( -background => 'white', -width=>400, -height=>300, -selectbackground => 'skyblue', -insertwidth => 5, -borderwidth =>3, -highlightcolor => 'blue', ### after visit -highlightbackground => 'red' , ### default before visit -xscrollcommand => sub { print"CHAT NO :",$c++; }, # Determines the callback used when the Text widget is scrolled horizontally. -yscrollcommand => sub { print"LINR NO:",$r++; }, # Determines the callback used when the Text widget is scrolled vertically. -padx => 5, -pady => 5, )->pack(); The above code prints the line number, and the character number is OK, but when I use it in a Scrolled widget the output is not printed. What is the problem in the following code and how can I solve this? $txt = $mw->Scrolled('Text', -scrollbars => 'se', -background => 'white', -width=>400, -height=>300, -insertwidth => 5, -borderwidth =>3, -highlightcolor => 'blue', ### after visit -highlightbackground => 'red' , ### default before visit -padx => 5, -pady => 5, -xscrollcommand => sub { print"CHAT NO :",$c++; }, # Determines the callback used when the Text widget is scrolled horizontally. -yscrollcommand => sub { print"LINR NO :",$r++; }, # Determines the callback used when the Text widget is scrolled vertically. )->pack();

    Read the article

  • JNI Stream binary data from C++ to Java

    - by Cliff
    I need help passing binary data into Java. I'm trying to use jbytearray but when the data gets into Java it appears corrupt. Can somebody give me a hand? Here's a snippet of example code. First the native C++ side: printf("Building audio array copy\n"); jbyteArray rawAudioCopy = env->NewByteArray(10); jbyte toCopy[10]; printf("Filling audio array copy\n"); char theBytes[10] = {0,1,2,3,4,5,6,7,8,9}; for (int i = 0; i < sizeof(theBytes); i++) { toCopy[i] = theBytes[i]; } env->SetByteArrayRegion(rawAudioCopy,0,10,toCopy); printf("Finding object callback\n"); jmethodID aMethodId = env->GetMethodID(env->GetObjectClass(obj),"handleAudio","([B)V"); if(0==aMethodId) throw MyRuntimeException("Method not found error",99); printf("Invoking the callback\n"); env->CallVoidMethod(obj,aMethodId, &rawAudioCopy); and then the Java callback method: public void handleAudio(byte[] audio){ System.out.println("Audio supplied to Java [" + audio.length + "] bytes"); byte[] expectedAudio = {0,1,2,3,4,5,6,7,8,9}; for (int i = 0; i < audio.length; i++) { if(audio[i]!= expectedAudio[i]) System.err.println("Expected byte " + expectedAudio[i] + " at byte " + i + " but got byte " + audio[i]); else System.out.print('.'); } System.out.println("Audio passed back accordingly!"); } I get the following output when the callback is invoked: library loaded! Audio supplied to Java [-2019659176] bytes Audio passed back accordingly!

    Read the article

  • drupal hook_menu_alter() for adding tabs

    - by EricP
    I want to add some tabs in the "node/%/edit" page from my module called "cssswitch". When I click "Rebuild Menus", the two new tabs are displayed, but they are displayed for ALL nodes when editing them, not just for the node "cssswitch". I want these new tabs to be displayed only when editing a node of type "cssswitch". The other problem is that when I clear all caches, the tabs completely disappear from all edit pages. Below is the code I wrote. function cssswitch_menu_alter(&$items) { $node = menu_get_object(); //print_r($node); //echo $node->type; //exit(); if ($node->type == 'cssswitch') { $items['node/%/edit/schedulenew'] = array( 'title' => 'Schedule1', 'access callback'=>'user_access', 'access arguments'=>array('view cssswitch'), 'page callback' => 'cssswitch_schedule', 'page arguments' => array(1), 'type' => MENU_LOCAL_TASK, 'weight'=>4, ); $items['node/%/edit/schedulenew2'] = array( 'title' => 'Schedule2', 'access callback'=>'user_access', 'access arguments'=>array('view cssswitch'), 'page callback' => 'cssswitch_test2', 'page arguments' => array(1), 'type' => MENU_LOCAL_TASK, 'weight'=>3, ); } } function cssswitch_test(){ return 'test'; } function cssswitch_test2(){ return 'test2'; } Thanks for any help.

    Read the article

  • Problems with window functions WndProc and About

    - by BrianHuangverinem
    I'm really having trouble with this problem; it would be nice if someone could help me with it. Every time I try to build my source file, the same error occurs for the two window functions CALLBACK WndProc and CALLBACK About: error: "local function definitions are illegal" Can you tell me what mistake I made? LRESULT CALLBACK WndProc(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam) { int wmId, wmEvent; PAINTSTRUCT ps; HDC hdc; wmId = LOWORD(wParam); wmEvent = HIWORD(wParam); switch (message) { case WM_COMMAND: // Parse the menu selections: switch (wmId) { case IDM_ABOUT: DialogBox(hInst, MAKEINTRESOURCE(IDD_ABOUTBOX), hWnd, About); break; case IDM_EXIT: DestroyWindow(hWnd); break; default: return DefWindowProc(hWnd, message, wParam, lParam); } break; case WM_PAINT: hdc = BeginPaint(hWnd, &ps); CaptureImage(hWnd); EndPaint(hWnd, &ps); break; case WM_DESTROY: PostQuitMessage(0); break; default: return DefWindowProc(hWnd, message, wParam, lParam); } return 0; } // Message handler for about box. INT_PTR CALLBACK About(HWND hDlg, UINT message, WPARAM wParam, LPARAM lParam) { UNREFERENCED_PARAMETER(lParam); switch (message) { case WM_INITDIALOG: return (INT_PTR)TRUE; case WM_COMMAND: if (LOWORD(wParam) == IDOK || LOWORD(wParam) == IDCANCEL) { EndDialog(hDlg, LOWORD(wParam)); return (INT_PTR)TRUE; } break; } return (INT_PTR)FALSE; }

    Read the article

  • simplemodal click events stopped working in IE7

    - by prettynerd
    Here's my code: $('#alertInfo').modal({ close :false, overlayId :'confirmModalOverlay', containerId :'confirmModalContainer', onShow : function(dialog) { dialog.data.find('.message').append(message); dialog.data.find('.yes').click(function(){ if ($.isFunction(callback)) callback.apply(); $.modal.close(); }); dialog.data.find('.close').click(function(){ $.modal.close(); }); } }); Basically, this is a dialogue box which I call to show a warning message; it has an "X" button (with class 'close') and an "OK" button (with class 'yes'). The problem occurs in IE7. When I call this dialogue box and use the "X" button to close it each time, the "X" button no longer works the third time I call it (yes, on the third time!). However, if I use the "OK" button to close the dialogue box, it works fine no matter how many times I call it. I thought I found a workaround by unbinding and rebinding the click event of the '.close' class, as below: dialog.data.find('.close').unbind('click'); dialog.data.find('.close').bind('click',function(){$.modal.close();}); and it worked! Unfortunately, the problem now occurs with my "OK" button, so I did the same unbinding and binding of the click event of the '.yes' class, as below: dialog.data.find('.yes').unbind('click'); dialog.data.find('.yes').bind('click', function() { if ($.isFunction(callback)) callback.apply(); $.modal.close(); }); But nope, it does not work. Please help me. @ericmmartin, I hope you're online now.
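    For reference, the unbind/rebind workaround described above can be folded back into the original onShow so that both buttons are rebound each time the dialog is shown; whether that cures the underlying IE7 issue is not guaranteed, but it keeps handlers from stacking up across repeated showings. A sketch (message and callback come from the enclosing scope, as in the question):

        $('#alertInfo').modal({
            close: false,
            overlayId: 'confirmModalOverlay',
            containerId: 'confirmModalContainer',
            onShow: function (dialog) {
                dialog.data.find('.message').append(message);
                // unbind first so repeated showings never accumulate click handlers
                dialog.data.find('.yes').unbind('click').bind('click', function () {
                    if ($.isFunction(callback)) callback.apply();
                    $.modal.close();
                });
                dialog.data.find('.close').unbind('click').bind('click', function () {
                    $.modal.close();
                });
            }
        });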

    Read the article

  • Casting/dereferencing member variable pointer from void*, is this safe?

    - by Damien
    Hi all, I had a problem while hacking a bigger project so I made a simple test case. If I'm not omitting something, my test code works fine, but maybe it works accidentally so I wanted to show it to you and ask if there are any pitfalls in this approach. I have an OutObj which has a member variable (pointer) InObj. InObj has a member function. I send the address of this member variable object (InObj) to a callback function as void*. The type of this object never changes so inside the callback I recast to its original type and call the aFunc member function in it. In this example it works as expected, but in the project I'm working on it doesn't. So I might be omitting something or maybe there is a pitfall here and this works accidentally. Any comments? Thanks a lot in advance. (The problem I have in my original code is that InObj.data is garbage). #include <stdio.h> class InObj { public: int data; InObj(int argData); void aFunc() { printf("Inside aFunc! data is: %d\n", data); }; }; InObj::InObj(int argData) { data = argData; } class OutObj { public: InObj* objPtr; OutObj(int data); ~OutObj(); }; OutObj::OutObj(int data) { objPtr = new InObj(data); } OutObj::~OutObj() { delete objPtr; } void callback(void* context) { ((InObj*)context)->aFunc(); } int main () { OutObj a(42); callback((void*)a.objPtr); }

    Read the article

  • jQuery accessing objects

    - by user1275268
    I'm trying to access the values of an object from a function I created with a callback, but have run into some trouble. I'm still fairly new at jQuery/javascript. I call the function as follows: siteDeps(id,function(data){ $.each(data,function(key,val) { console.log(key); console.log(val); }); }); The function runs 5 ajax queries from XML data and returns data as an multidimensional object; here is a excerpt showing the meat of it: function siteDeps(id,callback) { var result = { sitecontactid : {}, siteaddressid : {}, sitephoneid : {}, contactaddressid : {}, contactphoneid : {} }; ...//.... var url5 = decodeURIComponent("sql2xml.php?query=xxxxxxxxxxx"); $.get(url5, function(data){ $(data).find('ID').each(function(i){ result.delsitephoneid[i] = $(this).text(); }); }); callback(result); } The console.log output shows this: sitecontactid Object 0: "2" 1: "3" __proto__: Object siteaddressid Object 0: "1" __proto__: Object sitephoneid Object 0: "1" 1: "5" 2: "54" __proto__: Object contactaddressid Object 0: "80" __proto__: Object contactphoneid Object 0: "6" __proto__: Object How can I extract the callback data in a format I can use, for instance sitephoneid: "1","5","54" Or is there a better/simpler way to do this? Thanks in advance.
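    The likely timing issue is that callback(result) runs before the five asynchronous $.get requests have returned, so result is still mostly empty at that point. One version-agnostic way around this is to count the outstanding requests and fire the callback only when the last one completes; a sketch under that assumption (url1..url5 stand for the five sql2xml.php URLs built earlier in the function, and the field names mirror the result object):

        function siteDeps(id, callback) {
            var result = { sitecontactid: {}, siteaddressid: {}, sitephoneid: {},
                           contactaddressid: {}, contactphoneid: {} };
            var urls   = [url1, url2, url3, url4, url5];  // built as in the question
            var fields = ["sitecontactid", "siteaddressid", "sitephoneid",
                          "contactaddressid", "contactphoneid"];
            var pending = urls.length;

            $.each(urls, function (i, url) {
                $.get(url, function (data) {
                    $(data).find("ID").each(function (j) {
                        result[fields[i]][j] = $(this).text();
                    });
                    if (--pending === 0) { callback(result); } // all responses are in
                });
            });
        }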

    Read the article

  • BING Search using ASP.NET and jQuery Ajax

    - by hajan
    The BING API provides extremely simple way to make search queries using BING. It provides nice way to get the search results as XML or JSON. In this blog post I will show one simple example on how to query BING and get the results as JSON in an ASP.NET website with help of jQuery’s getJSON ajax method. Basically we submit an HTTP GET request with the AppID which you can get in the BING Developer Center. To create new AppID, click here. Once you fill the form, submit it and you will get your AppID. Now, lets make this work in several steps. 1. Open VS.NET or Visual Web Developer.NET, create new sample project (or use existing one) and create new ASPX Web Form with name of your choice. 2. Add the following ASPX in your page body <body>     <form id="form1" runat="server">     <asp:TextBox ID="txtSearch" runat="server" /> <asp:Button ID="btnSearch" runat="server" Text="BING Search" />     <div id="result">          </div>     </form> </body> We have text box for search, button for firing the search event and div where we will place the results. 3. Next, I have created simple CSS style for the search result: <style type="text/css">             .item { width:600px; padding-top:10px; }             .title { background-color:#4196CE; color:White; font-size:18px;              font-family:Calibri, Verdana, Tahoma, Sans-Serif; padding:2px 2px 2px 2px; }     .title a { text-decoration:none; color:white}     .date { font-style:italic; font-size:10px; font-family:Verdana, Arial, Sans-Serif;}             .description { font-family:Verdana, Arial, Sans-Serif; padding:2px 2px 2px 2px; font-size:12px; }     .url { font-size: 10px; font-style:italic; font-weight:bold; color:Gray;}     .url a { text-decoration:none; color:gray;}     #txtSearch { width:450px; border:2px solid #4196CE; } </style> 4. The needed jQuery Scripts (v1.4.4 core jQuery and jQuery template plugin) <script src="http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.4.4.min.js" type="text/javascript"></script> <script src="http://ajax.aspnetcdn.com/ajax/jquery.templates/beta1/jquery.tmpl.min.js" type="text/javascript"></script> Note: I use jQuery Templates plugin in order to avoid foreach loop in the jQuery callback function. JQuery Templates also simplifies the code and allows us to create nice template for the end result. You can read more about jQuery Templates here. 5. Now, lets create another script tag where we will write our BING search script <script language="javascript" type="text/javascript">     $(document).ready(function () {         var bingAPIKey = "<Your-BING-AppID-KEY-HERE>";                  //the rest of the script goes here              }); </script> 6. Before we do any searching, we need to take a look at the search URL that we will call from our Ajax function BING Search URL : http://api.search.live.net/json.aspx?JsonType=callback&JsonCallback=?&AppId={appId}&query={query}&sources={sourceType} The URL in our example is as follows: http://api.search.live.net/json.aspx?JsonType=callback&JsonCallback=?&Appid=" + bingAPIKey + "&query=" + keyWords + "&sources=web Lets split it up with brief explanation on each part of the URL http://api.search.live.net/json.aspx – is the main part of the URL which is used to call when we need to retrieve json result set. JsonType=callback&JsonCallback=? – using JsonType, we can control the format of the response. For more info about this, refer here. 
Appid=” + bingAPIKey +” – the AppID we’ve got from the BING website, explained previously query=” + keyWords + “ – the search query keywords sources=web – the type of source. Possible source types can be found here. 7. Before we continue with writing the last part of the script, lets see what search result BING will send us back: {"SearchResponse":     {         "Version":"2.2",         "Query":             {                 "SearchTerms":"hajan selmani aspnet weblog"             },         "Web":             {                 "Total":16,                 "Offset":0,                 "Results":[                     {                         "Title":"Hajan's Blog",                         "Description":"microsoft asp.net development blog ... Create nice animation on your ASP.NET Menu control using jQuery by hajan",                         "Url":"http:\/\/weblogs.asp.net\/hajan\/",                         "CacheUrl":"http:\/\/cc.bingj.com\/cache.aspx?q=hajan+selmani+aspnet+weblog&d=4760941354158132&w=c9535fb0,d1d66baa",                         "DisplayUrl":"weblogs.asp.net\/hajan",                         "DateTime":"2011-03-03T18:24:00Z"                     },                     {                         "Title":"codeasp.net",                         "Description":"... social community for ASP.NET bloggers - we are one of                                         the largest ASP.NET blog ... 2\/5\/2011 1:41:00 AM by Hajan Selmani - Comments ...",                         "Url":"http:\/\/codeasp.net\/blogs\/hajan",                         "CacheUrl":"http:\/\/cc.bingj.com\/cache.aspx?q=hajan+selmani+aspnet+weblog&d=4826710187311653&w=5b41c930,676a37f8",                         "DisplayUrl":"codeasp.net\/blogs\/hajan",                         "DateTime":"2011-03-03T07:40:00Z"                     }                     ...                         ]             }     } }  To get to the result of the search response, the path is: SearchResponse.Web.Results, where we have array of objects returned back from BING. 8. The final part of the code that performs the search is $("#<%= btnSearch.ClientID %>").click(function (event) {     event.preventDefault();     var keyWords = $("#<%= txtSearch.ClientID %>").val();     var encodedKeyWords = encodeURIComponent(keyWords);     //alert(keyWords);     var url = "http://api.search.live.net/json.aspx?JsonType=callback&JsonCallback=?&Appid="+ bingAPIKey              + "&query=" + encodedKeyWords              + "&sources=web";     $.getJSON(url, function (data) {         $("#result").html("");         $("#bingSearchTemplate").tmpl(data.SearchResponse.Web.Results).appendTo("#result");     }); }); The search happens once we click the Search Button with id btnSearch. We get the keywords from the Text Box with id txtSearch and then we use encodeURIComponent. The encodeURIComponent is used to encode the special characters such as: , / ? : @ & = + $ #, which might be part of the search query string. Then we construct the URL and call it using HTTP GET. The callback function returns the data, where we first clear the html inside div with id result and after that we render the data.SearchResponse.Web.Results array of objects using template with id bingSearchTemplate and append the result into div with id result. 9. 
The bingSearchTemplate Template <script id="bingSearchTemplate" type="text/html">     <div class="item">         <div class="title"><a href="${Url}" target="_blank">${Title}</a></div>         <div class="date">${DateTime}</div>         <div class="searchresult">             <div class="description">             ${Description}             </div>             <div class="url">                 <a href="${Url}" target="_blank">${Url}</a>             </div>         </div>     </div> </script> If you paid attention on the search result structure that BING creates for us, you have seen properties like Url, Title, Description, DateTime etc. In the above defined template, you see the same wrapped into template tags. Some are combined to create hyperlinked URLs. 10. THE END RESULT   As you see, it’s quite simple to use BING API and make search queries with ASP.NET and jQuery. In addition, if you want to make instant search, replace this line: $(“#<%= btnSearch.ClientID %>”).click(function(event) {        event.preventDefault(); with $(“#<%= txtSearch.ClientID %>”).keyup(function() { This will trigger search on each key up in your keyboard, so if you use this approach, you won’t event need a search button. If it’s your first time working with BING API, it’s very recommended to read the following API Basics PDF document. Hope this was helpful blog post for you.

    Read the article

  • Node.js Adventure - Storage Services and Service Runtime

    - by Shaun
    When I described on how to host a Node.js application on Windows Azure, one of questions might be raised about how to consume the vary Windows Azure services, such as the storage, service bus, access control, etc.. Interact with windows azure services is available in Node.js through the Windows Azure Node.js SDK, which is a module available in NPM. In this post I would like to describe on how to use Windows Azure Storage (a.k.a. WAS) as well as the service runtime.   Consume Windows Azure Storage Let’s firstly have a look on how to consume WAS through Node.js. As we know in the previous post we can host Node.js application on Windows Azure Web Site (a.k.a. WAWS) as well as Windows Azure Cloud Service (a.k.a. WACS). In theory, WAWS is also built on top of WACS worker roles with some more features. Hence in this post I will only demonstrate for hosting in WACS worker role. The Node.js code can be used when consuming WAS when hosted on WAWS. But since there’s no roles in WAWS, the code for consuming service runtime mentioned in the next section cannot be used for WAWS node application. We can use the solution that I created in my last post. Alternatively we can create a new windows azure project in Visual Studio with a worker role, add the “node.exe” and “index.js” and install “express” and “node-sqlserver” modules, make all files as “Copy always”. In order to use windows azure services we need to have Windows Azure Node.js SDK, as knows as a module named “azure” which can be installed through NPM. Once we downloaded and installed, we need to include them in our worker role project and make them as “Copy always”. You can use my “Copy all always” tool mentioned in my last post to update the currently worker role project file. You can also find the source code of this tool here. The source code of Windows Azure SDK for Node.js can be found in its GitHub page. It contains two parts. One is a CLI tool which provides a cross platform command line package for Mac and Linux to manage WAWS and Windows Azure Virtual Machines (a.k.a. WAVM). The other is a library for managing and consuming vary windows azure services includes tables, blobs, queues, service bus and the service runtime. I will not cover all of them but will only demonstrate on how to use tables and service runtime information in this post. You can find the full document of this SDK here. Back to Visual Studio and open the “index.js”, let’s continue our application from the last post, which was working against Windows Azure SQL Database (a.k.a. WASD). The code should looks like this. 
1: var express = require("express"); 2: var sql = require("node-sqlserver"); 3:  4: var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;"; 5: var port = 80; 6:  7: var app = express(); 8:  9: app.configure(function () { 10: app.use(express.bodyParser()); 11: }); 12:  13: app.get("/", function (req, res) { 14: sql.open(connectionString, function (err, conn) { 15: if (err) { 16: console.log(err); 17: res.send(500, "Cannot open connection."); 18: } 19: else { 20: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 21: if (err) { 22: console.log(err); 23: res.send(500, "Cannot retrieve records."); 24: } 25: else { 26: res.json(results); 27: } 28: }); 29: } 30: }); 31: }); 32:  33: app.get("/text/:key/:culture", function (req, res) { 34: sql.open(connectionString, function (err, conn) { 35: if (err) { 36: console.log(err); 37: res.send(500, "Cannot open connection."); 38: } 39: else { 40: var key = req.params.key; 41: var culture = req.params.culture; 42: var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'"; 43: conn.queryRaw(command, function (err, results) { 44: if (err) { 45: console.log(err); 46: res.send(500, "Cannot retrieve records."); 47: } 48: else { 49: res.json(results); 50: } 51: }); 52: } 53: }); 54: }); 55:  56: app.get("/sproc/:key/:culture", function (req, res) { 57: sql.open(connectionString, function (err, conn) { 58: if (err) { 59: console.log(err); 60: res.send(500, "Cannot open connection."); 61: } 62: else { 63: var key = req.params.key; 64: var culture = req.params.culture; 65: var command = "EXEC GetItem '" + key + "', '" + culture + "'"; 66: conn.queryRaw(command, function (err, results) { 67: if (err) { 68: console.log(err); 69: res.send(500, "Cannot retrieve records."); 70: } 71: else { 72: res.json(results); 73: } 74: }); 75: } 76: }); 77: }); 78:  79: app.post("/new", function (req, res) { 80: var key = req.body.key; 81: var culture = req.body.culture; 82: var val = req.body.val; 83:  84: sql.open(connectionString, function (err, conn) { 85: if (err) { 86: console.log(err); 87: res.send(500, "Cannot open connection."); 88: } 89: else { 90: var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')"; 91: conn.queryRaw(command, function (err, results) { 92: if (err) { 93: console.log(err); 94: res.send(500, "Cannot retrieve records."); 95: } 96: else { 97: res.send(200, "Inserted Successful"); 98: } 99: }); 100: } 101: }); 102: }); 103:  104: app.listen(port); Now let’s create a new function, copy the records from WASD to table service. 1. Delete the table named “resource”. 2. Create a new table named “resource”. These 2 steps ensures that we have an empty table. 3. Load all records from the “resource” table in WASD. 4. For each records loaded from WASD, insert them into the table one by one. 5. Prompt to user when finished. In order to use table service we need the storage account and key, which can be found from the developer portal. Just select the storage account and click the Manage Keys button. Then create two local variants in our Node.js application for the storage account name and key. Since we need to use WAS we need to import the azure module. Also I created another variant stored the table name. In order to work with table service I need to create the storage client for table service. 
This is very similar as the Windows Azure SDK for .NET. As the code below I created a new variant named “client” and use “createTableService”, specified my storage account name and key. 1: var azure = require("azure"); 2: var storageAccountName = "synctile"; 3: var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw=="; 4: var tableName = "resource"; 5: var client = azure.createTableService(storageAccountName, storageAccountKey); Now create a new function for URL “/was/init” so that we can trigger it through browser. Then in this function we will firstly load all records from WASD. 1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: } 18: } 19: }); 20: } 21: }); 22: }); When we succeed loaded all records we can start to transform them into table service. First I need to recreate the table in table service. This can be done by deleting and creating the table through table client I had just created previously. 1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: // recreate the table named 'resource' 18: client.deleteTable(tableName, function (error) { 19: client.createTableIfNotExists(tableName, function (error) { 20: if (error) { 21: error["target"] = "createTableIfNotExists"; 22: res.send(500, error); 23: } 24: else { 25: // transform the records 26: } 27: }); 28: }); 29: } 30: } 31: }); 32: } 33: }); 34: }); As you can see, the azure SDK provide its methods in callback pattern. In fact, almost all modules in Node.js use the callback pattern. For example, when I deleted a table I invoked “deleteTable” method, provided the name of the table and a callback function which will be performed when the table had been deleted or failed. Underlying, the azure module will perform the table deletion operation in POSIX async threads pool asynchronously. And once it’s done the callback function will be performed. This is the reason we need to nest the table creation code inside the deletion function. If we perform the table creation code after the deletion code then they will be invoked in parallel. Next, for each records in WASD I created an entity and then insert into the table service. Finally I send the response to the browser. Can you find a bug in the code below? I will describe it later in this post. 
1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: // recreate the table named 'resource' 18: client.deleteTable(tableName, function (error) { 19: client.createTableIfNotExists(tableName, function (error) { 20: if (error) { 21: error["target"] = "createTableIfNotExists"; 22: res.send(500, error); 23: } 24: else { 25: // transform the records 26: for (var i = 0; i < results.rows.length; i++) { 27: var entity = { 28: "PartitionKey": results.rows[i][1], 29: "RowKey": results.rows[i][0], 30: "Value": results.rows[i][2] 31: }; 32: client.insertEntity(tableName, entity, function (error) { 33: if (error) { 34: error["target"] = "insertEntity"; 35: res.send(500, error); 36: } 37: else { 38: console.log("entity inserted"); 39: } 40: }); 41: } 42: // send the 43: console.log("all done"); 44: res.send(200, "All done!"); 45: } 46: }); 47: }); 48: } 49: } 50: }); 51: } 52: }); 53: }); Now we can publish it to the cloud and have a try. But normally we’d better test it at the local emulator first. In Node.js SDK there are three build-in properties which provides the account name, key and host address for local storage emulator. We can use them to initialize our table service client. We also need to change the SQL connection string to let it use my local database. The code will be changed as below. 1: // windows azure sql database 2: //var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd=eszqu94XZY;Encrypt=yes;Connection Timeout=30;"; 3: // sql server 4: var connectionString = "Driver={SQL Server Native Client 11.0};Server={.};Database={Caspar};Trusted_Connection={Yes};"; 5:  6: var azure = require("azure"); 7: var storageAccountName = "synctile"; 8: var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw=="; 9: var tableName = "resource"; 10: // windows azure storage 11: //var client = azure.createTableService(storageAccountName, storageAccountKey); 12: // local storage emulator 13: var client = azure.createTableService(azure.ServiceClient.DEVSTORE_STORAGE_ACCOUNT, azure.ServiceClient.DEVSTORE_STORAGE_ACCESS_KEY, azure.ServiceClient.DEVSTORE_TABLE_HOST); Now let’s run the application and navigate to “localhost:12345/was/init” as I hosted it on port 12345. We can find it transformed the data from my local database to local table service. Everything looks fine. But there is a bug in my code. If we have a look on the Node.js command window we will find that it sent response before all records had been inserted, which is not what I expected. The reason is that, as I mentioned before, Node.js perform all IO operations in non-blocking model. When we inserted the records we executed the table service insert method in parallel, and the operation of sending response was also executed in parallel, even though I wrote it at the end of my logic. 
The correct logic should be, when all entities had been copied to table service with no error, then I will send response to the browser, otherwise I should send error message to the browser. To do so I need to import another module named “async”, which helps us to coordinate our asynchronous code. Install the module and import it at the beginning of the code. Then we can use its “forEach” method for the asynchronous code of inserting table entities. The first argument of “forEach” is the array that will be performed. The second argument is the operation for each items in the array. And the third argument will be invoked then all items had been performed or any errors occurred. Here we can send our response to browser. 1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: // recreate the table named 'resource' 18: client.deleteTable(tableName, function (error) { 19: client.createTableIfNotExists(tableName, function (error) { 20: if (error) { 21: error["target"] = "createTableIfNotExists"; 22: res.send(500, error); 23: } 24: else { 25: async.forEach(results.rows, 26: // transform the records 27: function (row, callback) { 28: var entity = { 29: "PartitionKey": row[1], 30: "RowKey": row[0], 31: "Value": row[2] 32: }; 33: client.insertEntity(tableName, entity, function (error) { 34: if (error) { 35: callback(error); 36: } 37: else { 38: console.log("entity inserted."); 39: callback(null); 40: } 41: }); 42: }, 43: // send reponse 44: function (error) { 45: if (error) { 46: error["target"] = "insertEntity"; 47: res.send(500, error); 48: } 49: else { 50: console.log("all done"); 51: res.send(200, "All done!"); 52: } 53: } 54: ); 55: } 56: }); 57: }); 58: } 59: } 60: }); 61: } 62: }); 63: }); Run it locally and now we can find the response was sent after all entities had been inserted. Query entities against table service is simple as well. Just use the “queryEntity” method from the table service client and providing the partition key and row key. We can also provide a complex query criteria as well, for example the code here. In the code below I queried an entity by the partition key and row key, and return the proper localization value in response. 1: app.get("/was/:key/:culture", function (req, res) { 2: var key = req.params.key; 3: var culture = req.params.culture; 4: client.queryEntity(tableName, culture, key, function (error, entity) { 5: if (error) { 6: res.send(500, error); 7: } 8: else { 9: res.json(entity); 10: } 11: }); 12: }); And then tested it on local emulator. Finally if we want to publish this application to the cloud we should change the database connection string and storage account. For more information about how to consume blob and queue service, as well as the service bus please refer to the MSDN page.   Consume Service Runtime As I mentioned above, before we published our application to the cloud we need to change the connection string and account information in our code. 
But if you had played with WACS you should have known that the service runtime provides the ability to retrieve configuration settings, endpoints and local resource information at runtime. Which means we can have these values defined in CSCFG and CSDEF files and then the runtime should be able to retrieve the proper values. For example we can add some role settings though the property window of the role, specify the connection string and storage account for cloud and local. And the can also use the endpoint which defined in role environment to our Node.js application. In Node.js SDK we can get an object from “azure.RoleEnvironment”, which provides the functionalities to retrieve the configuration settings and endpoints, etc.. In the code below I defined the connection string variants and then use the SDK to retrieve and initialize the table client. 1: var connectionString = ""; 2: var storageAccountName = ""; 3: var storageAccountKey = ""; 4: var tableName = ""; 5: var client; 6:  7: azure.RoleEnvironment.getConfigurationSettings(function (error, settings) { 8: if (error) { 9: console.log("ERROR: getConfigurationSettings"); 10: console.log(JSON.stringify(error)); 11: } 12: else { 13: console.log(JSON.stringify(settings)); 14: connectionString = settings["SqlConnectionString"]; 15: storageAccountName = settings["StorageAccountName"]; 16: storageAccountKey = settings["StorageAccountKey"]; 17: tableName = settings["TableName"]; 18:  19: console.log("connectionString = %s", connectionString); 20: console.log("storageAccountName = %s", storageAccountName); 21: console.log("storageAccountKey = %s", storageAccountKey); 22: console.log("tableName = %s", tableName); 23:  24: client = azure.createTableService(storageAccountName, storageAccountKey); 25: } 26: }); In this way we don’t need to amend the code for the configurations between local and cloud environment since the service runtime will take care of it. At the end of the code we will listen the application on the port retrieved from SDK as well. 1: azure.RoleEnvironment.getCurrentRoleInstance(function (error, instance) { 2: if (error) { 3: console.log("ERROR: getCurrentRoleInstance"); 4: console.log(JSON.stringify(error)); 5: } 6: else { 7: console.log(JSON.stringify(instance)); 8: if (instance["endpoints"] && instance["endpoints"]["nodejs"]) { 9: var endpoint = instance["endpoints"]["nodejs"]; 10: app.listen(endpoint["port"]); 11: } 12: else { 13: app.listen(8080); 14: } 15: } 16: }); But if we tested the application right now we will find that it cannot retrieve any values from service runtime. This is because by default, the entry point of this role was defined to the worker role class. In windows azure environment the service runtime will open a named pipeline to the entry point instance, so that it can connect to the runtime and retrieve values. But in this case, since the entry point was worker role and the Node.js was opened inside the role, the named pipeline was established between our worker role class and service runtime, so our Node.js application cannot use it. To fix this problem we need to open the CSDEF file under the azure project, add a new element named Runtime. Then add an element named EntryPoint which specify the Node.js command line. So that the Node.js application will have the connection to service runtime, then it’s able to read the configurations. Start the Node.js at local emulator we can find it retrieved the connections, storage account for local. 
And if we publish our application to azure then it works with WASD and storage service through the configurations for cloud.   Summary In this post I demonstrated how to use Windows Azure SDK for Node.js to interact with storage service, especially the table service. I also demonstrated on how to use WACS service runtime, how to retrieve the configuration settings and the endpoint information. And in order to make the service runtime available to my Node.js application I need to create an entry point element in CSDEF file and set “node.exe” as the entry point. I used five posts to introduce and demonstrate on how to run a Node.js application on Windows platform, how to use Windows Azure Web Site and Windows Azure Cloud Service worker role to host our Node.js application. I also described how to work with other services provided by Windows Azure platform through Windows Azure SDK for Node.js. Node.js is a very new and young network application platform. But since it’s very simple and easy to learn and deploy, as well as, it utilizes single thread non-blocking IO model, Node.js became more and more popular on web application and web service development especially for those IO sensitive projects. And as Node.js is very good at scaling-out, it’s more useful on cloud computing platform. Use Node.js on Windows platform is new, too. The modules for SQL database and Windows Azure SDK are still under development and enhancement. It doesn’t support SQL parameter in “node-sqlserver”. It does support using storage connection string to create the storage client in “azure”. But Microsoft is working on make them easier to use, working on add more features and functionalities.   PS, you can download the source code here. You can download the source code of my “Copy all always” tool here.   Hope this helps, Shaun All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • What does this mean: warning: converting from ‘void (ClassName::*)()’ to ‘void (*)()’

    - by Brendan Long
    I have a member function in a class that has a callback, but the callback isn't strictly neccessary, so it has a default callback, which is empty. It seems to work fine, but I get an annoying warning: warning: converting from ‘void (ClassName::*)()’ to ‘void (*)()’ I'm trying to figure out what it means and how to turn it off (or fix it if I really am doing something wrong). Here's some simple code: class ClassName{ public: void doSomething(void (*callbackFunction)() = (void(*)()) &ClassName::doNothing){ callbackFunction(); } void doNothing(){} }; int main(){ ClassName x; x.doSomething(); return 0; } Note: If I do this (without explicitly casting it as a void(*)()): void doSomething(void (*callbackFunction)() = &ClassName::doNothing) I get this: main.cpp:3: error: default argument for parameter of type ‘void (*)()’ has type ‘void (ClassName::*)()’

    Read the article

  • What can cause throughput to become really slow when an ISAPI filter implements SF_NOTIFY_SEND_RAW_D

    - by Gerald
    I have an ISAPI filter for IIS6 that I've been developing for a while, but I just noticed something disturbing. Anytime I have the filter installed and I download a file, the file download becomes really slow. From a remote machine I'm getting ~120kb per second without the filter installed, and ~45kb per second with the filter installed. This seems to be related to the SF_NOTIFY_SEND_RAW_DATA callback. Whenever I register for this callback, I get the slow downloads, when I don't register for it, everything is fine. Even if I make my HttpFilterProc function just return immediately, like this: DWORD WINAPI HttpFilterProc( PHTTP_FILTER_CONTEXT pfc, DWORD notificationType, LPVOID pvNotification ) { return SF_STATUS_REQ_NEXT_NOTIFICATION; } I've also tried returning SF_STATUS_REQ_HANDLED_NOTIFICATION with the same result. Is it possible that I have some build setting on my DLL that is causing a slow execution of the callback function, or is this just the way it's going to be with ISAPI?

    Read the article

  • Deeplinking using GWT History Token within a Facebook iFrame Canvas

    - by Stevko
    I would like to deep link directly to a GWT app page within a Facebook iFrame Canvas. The first part is simple using GWT's History token with URLs like: http://www.example.com/MyApp/#page1 which would open page1 within my app. Facebook Apps use an application url like: http://apps.facebook.com/myAppName which frames my Canvas Callback URL http://www.example.com/MyApp/ Is there a way to specify a canvas callback url (or bookmark url) which will take the user to a specific page rather than the index page? Why? you may ask. Besides all the benefits of deep links... I want the "Go To Application" url to take users to an index page w/ marketing material (the canvas callback url) I want the "Bookmark URL" to take (likely returning) users to a login page and bypass downloading the marketing content (and that huge SWF file).

    Read the article

  • Modifying listbox values with jQuery in WebForm not posting back

    - by Peter
    When hitting a button, an error would occur: System.Web.HttpUnhandledException: Exception of type 'System.Web.HttpUnhandledException' was thrown. --- System.ArgumentException: Invalid postback or callback argument. Event validation is enabled using <pages enableEventValidation="true"/> in configuration or <%@ Page EnableEventValidation="true" %> in a page. For security purposes, this feature verifies that arguments to postback or callback events originate from the server control that originally rendered them. If the data is valid and expected, use the ClientScriptManager.RegisterForEventValidation method in order to register the postback or callback data for validation. I then added EnableEventValidation="false" to my @Page directive, which fixed the error. Now, after manipulating the listbox, the new values in the listbox are not posted back to the server. How can I solve this?

    Read the article

  • Throwing an exception from a BackgroundWorker which calls an async. method (Webrequest)

    - by mrbamboo
    Hi, My main application creates a new BackgroundWorker X the DoWork event handler of X calls a method Y of my controller. This method creates the WebRequest (async.) instance and the callback using AsyncCallback. When the response arrives the callback method Z gets called and the content will be analyzed. It can happen that the response has an unwanted content. At that moment callback Z will throw an exception. I want to catch this exception in my main application. I tried it in DoWork and RunWorkerCompleted but nothing can be caught from there. Error in RunWorkerCompletedEventArgs is always null.

    Read the article

  • Call javascript from objective-c, only works with system functions like alert() etc / phonegap

    - by adrian
    Hi, I haven't found the solution in the forum yet. I want to call a function I created in index.html from an Objective-C function, as explained for stringByEvaluatingJavaScriptFromString, but the network detection doesn't work for me: the function gets called, but no JavaScript is executed in index.html. Here is the code I use: - (void)updateReachability:(NSString*)callback { NSString* jsCallback = @"navigator.network.updateReachability"; if (callback) jsCallback = callback; NSString* status = [[NSString alloc] initWithFormat:@"%@({ hostName: '%@', ipAddress: '%@', remoteHostStatus: %d, internetConnectionStatus: %d, localWiFiConnectionStatus: %d });", jsCallback, [[Reachability sharedReachability] hostName], [[Reachability sharedReachability] address], [[Reachability sharedReachability] remoteHostStatus], [[Reachability sharedReachability] internetConnectionStatus], [[Reachability sharedReachability] localWiFiConnectionStatus]]; [webView stringByEvaluatingJavaScriptFromString:status]; [status release]; } If I log the status, the string I see could be executed in the index.html file; the syntax is OK. But it doesn't get called, whereas if I do alert() etc. it gets called. I need some help please, because network detection is a must-have for publishing the app. Adrian
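    One thing the generated string relies on is that the page has already defined the function it calls. A minimal sketch of the JavaScript side this Objective-C code expects to find in index.html (the property names match the format string above; the alert body is just an example):

        // Must exist before stringByEvaluatingJavaScriptFromString: runs,
        // otherwise the evaluated call silently does nothing.
        navigator.network = navigator.network || {};
        navigator.network.updateReachability = function (info) {
            // info carries hostName, ipAddress, remoteHostStatus,
            // internetConnectionStatus and localWiFiConnectionStatus
            alert("remoteHostStatus: " + info.remoteHostStatus);
        };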

    Read the article

  • Javascript Array Scope - newbie here

    - by Gianluca
    So, I am learning JavaScript while playing with the Google Calendar APIs and I just can't figure out why this piece of code works this way: var entriesResult = []; var data = new Date(2010,3,22,17,0,0); var callback = function(result) { var entries = result.feed.getEntries(); if (entries.length != 0) { entriesResult = eventsManager(entries, 0, data); window.alert("inner entriesResult " + entriesResult.length); } } this.service.getEventsFeed(this.query, callback, handleGDError); window.alert("outer entriesResult " + entriesResult.length); eventsManager() is a function that returns an array of objects. getEventsFeed() is an API function: it queries the service and passes a "feed root" (a feed with selected items) to the callback function. Why does the first alert (inner) output a valid entriesResult.length while the second one (outer) always outputs 0? I thought JavaScript arrays are always passed by reference; what's wrong with my code? Thank you :)
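    A short sketch of the timing involved, on the assumption that getEventsFeed() is asynchronous: the outer alert runs before the feed callback ever fires, so at that moment entriesResult is still the original empty array. Anything that needs the populated array has to happen inside (or be called from) the callback; useEntries() below is a hypothetical helper, while eventsManager and handleGDError are taken from the question:

        var data = new Date(2010, 3, 22, 17, 0, 0);

        var callback = function (result) {
            var entries = result.feed.getEntries();
            if (entries.length != 0) {
                var entriesResult = eventsManager(entries, 0, data);
                useEntries(entriesResult);   // continue here, once the feed has arrived
            }
        };

        this.service.getEventsFeed(this.query, callback, handleGDError);
        // nothing at this point can rely on entriesResult yet

        function useEntries(entriesResult) {
            window.alert("entriesResult " + entriesResult.length);
        }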

    Read the article

  • Paypal subscription API - transaction variables not POSTed

    - by morpheous
    I am writing a payments system based around PayPal, using the HTML 'API'. I am passing the following form fields to PayPal: 'rm' = 2, 'return' = http://www.example.com/payment-handler.php?token=sometoken where 'token' is a token I generated. According to the PayPal documentation, a return method (rm) of 2 indicates to PayPal that the transaction data should be posted back to the callback URL using the POST method. When processing items using 'buy_now' buttons, the transaction items are correctly POSTed to my callback URL (payment-handler.php), but for 'subscribe' operations, although the callback URL is called, no POST data is sent to it, and the 'token' field is missing. Instead, there is a parameter called 'auth'. I can't see anything in the PayPal docs about an 'auth' field, so I don't know what's generating it and whether I can rely on it. Can anyone shed some light on this?

    Read the article
